ZenML CLI

The ZenML CLI tool is usually downloaded and installed via PyPI and a pip install zenml command. Please see the Installation & Setup section above for more information about that process.

How to use the CLI

Our CLI behaves similarly to many other CLIs for basic features. In order to find out which version of ZenML you are running, type:

   zenml version

If you ever need more information on exactly what a certain command will do, use the --help flag attached to the end of your command string.

For example, to get a sense of all the commands available to you while using the zenml command, type:

   zenml --help

If you were instead looking to know more about a specific command, you can type something like this:

   zenml artifact-store register --help

This will give you information about how to register an artifact store. (See below for more on that).

If you instead want to understand the concept behind a command group, you can use the explain sub-command. For example, to see more details about what an artifact-store is, you can type:

zenml artifact-store explain

This will give you an explanation of that concept in more detail.

Beginning a Project

In order to start working on your project, initialize a ZenML repository within your current directory with ZenML's own config and resource management tools:

zenml init

This is all you need to begin using all the MLOps goodness that ZenML provides!

By default, zenml init will install its own hidden .zen folder inside the current directory from which you are running the command. You can also pass in a directory path manually using the --path option:

zenml init --path /path/to/dir

If you wish to use one of the available ZenML project templates to generate a ready-to-use project scaffold in your repository, you can do so by passing the --template option:

zenml init --template <name_of_template>

Running the above command will show you input prompts. If you would like to rely on the default values for the ZenML project template, you can add --template-with-defaults to the same command, like this:

zenml init --template <name_of_template> --template-with-defaults

In a similar fashion, if you would like to quickly explore the capabilities of ZenML through a notebook, you can also use:

zenml go

Cleaning up

If you wish to delete all data relating to your workspace from the directory, use the zenml clean command. This will:

  • delete all pipelines, pipeline runs and associated metadata
  • delete all artifacts

Using Integrations

Integrations are the different pieces of a project stack that enable custom functionality. This ranges from bigger libraries like kubeflow for orchestration down to smaller visualization tools like facets. Our CLI is an easy way to get started with these integrations.

To list all the integrations available to you, type:

zenml integration list

To see the requirements for a specific integration, use the requirements command:

zenml integration requirements INTEGRATION_NAME

If you wish to install an integration, using the requirements listed by the previous command, the install command does this for your local environment:

zenml integration install INTEGRATION_NAME

Note that if you don't specify an integration to be installed, the ZenML CLI will install all available integrations.

If you want to install all integrations apart from one or more specific ones, pass each integration to skip with the -i flag. For example, the following will install all integrations except feast and aws:

zenml integration install -i feast -i aws

Uninstalling a specific integration is as simple as typing:

zenml integration uninstall INTEGRATION_NAME

For all these zenml integration commands, you can pass the --uv flag and we will use uv as the package manager instead of pip. This will resolve and install much faster than with pip, but note that it requires uv to be installed on your machine. This is an experimental feature and may not work on all systems. In particular, note that installing onto machines with GPU acceleration may not work as expected.
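For example, to install a single integration with uv instead of pip (the integration name here is just illustrative), you could run:

zenml integration install gcp --uv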

If you would like to export the requirements of all ZenML integrations, you can use the command:

zenml integration export-requirements

Here, you can also select a list of integrations and write the result into an output file:

zenml integration export-requirements gcp kubeflow -o OUTPUT_FILE

Filtering when listing

Certain CLI list commands allow you to filter their output. For example, all stack components allow you to pass custom parameters to the list command that will filter the output. To learn more about the available filters, a good quick reference is to use the --help command, as in the following example:

zenml orchestrator list --help

You will see a list of all the available filters for the list command along with examples of how to use them.

The --sort_by option allows you to sort the output by a specific field and takes an asc or desc argument to specify the order. For example, to sort the output of the list command by the name field in ascending order, you would type:

zenml orchestrator list --sort_by "asc:name"

For fields marked as being of type TEXT or UUID, you can use the contains, startswith and endswith keywords along with their particular identifier. For example, for the orchestrator list command, you can use the following filter to find all orchestrators that contain the string sagemaker in their name:

zenml orchestrator list --name "contains:sagemaker"

For fields marked as being of type BOOL, you can use the 'True' or 'False' values to filter the output.

Finally, for fields marked as being of type DATETIME, you can pass in datetime values in the %Y-%m-%d %H:%M:%S format. These can be combined with the gte, lte, gt and lt keywords (greater than or equal, less than or equal, greater than and less than respectively) to specify the range of the filter. For example, to find all orchestrators that were created after the 1st of January 2021, you would type:

zenml orchestrator list --created "gt:2021-01-01 00:00:00"

This syntax can also be combined to create more complex filters using the or and and keywords.
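For instance, combining the two filters shown above in one call (the values are illustrative) would list only the orchestrators whose name contains sagemaker and that were created after the given date:

zenml orchestrator list --name "contains:sagemaker" --created "gt:2021-01-01 00:00:00"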

Artifact Stores

In ZenML, the artifact store is where all the inputs and outputs of your pipeline steps are stored. By default, ZenML initializes your repository with an artifact store with everything kept on your local machine. You can get a better understanding about the concept of artifact stores by executing:

zenml artifact-store explain

If you wish to register a new artifact store, do so with the register command:

zenml artifact-store register ARTIFACT_STORE_NAME --flavor=ARTIFACT_STORE_FLAVOR [--OPTIONS]

You can also add any labels to your stack component using the --label or -l flag:

zenml artifact-store register ARTIFACT_STORE_NAME --flavor=ARTIFACT_STORE_FLAVOR -l key1=value1 -l key2=value2
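As a concrete illustration (the store name and bucket path below are hypothetical), registering an S3-based artifact store might look like this:

zenml artifact-store register s3_store --flavor=s3 --path=s3://my-bucket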

As you can see from the command above, when you register a new artifact store, you have to choose a flavor. To see the full list of available artifact store flavors, you can use the command:

zenml artifact-store flavor list

This list will show you which integration these flavors belong to and which service connectors they are adaptable with. If you would like to get additional information regarding a specific flavor, you can utilize the command:

zenml artifact-store flavor describe FLAVOR_NAME

If you wish to list the artifact stores that have already been registered within your ZenML workspace, type:

zenml artifact-store list

If you want the name of the artifact store in the active stack, you can also use the get command:

zenml artifact-store get

For details about a particular artifact store, use the describe command. By default, (without a specific artifact store name passed in) it will describe the active or currently used artifact store:

zenml artifact-store describe [ARTIFACT_STORE_NAME]

If you wish to update/rename an artifact store, you can use the following commands respectively:

zenml artifact-store update ARTIFACT_STORE_NAME --property_to_update=new_value
zenml artifact-store rename ARTIFACT_STORE_OLD_NAME ARTIFACT_STORE_NEW_NAME

If you wish to delete a particular artifact store, pass the name of the artifact store into the CLI with the following command:

zenml artifact-store delete ARTIFACT_STORE_NAME

If you would like to connect/disconnect your artifact store to/from a service connector, you can use the following commands:

zenml artifact-store connect ARTIFACT_STORE_NAME -c CONNECTOR_NAME
zenml artifact-store disconnect

The ZenML CLI provides a few more utility functions for you to manage your artifact stores. In order to get a full list of available functions, use the command:

zenml artifact-store --help

Orchestrators

An orchestrator is a special kind of backend that manages the running of each step of the pipeline. Orchestrators administer the actual pipeline runs. By default, ZenML initializes your repository with an orchestrator that runs everything on your local machine. In order to get a more detailed explanation, you can use the command:

zenml orchestrator explain

If you wish to register a new orchestrator, do so with the register command:

zenml orchestrator register ORCHESTRATOR_NAME --flavor=ORCHESTRATOR_FLAVOR [--ORCHESTRATOR_OPTIONS]

You can also add any label to your stack component using the --label or -l flag:

zenml orchestrator register ORCHESTRATOR_NAME --flavor=ORCHESTRATOR_FLAVOR -l key1=value1 -l key2=value2
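As a concrete illustration (the orchestrator name and context are hypothetical, and assume the kubernetes flavor with its kubernetes_context option), a registration might look like this:

zenml orchestrator register k8s_orchestrator --flavor=kubernetes --kubernetes_context=my-k8s-context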

As you can see from the command above, when you register a new orchestrator, you have to choose a flavor. To see the full list of available orchestrator flavors, you can use the command:

zenml orchestrator flavor list

This list will show you which integration these flavors belong to and which service connectors they are adaptable with. If you would like to get additional information regarding a specific flavor, you can utilize the command:

zenml orchestrator flavor describe FLAVOR_NAME

If you wish to list the orchestrators that have already been registered within your ZenML workspace / repository, type:

zenml orchestrator list

If you want the name of the orchestrator in the active stack, you can also use the get command:

zenml orchestrator get

For details about a particular orchestrator, use the describe command. By default, (without a specific orchestrator name passed in) it will describe the active or currently used orchestrator:

zenml orchestrator describe [ORCHESTRATOR_NAME]

If you wish to update/rename an orchestrator, you can use the following commands respectively:

zenml orchestrator update ORCHESTRATOR_NAME --property_to_update=new_value
zenml orchestrator rename ORCHESTRATOR_OLD_NAME ORCHESTRATOR_NEW_NAME

If you wish to delete a particular orchestrator, pass the name of the orchestrator into the CLI with the following command:

zenml orchestrator delete ORCHESTRATOR_NAME

If you would like to connect/disconnect your orchestrator to/from a service connector, you can use the following commands:

zenml orchestrator connect ORCHESTRATOR_NAME -c CONNECTOR_NAME
zenml orchestrator disconnect

The ZenML CLI provides a few more utility functions for you to manage your orchestrators. In order to get a full list of available functions, use the command:

zenml orchestrator --help

Container Registries

The container registry is where all the images that are used by a container-based orchestrator are stored. To get a better understanding regarding container registries, use the command:

zenml container-registry explain

By default, a default ZenML local stack will not register a container registry. If you wish to register a new container registry, do so with the register command:

zenml container-registry register REGISTRY_NAME --flavor=REGISTRY_FLAVOR [--REGISTRY_OPTIONS]

You can also add any label to your stack component using the --label or -l flag:

zenml container-registry register REGISTRY_NAME --flavor=REGISTRY_FLAVOR -l key1=value1 -l key2=value2

As you can see from the command above, when you register a new container registry, you have to choose a flavor. To see the full list of available container registry flavors, you can use the command:

zenml container-registry flavor list

This list will show you which integration these flavors belong to and which service connectors they are adaptable with. If you would like to get additional information regarding a specific flavor, you can utilize the command:

zenml container-registry flavor describe FLAVOR_NAME

To list all container registries available and registered for use, use the list command:

zenml container-registry list

If you want the name of the container registry in the active stack, you can also use the get command:

zenml container-registry get

For details about a particular container registry, use the describe command. By default, (without a specific registry name passed in) it will describe the active or currently used container registry:

zenml container-registry describe [CONTAINER_REGISTRY_NAME]

If you wish to update/rename a container registry, you can use the following commands respectively:

zenml container-registry update CONTAINER_REGISTRY_NAME --property_to_update=new_value
zenml container-registry rename CONTAINER_REGISTRY_OLD_NAME CONTAINER_REGISTRY_NEW_NAME

To delete a container registry (and all of its contents), use the delete command:

zenml container-registry delete REGISTRY_NAME

If you would like to connect/disconnect your container registry to/from a service connector, you can use the following commands:

zenml container-registry connect CONTAINER_REGISTRY_NAME -c CONNECTOR_NAME
zenml container-registry disconnect

The ZenML CLI provides a few more utility functions for you to manage your container registries. In order to get a full list of available functions, use the command:

zenml container-registry --help

Data Validators

In ZenML, data validators help you profile and validate your data.

By default, a default ZenML local stack will not register a data validator. If you wish to register a new data validator, do so with the register command:

zenml data-validator register DATA_VALIDATOR_NAME --flavor DATA_VALIDATOR_FLAVOR [--DATA_VALIDATOR_OPTIONS]

You can also add any label to your stack component using the --label or -l flag:

zenml data-validator register DATA_VALIDATOR_NAME --flavor DATA_VALIDATOR_FLAVOR -l key1=value1 -l key2=value2

As you can see from the command above, when you register a new data validator, you have to choose a flavor. To see the full list of available data validator flavors, you can use the command:

zenml data-validator flavor list

This list will show you which integration these flavors belong to and which service connectors they are adaptable with. If you would like to get additional information regarding a specific flavor, you can utilize the command:

zenml data-validator flavor describe FLAVOR_NAME

To list all data validators available and registered for use, use the list command:

zenml data-validator list

If you want the name of the data validator in the active stack, use the get command:

zenml data-validator get

For details about a particular data validator, use the describe command. By default, (without a specific data validator name passed in) it will describe the active or currently-used data validator:

zenml data-validator describe [DATA_VALIDATOR_NAME]

If you wish to update/rename a data validator, you can use the following commands respectively:

zenml data-validator update DATA_VALIDATOR_NAME --property_to_update=new_value
zenml data-validator rename DATA_VALIDATOR_OLD_NAME DATA_VALIDATOR_NEW_NAME

To delete a data validator (and all of its contents), use the delete command:

zenml data-validator delete DATA_VALIDATOR_NAME

If you would like to connect/disconnect your data validator to/from a service connector, you can use the following commands:

zenml data-validator connect DATA_VALIDATOR_NAME -c CONNECTOR_NAME
zenml data-validator disconnect

The ZenML CLI provides a few more utility functions for you to manage your data validators. In order to get a full list of available functions, use the command:

zenml data-validator --help

Experiment Trackers

Experiment trackers let you track your ML experiments by logging their parameters and allow you to compare different runs. To get a better understanding regarding experiment trackers, use the command:

zenml experiment-tracker explain

By default, a default ZenML local stack will not register an experiment tracker. If you want to use an experiment tracker in one of your stacks, you need to first register it:

zenml experiment-tracker register EXPERIMENT_TRACKER_NAME --flavor=EXPERIMENT_TRACKER_FLAVOR [--EXPERIMENT_TRACKER_OPTIONS]

You can also add any label to your stack component using the --label or -l flag:

zenml experiment-tracker register EXPERIMENT_TRACKER_NAME --flavor=EXPERIMENT_TRACKER_FLAVOR -l key1=value1 -l key2=value2

As you can see from the command above, when you register a new experiment tracker, you have to choose a flavor. To see the full list of available experiment tracker flavors, you can use the command:

zenml experiment-tracker flavor list

This list will show you which integration these flavors belong to and which service connectors they are adaptable with. If you would like to get additional information regarding a specific flavor, you can utilize the command:

zenml experiment-tracker flavor describe FLAVOR_NAME

To list all experiment trackers available and registered for use, use the list command:

zenml experiment-tracker list

If you want the name of the experiment tracker in the active stack, use the get command:

zenml experiment-tracker get

For details about a particular experiment tracker, use the describe command. By default, (without a specific experiment tracker name passed in) it will describe the active or currently-used experiment tracker:

zenml experiment-tracker describe [EXPERIMENT_TRACKER_NAME]

If you wish to update/rename an experiment tracker, you can use the following commands respectively:

zenml experiment-tracker update EXPERIMENT_TRACKER_NAME --property_to_update=new_value
zenml experiment-tracker rename EXPERIMENT_TRACKER_OLD_NAME EXPERIMENT_TRACKER_NEW_NAME

To delete an experiment tracker, use the delete command:

zenml experiment-tracker delete EXPERIMENT_TRACKER_NAME

If you would like to connect/disconnect your experiment tracker to/from a service connector, you can use the following commands:

zenml experiment-tracker connect EXPERIMENT_TRACKER_NAME -c CONNECTOR_NAME
zenml experiment-tracker disconnect

The ZenML CLI provides a few more utility functions for you to manage your experiment trackers. In order to get a full list of available functions, use the command:

zenml experiment-tracker --help

Model Deployers

Model deployers are stack components responsible for online model serving. They are responsible for deploying models to a remote server. Model deployers also act as a registry for models that are served with ZenML. To get a better understanding regarding model deployers, use the command:

zenml model-deployer explain

By default, a default ZenML local stack will not register a model deployer. If you wish to register a new model deployer, do so with the register command:

zenml model-deployer register MODEL_DEPLOYER_NAME --flavor MODEL_DEPLOYER_FLAVOR [--MODEL_DEPLOYER_OPTIONS]

You can also add any label to your stack component using the --label or -l flag:

zenml model-deployer register MODEL_DEPLOYER_NAME --flavor MODEL_DEPLOYER_FLAVOR -l key1=value1 -l key2=value2

As you can see from the command above, when you register a new model deployer, you have to choose a flavor. To see the full list of available model deployer flavors, you can use the command:

zenml model-deployer flavor list

This list will show you which integration these flavors belong to and which service connectors they are adaptable with. If you would like to get additional information regarding a specific flavor, you can utilize the command:

zenml model-deployer flavor describe FLAVOR_NAME

To list all model deployers available and registered for use, use the list command:

zenml model-deployer list

If you want the name of the model deployer in the active stack, use the get command:

zenml model-deployer get

For details about a particular model deployer, use the describe command. By default, (without a specific model deployer name passed in) it will describe the active or currently used model deployer:

zenml model-deployer describe [MODEL_DEPLOYER_NAME]

If you wish to update/rename a model deployer, you can use the following commands respectively:

zenml model-deployer update MODEL_DEPLOYER_NAME --property_to_update=new_value
zenml model-deployer rename MODEL_DEPLOYER_OLD_NAME MODEL_DEPLOYER_NEW_NAME

To delete a model deployer (and all of its contents), use the delete command:

zenml model-deployer delete MODEL_DEPLOYER_NAME

If you would like to connect/disconnect your model deployer to/from a service connector, you can use the following commands:

zenml model-deployer connect MODEL_DEPLOYER_NAME -c CONNECTOR_NAME
zenml model-deployer disconnect

Moreover, ZenML features a set of CLI commands specific to the model deployer interface. If you want to simply see what models have been deployed within your stack, run the following command:

zenml model-deployer models list

This should give you a list of served models containing their UUID, the name of the pipeline that produced them, the run ID, the step name, and the status. This information should help you identify the different models.

If you want further information about a specific model, simply copy the UUID and run the following command:

zenml model-deployer models describe <UUID>

If you are only interested in the prediction URL of a specific model, you can also run:

zenml model-deployer models get-url <UUID>

Finally, you will also be able to start/stop the services using the following two commands:

zenml model-deployer models start <UUID>
zenml model-deployer models stop <UUID>

If you want to completely remove a served model you can also irreversibly delete it using:

zenml model-deployer models delete <UUID>

The ZenML CLI provides a few more utility functions for you to manage your model deployers. In order to get a full list of available functions, use the command:

zenml model-deployer --help

Step Operators

Step operators allow you to run individual steps in a custom environment different from the default one used by your active orchestrator. One example use-case is to run a training step of your pipeline in an environment with GPUs available. To get a better understanding regarding step operators, use the command:

zenml step-operator explain

By default, a default ZenML local stack will not register a step operator. If you wish to register a new step operator, do so with the register command:

zenml step-operator register STEP_OPERATOR_NAME --flavor STEP_OPERATOR_FLAVOR [--STEP_OPERATOR_OPTIONS]

You can also add any label to your stack component using the --label or -l flag:

zenml step-operator register STEP_OPERATOR_NAME --flavor STEP_OPERATOR_FLAVOR -l key1=value1 -l key2=value2

As you can see from the command above, when you register a new step operator, you have to choose a flavor. To see the full list of available step operator flavors, you can use the command:

zenml step-operator flavor list

This list will show you which integration these flavors belong to and which service connectors they are adaptable with. If you would like to get additional information regarding a specific flavor, you can utilize the command:

zenml step-operator flavor describe FLAVOR_NAME

To list all step operators available and registered for use, use the list command:

zenml step-operator list

If you want the name of the step operator in the active stack, use the get command:

zenml step-operator get

For details about a particular step operator, use the describe command. By default, (without a specific operator name passed in) it will describe the active or currently used step operator:

zenml step-operator describe [STEP_OPERATOR_NAME]

If you wish to update/rename a step operator, you can use the following commands respectively:

zenml step-operator update STEP_OPERATOR_NAME --property_to_update=new_value
zenml step-operator rename STEP_OPERATOR_OLD_NAME STEP_OPERATOR_NEW_NAME

To delete a step operator (and all of its contents), use the delete command:

zenml step-operator delete STEP_OPERATOR_NAME

If you would like to connect/disconnect your step operator to/from a service connector, you can use the following commands:

zenml step-operator connect STEP_OPERATOR_NAME -c CONNECTOR_NAME
zenml step-operator disconnect

The ZenML CLI provides a few more utility functions for you to manage your step operators. In order to get a full list of available functions, use the command:

zenml step-operator --help

Alerters

In ZenML, alerters allow you to send alerts from within your pipeline.

By default, a default ZenML local stack will not register an alerter. If you wish to register a new alerter, do so with the register command:

zenml alerter register ALERTER_NAME --flavor ALERTER_FLAVOR [--ALERTER_OPTIONS]

You can also add any label to your stack component using the --label or -l flag:

zenml alerter register ALERTER_NAME --flavor ALERTER_FLAVOR -l key1=value1 -l key2=value2

As you can see from the command above, when you register a new alerter, you have to choose a flavor. To see the full list of available alerter flavors, you can use the command:

zenml alerter flavor list

This list will show you which integration these flavors belong to and which service connectors they are adaptable with. If you would like to get additional information regarding a specific flavor, you can utilize the command:

zenml alerter flavor describe FLAVOR_NAME

To list all alerters available and registered for use, use the list command:

zenml alerter list

If you want the name of the alerter in the active stack, use the get command:

zenml alerter get

For details about a particular alerter, use the describe command. By default, (without a specific alerter name passed in) it will describe the active or currently used alerter:

zenml alerter describe [ALERTER_NAME]

If you wish to update/rename an alerter, you can use the following commands respectively:

zenml alerter update ALERTER_NAME --property_to_update=new_value
zenml alerter rename ALERTER_OLD_NAME ALERTER_NEW_NAME

To delete an alerter (and all of its contents), use the delete command:

zenml alerter delete ALERTER_NAME

If you would like to connect/disconnect your alerter to/from a service connector, you can use the following commands:

zenml alerter connect ALERTER_NAME -c CONNECTOR_NAME
zenml alerter disconnect

The ZenML CLI provides a few more utility functions for you to manage your alerters. In order to get a full list of available functions, use the command:

zenml alerter --help

Feature Stores

Feature stores allow data teams to serve data via an offline store and an online low-latency store where data is kept in sync between the two. To get a better understanding regarding feature stores, use the command:

zenml feature-store explain

By default, a default ZenML local stack will not register a feature store. If you wish to register a new feature store, do so with the register command:

zenml feature-store register FEATURE_STORE_NAME --flavor FEATURE_STORE_FLAVOR [--FEATURE_STORE_OPTIONS]

You can also add any label to your stack component using the --label or -l flag:

zenml feature-store register FEATURE_STORE_NAME --flavor FEATURE_STORE_FLAVOR -l key1=value1 -l key2=value2

As you can see from the command above, when you register a new feature store, you have to choose a flavor. To see the full list of available feature store flavors, you can use the command:

zenml feature-store flavor list

This list will show you which integration these flavors belong to and which service connectors they are adaptable with. If you would like to get additional information regarding a specific flavor, you can utilize the command:

zenml feature-store flavor describe FLAVOR_NAME

Note: Currently, ZenML only supports connecting to a Redis-backed Feast feature store as a stack component integration.

To list all feature stores available and registered for use, use the list command:

zenml feature-store list

If you want the name of the feature store in the active stack, use the get command:

zenml feature-store get

For details about a particular feature store, use the describe command. By default, (without a specific feature store name passed in) it will describe the active or currently-used feature store:

zenml feature-store describe [FEATURE_STORE_NAME]

If you wish to update/rename a feature store, you can use the following commands respectively:

zenml feature-store update FEATURE_STORE_NAME --property_to_update=new_value
zenml feature-store rename FEATURE_STORE_OLD_NAME FEATURE_STORE_NEW_NAME

To delete a feature store (and all of its contents), use the delete command:

zenml feature-store delete FEATURE_STORE_NAME

If you would like to connect/disconnect your feature store to/from a service connector, you can use the following commands:

zenml feature-store connect FEATURE_STORE_NAME -c CONNECTOR_NAME
zenml feature-store disconnect

The ZenML CLI provides a few more utility functions for you to manage your feature stores. In order to get a full list of available functions, use the command:

zenml feature-store --help

Annotators

Annotators enable the use of data annotation as part of your ZenML stack and pipelines.

By default, a default ZenML local stack will not register an annotator. If you wish to register a new annotator, do so with the register command:

zenml annotator register ANNOTATOR_NAME --flavor ANNOTATOR_FLAVOR [--ANNOTATOR_OPTIONS]

You can also add any label to your stack component using the --label or -l flag:

zenml annotator register ANNOTATOR_NAME --flavor ANNOTATOR_FLAVOR -l key1=value1 -l key2=value2

As you can see from the command above, when you register a new annotator, you have to choose a flavor. To see the full list of available annotator flavors, you can use the command:

zenml annotator flavor list

This list will show you which integration these flavors belong to and which service connectors they are adaptable with. If you would like to get additional information regarding a specific flavor, you can utilize the command:

zenml annotator flavor describe FLAVOR_NAME

To list all annotators available and registered for use, use the list command:

zenml annotator list

If you want the name of the annotator in the active stack, use the get command:

zenml annotator get

For details about a particular annotator, use the describe command. By default, (without a specific annotator name passed in) it will describe the active or currently used annotator:

zenml annotator describe [ANNOTATOR_NAME]

If you wish to update/rename an annotator, you can use the following commands respectively:

zenml annotator update ANNOTATOR_NAME --property_to_update=new_value
zenml annotator rename ANNOTATOR_OLD_NAME ANNOTATOR_NEW_NAME

To delete an annotator (and all of its contents), use the delete command:

zenml annotator delete ANNOTATOR_NAME

If you would like to connect/disconnect your annotator to/from a service connector, you can use the following commands:

zenml annotator connect ANNOTATOR_NAME -c CONNECTOR_NAME
zenml annotator disconnect

Finally, you can use the dataset command to interact with your annotation datasets:

zenml annotator dataset --help

The ZenML CLI provides a few more utility functions for you to manage your annotators. In order to get a full list of available functions, use the command:

zenml annotator --help

Image Builders

In ZenML, image builders allow you to build container images such that your machine-learning pipelines and steps can be executed in remote environments.

By default, a default ZenML local stack will not register an image builder. If you wish to register a new image builder, do so with the register command:

zenml image-builder register IMAGE_BUILDER_NAME --flavor IMAGE_BUILDER_FLAVOR [--IMAGE_BUILDER_OPTIONS]

You can also add any label to your stack component using the --label or -l flag:

zenml image-builder register IMAGE_BUILDER_NAME --flavor IMAGE_BUILDER_FLAVOR -l key1=value1 -l key2=value2

As you can see from the command above, when you register a new image builder, you have to choose a flavor. To see the full list of available image builder flavors, you can use the command:

zenml image-builder flavor list

This list will show you which integration these flavors belong to and which service connectors they are adaptable with. If you would like to get additional information regarding a specific flavor, you can utilize the command:

zenml image-builder flavor describe FLAVOR_NAME

To list all image builders available and registered for use, use the list command:

zenml image-builder list

If you want the name of the image builder in the active stack, use the get command:

zenml image-builder get

For details about a particular image builder, use the describe command. By default, (without a specific image builder name passed in) it will describe the active or currently used image builder:

zenml image-builder describe [IMAGE_BUILDER_NAME]

If you wish to update/rename an image builder, you can use the following commands respectively:

zenml image-builder update IMAGE_BUILDER_NAME --property_to_update=new_value
zenml image-builder rename IMAGE_BUILDER_OLD_NAME IMAGE_BUILDER_NEW_NAME

To delete an image builder (and all of its contents), use the delete command:

zenml image-builder delete IMAGE_BUILDER_NAME

If you would like to connect/disconnect your image builder to/from a service connector, you can use the following commands:

zenml image-builder connect IMAGE_BUILDER_NAME -c CONNECTOR_NAME
zenml image-builder disconnect

The ZenML CLI provides a few more utility functions for you to manage your image builders. In order to get a full list of available functions, use the command:

zenml image-builder --help

Model Registries

Model registries are centralized repositories that facilitate the collaboration and management of machine learning models. To get a better understanding regarding model registries as a concept, use the command:

zenml model-registry explain

By default, a default ZenML local stack will not register a model registry. If you wish to register a new model registry, do so with the register command:

zenml model-registry register MODEL_REGISTRY_NAME --flavor MODEL_REGISTRY_FLAVOR [--MODEL_REGISTRY_OPTIONS]

You can also add any label to your stack component using the --label or -l flag:

zenml model-registry register MODEL_REGISTRY_NAME --flavor MODEL_REGISTRY_FLAVOR -l key1=value1 -l key2=value2

As you can see from the command above, when you register a new model registry, you have to choose a flavor. To see the full list of available model registry flavors, you can use the command:

zenml model-registry flavor list

This list will show you which integration these flavors belong to and which service connectors they are adaptable with. If you would like to get additional information regarding a specific flavor, you can utilize the command:

zenml model-registry flavor describe FLAVOR_NAME

To list all model registries available and registered for use, use the list command:

zenml model-registry list

If you want the name of the model registry in the active stack, use the get command:

zenml model-registry get

For details about a particular model registry, use the describe command. By default, (without a specific model registry name passed in) it will describe the active or currently used model registry:

zenml model-registry describe [MODEL_REGISTRY_NAME]

If you wish to update/rename a model registry, you can use the following commands respectively:

zenml model-registry update MODEL_REGISTRY_NAME --property_to_update=new_value
zenml model-registry rename MODEL_REGISTRY_OLD_NAME MODEL_REGISTRY_NEW_NAME

To delete a model registry (and all of its contents), use the delete command:

zenml model-registry delete MODEL_REGISTRY_NAME

If you would like to connect/disconnect your model registry to/from a service connector, you can use the following commands:

zenml model-registry connect MODEL_REGISTRY_NAME -c CONNECTOR_NAME
zenml model-registry disconnect

The ZenML CLI provides a few more utility functions for you to manage your model registries. In order to get a full list of available functions, use the command:

zenml model-registry --help

Managing your Stacks

The stack is a grouping of your artifact store, your orchestrator, and other optional MLOps tools like experiment trackers or model deployers. With the ZenML tool, switching from a local stack to a distributed cloud environment can be accomplished with just a few CLI commands.

To register a new stack, you must already have registered the individual components of the stack using the commands listed above.

Use the zenml stack register command to register your stack. It takes the stack name along with flags identifying its components, as in the following example:

zenml stack register STACK_NAME -a ARTIFACT_STORE_NAME -o ORCHESTRATOR_NAME

Each corresponding argument should be the name, id or even the first few letters of the id that uniquely identify the artifact store or orchestrator.
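For example, assuming you have already registered an artifact store named s3_store and an orchestrator named k8s_orchestrator (both names are illustrative), you could run:

zenml stack register my_stack -a s3_store -o k8s_orchestrator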

To create a new stack with a minimal set of components using a newly created service connector for a given cloud provider, use the following command:

zenml stack register STACK_NAME -p CLOUD_PROVIDER

To create a new stack with a minimal set of components using an existing service connector, use the following command:

zenml stack register STACK_NAME -sc SERVICE_CONNECTOR_NAME

To create a new stack using an existing service connector together with existing components (importantly, the components must already be connected to the service connector), use the following command:

zenml stack register STACK_NAME -sc SERVICE_CONNECTOR_NAME -a ARTIFACT_STORE_NAME -o ORCHESTRATOR_NAME ...

If you want to immediately set this newly created stack as your active stack, simply pass along the --set flag.

zenml stack register STACK_NAME ... --set

To list the stacks that you have registered within your current ZenML workspace, type:

zenml stack list

To delete a stack that you have previously registered, type:

zenml stack delete STACK_NAME

By default, ZenML uses a local stack whereby all pipelines run on your local computer. If you wish to set a different stack as the current active stack to be used when running your pipeline, type:

zenml stack set STACK_NAME

This changes a configuration property within your local environment.

To see which stack is currently set as the default active stack, type:

zenml stack get

If you want to copy a stack, run the following command:

zenml stack copy SOURCE_STACK_NAME TARGET_STACK_NAME

If you wish to transfer one of your stacks to another machine, you can do so by exporting the stack configuration and then importing it again.

To export a stack to YAML, run the following command:

zenml stack export STACK_NAME FILENAME.yaml

This will create a FILENAME.yaml containing the config of your stack and all of its components, which you can then import again like this:

zenml stack import STACK_NAME -f FILENAME.yaml

If you wish to update a stack that you have already registered, first make sure you have registered whatever components you want to use, then use the following command:

# assuming that you have already registered a new orchestrator
# with NEW_ORCHESTRATOR_NAME
zenml stack update STACK_NAME -o NEW_ORCHESTRATOR_NAME

You can update one or many stack components at the same time out of the ones that ZenML supports. To see the full list of options for updating a stack, use the following command:

zenml stack update --help

To remove a stack component from a stack, use the following command:

# assuming you want to remove the image builder and the feature-store
# from your stack
zenml stack remove-component -i -f

If you wish to rename your stack, use the following command:

zenml stack rename STACK_NAME NEW_STACK_NAME

If you would like to export the requirements of your stack, you can use the command:

zenml stack export-requirements <STACK_NAME>

If you want to copy a stack component, run the following command:

zenml STACK_COMPONENT copy SOURCE_COMPONENT_NAME TARGET_COMPONENT_NAME

If you wish to update a specific stack component, use the following command, switching out "STACK_COMPONENT" for the component you wish to update (i.e. 'orchestrator' or 'artifact-store' etc.):

zenml STACK_COMPONENT update --some_property=NEW_VALUE

Note that you are not permitted to update the stack name or UUID in this way. To change the name of your stack component, use the following command:

zenml STACK_COMPONENT rename STACK_COMPONENT_NAME NEW_STACK_COMPONENT_NAME

If you wish to remove an attribute (or multiple attributes) from a stack component, use the following command:

zenml STACK_COMPONENT remove-attribute STACK_COMPONENT_NAME ATTRIBUTE_NAME [OTHER_ATTRIBUTE_NAME]

Note that you can only remove optional attributes.

If you want to register secrets for all secret references in a stack, use the following command:

zenml stack register-secrets [<STACK_NAME>]

If you want to connect a service connector to a stack's components, you can use the connect command:

zenml stack connect STACK_NAME -c CONNECTOR_NAME

Note that this only connects the service connector to the current components of the stack and not to the stack itself, which means that you need to rerun the command after adding new components to the stack.

The ZenML CLI provides a few more utility functions for you to manage your stacks. In order to get a full list of available functions, use the command:

zenml stack --help

Managing your Models

ZenML provides several CLI commands to help you administer your models and their versions as part of the Model Control Plane.

To register a new model, you can use the following CLI command:

zenml model register --name <NAME> [--MODEL_OPTIONS]
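For example (the model name and tags are illustrative):

zenml model register --name my_classifier --tag classification --tag sklearn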

To list all registered models, use:

zenml model list [MODEL_FILTER_OPTIONS]

To update a model, use:

zenml model update <MODEL_NAME_OR_ID> [--MODEL_OPTIONS]

If you would like to add or remove tags from the model, use:

zenml model update <MODEL_NAME_OR_ID> --tag <TAG> --tag <TAG> .. --remove-tag <TAG> --remove-tag <TAG> ..

To delete a model, use:

zenml model delete <MODEL_NAME_OR_ID>

The CLI interface for models also helps you navigate through the artifacts linked to a specific model version:

zenml model data_artifacts <MODEL_NAME_OR_ID> [-v <VERSION>]
zenml model deployment_artifacts <MODEL_NAME_OR_ID> [-v <VERSION>]
zenml model model_artifacts <MODEL_NAME_OR_ID> [-v <VERSION>]

You can also navigate the pipeline runs linked to a specific model version:

zenml model runs <MODEL_NAME_OR_ID> [-v <VERSION>]

To list the model versions of a specific model, use:

zenml model version list [--model-name <MODEL_NAME> --name <MODEL_VERSION_NAME> OTHER_OPTIONS]

To delete a model version, use:

zenml model version delete <MODEL_NAME_OR_ID> <VERSION>

To update a model version, use:

zenml model version update <MODEL_NAME_OR_ID> <VERSION> [--MODEL_VERSION_OPTIONS]

These are some of the more common uses of model version updates:

  • stage (i.e. promotion):
zenml model version update <MODEL_NAME_OR_ID> <VERSION> --stage <STAGE>
  • tags:
zenml model version update <MODEL_NAME_OR_ID> <VERSION> --tag <TAG> --tag <TAG> .. --remove-tag <TAG> --remove-tag <TAG> ..

Managing your Pipelines & Artifacts

ZenML provides several CLI commands to help you administer your pipelines and pipeline runs.

To explicitly register a pipeline you need to point to a pipeline instance in your Python code. Let's say you have a Python file called run.py and it contains the following code:

from zenml import pipeline

@pipeline
def my_pipeline(...):
   # Connect your pipeline steps here
   pass
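
For reference, a minimal runnable sketch of such a file might look like this (the extra step and its return value are purely illustrative):

from zenml import pipeline, step

@step
def load_data() -> dict:
    # A trivial illustrative step
    return {"value": 42}

@pipeline
def my_pipeline():
    load_data()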

You can register your pipeline like this:

zenml pipeline register run.my_pipeline

To list all registered pipelines, use:

zenml pipeline list

To delete a pipeline, run:

zenml pipeline delete <PIPELINE_NAME>

This will delete the pipeline and change all corresponding pipeline runs to become unlisted (not linked to any pipeline).

To list all pipeline runs that you have executed, use:

zenml pipeline runs list

To delete a pipeline run, use:

zenml pipeline runs delete <PIPELINE_RUN_NAME_OR_ID>

To refresh the status of a pipeline run, you can use the refresh command (only supported for pipelines executed on Vertex, SageMaker, or AzureML):

zenml pipeline runs refresh <PIPELINE_RUN_NAME_OR_ID>

If you run any of your pipelines with pipeline.run(schedule=...), ZenML keeps track of the schedule and you can list all schedules via:

zenml pipeline schedule list
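The schedule itself is attached in Python when you run the pipeline. A minimal sketch, following the pipeline.run(schedule=...) form mentioned above and assuming the Schedule class from zenml.config.schedule (the cron expression is illustrative):

from zenml.config.schedule import Schedule

# Run the pipeline every 15 minutes (illustrative cron expression)
my_pipeline.run(schedule=Schedule(cron_expression="*/15 * * * *"))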

To delete a schedule, use:

zenml pipeline schedule delete <SCHEDULE_NAME_OR_ID>

Note, however, that this will only delete the reference saved in ZenML and does NOT stop/delete the schedule in the respective orchestrator. This still needs to be done manually. For example, when using the Airflow orchestrator you would have to open the web UI and manually stop the schedule from executing.

Each pipeline run automatically saves its artifacts in the artifact store. To list all artifacts that have been saved, use:

zenml artifact list

Each artifact has one or several versions. To list artifact versions, use:

zenml artifact version list

If you would like to rename an artifact or adjust the tags of an artifact or artifact version, use the corresponding update command:

zenml artifact update <NAME> -n <NEW_NAME>
zenml artifact update <NAME> -t <TAG1> -t <TAG2> -r <TAG_TO_REMOVE>
zenml artifact version update <NAME> -v <VERSION> -t <TAG1> -t <TAG2> -r <TAG_TO_REMOVE>

The metadata of artifacts or artifact versions stored by ZenML can only be deleted once they are no longer used by any pipeline runs. I.e., an artifact version can only be deleted if the run that produced it and all runs that used it as an input have been deleted. Similarly, an artifact can only be deleted if all its versions can be deleted.

To delete all artifacts and artifact versions that are no longer linked to any pipeline runs, use:

zenml artifact prune

You might find that some artifacts throw errors when you try to prune them, likely because they were stored locally and no longer exist. If you wish to continue pruning and to ignore these errors, please add the --ignore-errors flag. Warning messages will still be output to the terminal during this process.

Each pipeline run that requires Docker images also stores a build which contains the image names used for this run. To list all builds, use:

zenml pipeline builds list

To delete a specific build, use:

zenml pipeline builds delete <BUILD_ID>

Managing the local ZenML Dashboard

The ZenML dashboard is a web-based UI that allows you to visualize and navigate the stack configurations, pipelines and pipeline runs tracked by ZenML among other things. You can start the ZenML dashboard locally by running the following command:

zenml login --local

This will start the dashboard on your local machine where you can access it at the URL printed to the console.

If you have closed the dashboard in your browser and want to open it again, you can run:

zenml show

If you want to stop the dashboard, simply run:

zenml logout --local

The zenml login --local command has a few additional options that you can use to customize how the ZenML dashboard is running.

By default, the dashboard is started as a background process. On some operating systems, this capability is not available. In this case, you can use the --blocking flag to start the dashboard in the foreground:

zenml login --local --blocking

This will block the terminal until you stop the dashboard with CTRL-C.

Another option you can use, if you have Docker installed on your machine, is to run the dashboard in a Docker container. This is useful if you don't want to install all the ZenML server dependencies on your machine. To do so, simply run:

zenml login --local --docker

The TCP port and the host address that the dashboard uses to listen for connections can also be customized. Using an IP address that is not the default localhost or 127.0.0.1 is especially useful if you're running some type of local ZenML orchestrator, such as the k3d Kubeflow orchestrator or Docker orchestrator, that cannot directly connect to the local ZenML server.

For example, to start the dashboard on port 9000 and have it listen on all locally available interfaces on your machine, run:

zenml login --local --port 9000 --ip-address 0.0.0.0

Note that the above 0.0.0.0 IP address also exposes your ZenML dashboard externally through your public interface. Alternatively, you can choose an explicit IP address that is configured on one of your local interfaces, such as the Docker bridge interface, which usually has the IP address 172.17.0.1:

zenml login --local --port 9000 --ip-address 172.17.0.1

If you would like to take a look at the logs for the local ZenML server:

zenml logs

Connecting to a ZenML Server

The ZenML client can be configured to connect to a local ZenML server, a remote database or a remote ZenML server with the zenml login command.

To connect or re-connect to any ZenML server, if you know its URL, you can simply run:

zenml login https://zenml.example.com:8080

Running zenml login without any arguments will check if your current ZenML server session is still valid. If it is not, you will be prompted to log in again. This is useful if you quickly want to refresh your CLI session when the current session expires.

You can open the ZenML dashboard of your currently connected ZenML server using the following command:

zenml server show

Note that if you have set your AUTO_OPEN_DASHBOARD environment variable to false then this will not open the dashboard until you set it back to true.

The CLI can be authenticated to multiple ZenML servers at the same time, even though it can only be connected to one server at a time. You can list all the ZenML servers that the client is currently authenticated to by running:

zenml server list

To disconnect from the current ZenML server and revert to using the local default database, use the following command:

zenml logout

You can inspect the current ZenML configuration at any given time using the following command:

zenml status

Example output:

$ zenml status
-----ZenML Client Status-----
Connected to a ZenML Pro server: `test-zenml-login` [16f8a35d-5c2f-44aa-a564-b34186fbf6d6]
  ZenML Pro Organization: My Organization
  ZenML Pro authentication: valid until 2024-10-26 10:18:51 CEST (in 20h37m9s)
  Dashboard: https://cloud.zenml.io/organizations/bf873af9-aaf9-4ad1-a08e-3dc6d920d590/tenants/16f8336d-5c2f-44aa-a534-b34186fbf6d6
  API: https://6784e58f-zenml.staging.cloudinfra.zenml.io
  Server status: 'available'
  Server authentication: never expires
  The active user is: 'user'
  The active stack is: 'default' (global)
Using configuration from: '/home/user/.config/zenml'
Local store files are located at: '/home/user/.config/zenml/local_stores'

-----Local ZenML Server Status-----
The local daemon server is running at: http://127.0.0.1:8237

When connecting to a ZenML server using the web login flow, you will be provided with the option to Trust this device. If you opt out, a 24-hour token will be issued for the authentication service. If you opt in, you will be issued a 30-day token instead.

If you would like to see a list of all trusted devices, you can use:

zenml authorized-device list

or if you would like to get the details regarding a specific device, you can use:

zenml authorized-device describe DEVICE_ID_OR_PREFIX

Alternatively, you can lock and unlock an authorized device by using the following commands:

zenml authorized-device lock DEVICE_ID_OR_PREFIX
zenml authorized-device unlock DEVICE_ID_OR_PREFIX

Finally, you can remove an authorized device by using the delete command:

zenml authorized-device delete DEVICE_ID_OR_PREFIX

Secrets management

ZenML offers a way to securely store secrets associated with your other stack components and infrastructure. A ZenML Secret is a collection or grouping of key-value pairs stored by the ZenML secrets store. ZenML Secrets are identified by a unique name which allows you to fetch or reference them in your pipelines and stacks.

Depending on how you set up and deployed ZenML, the secrets store keeps secrets in the local database or uses the ZenML server your client is connected to:

  • if you are using the default ZenML client settings, or if you connect your ZenML client to a local ZenML server started with zenml login --local, the secrets store is using the same local SQLite database as the rest of ZenML
  • if you connect your ZenML client to a remote ZenML server, the secrets are no longer managed on your local machine, but through the remote server instead. Secrets are stored in whatever secrets store back-end the remote server is configured to use. This can be a SQL database, one of the managed cloud secrets management services, or even a custom back-end.

To create a secret, use the create command and pass the key-value pairs as command-line arguments:

zenml secret create SECRET_NAME --key1=value1 --key2=value2 --key3=value3 ...

# Another option is to use the '--values' option and provide key-value pairs in either JSON or YAML format.
zenml secret create SECRET_NAME --values='{"key1":"value2","key2":"value2","key3":"value3"}'

Note that when using the previous command the keys and values will be preserved in your bash_history file, so you may prefer to use the interactive create command instead:

zenml secret create SECRET_NAME -i

As an alternative to the interactive mode, and especially useful for values that are long or contain newlines or special characters, you can use the special @ syntax to indicate to ZenML that the value needs to be read from a file:

zenml secret create SECRET_NAME --aws_access_key_id=1234567890 --aws_secret_access_key=abcdefghij --aws_session_token=@/path/to/token.txt

# Alternatively for providing key-value pairs, you can utilize the '--values' option by specifying a file path containing
# key-value pairs in either JSON or YAML format.
zenml secret create SECRET_NAME --values=@/path/to/token.txt

To list all the secrets available, use the list command:

zenml secret list

To get the key-value pairs for a particular secret, use the get command:

zenml secret get SECRET_NAME

To update a secret, use the update command:

zenml secret update SECRET_NAME --key1=value1 --key2=value2 --key3=value3 ...

# Another option is to use the '--values' option and provide key-value pairs in either JSON or YAML format.
zenml secret update SECRET_NAME --values='{"key1":"value2","key2":"value2","key3":"value3"}'

Note that when using the previous command the keys and values will be preserved in your bash_history file, so you may prefer to use the interactive update command instead:

zenml secret update SECRET_NAME -i

Finally, to delete a secret, use the delete command:

zenml secret delete SECRET_NAME

Secrets can be scoped to a workspace or a user. By default, secrets are scoped to the current workspace. To scope a secret to a user, use the --scope user argument in the create command.

Auth management

Building and maintaining an MLOps workflow can involve numerous third-party libraries and external services. In most cases, this ultimately presents a challenge in configuring uninterrupted, secure access to infrastructure resources. In ZenML, Service Connectors streamline this process by abstracting away the complexity of authentication and help you connect your stack to your resources. You can find the full docs on the ZenML service connectors here.

The ZenML CLI features a variety of commands to help you manage your service connectors. First of all, to explore all the types of service connectors available in ZenML, you can use the following commands:

# To get the complete list
zenml service-connector list-types

# To get the details regarding a single type
zenml service-connector describe-type TYPE

For each type of service connector, you will also see a list of supported resource types. These types provide a way for organizing different resources into logical classes based on the standard and/or protocol used to access them. In addition to the resource types, each type will feature a different set of authentication methods.

Once you have decided which service connector to use, you can create it with the register command as follows:

zenml service-connector register SERVICE_CONNECTOR_NAME --type TYPE [--description DESCRIPTION] [--resource-type RESOURCE_TYPE] [--auth-method AUTH_METHOD] ...

For more details on how to create a service connector, please refer to our docs.
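
As an illustrative, non-authoritative example, an AWS service connector using long-lived credentials might be registered roughly like this (the exact option names depend on the connector type and authentication method you pick):

zenml service-connector register aws-connector --type aws --auth-method secret-key \
    --aws_access_key_id=<ACCESS_KEY_ID> --aws_secret_access_key=<SECRET_ACCESS_KEY> \
    --region=<REGION>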

To check whether your service connector is registered properly, you can verify it. This both checks that it is configured correctly and fetches the list of resources it has access to:

zenml service-connector verify SERVICE_CONNECTOR_NAME_ID_OR_PREFIX

Some service connectors can also configure the clients and SDKs on your local machine with the credentials inferred from the connector. To use this functionality, use the login command:

zenml service-connector login SERVICE_CONNECTOR_NAME_ID_OR_PREFIX

To list all the service connectors that you have registered, you can use:

zenml service-connector list

Moreover, if you would like to list all the resources accessible by your service connectors, you can use the following command:

zenml service-connector list-resources [--resource-type RESOURCE_TYPE] \
    [--connector-type CONNECTOR_TYPE] ...

This command can take a long time, depending on the number of service connectors you have registered, so consider applying the right filters when listing resources.

If you want to see the details about a specific service connector that you have registered, you can use the describe command:

zenml service-connector describe SERVICE_CONNECTOR_NAME_ID_OR_PREFIX

You can update a registered service connector by using the update command. Keep in mind that all service connector updates are validated before being applied. If you want to disable this behavior, use the --no-verify flag.

zenml service-connector update SERVICE_CONNECTOR_NAME_ID_OR_PREFIX ...

Finally, if you wish to remove a service connector, you can use the delete command:

zenml service-connector delete SERVICE_CONNECTOR_NAME_ID_OR_PREFIX

Managing users

When using a ZenML server, you can manage permissions by managing users through the CLI. To create a new user or delete an existing one, run one of the following:

zenml user create USER_NAME
zenml user delete USER_NAME

To see a list of all users, run:

zenml user list

For details about a particular user, use the describe command. By default (without a specific user name passed in), it will describe the active user:

zenml user describe [USER_NAME]

If you want to update any properties of a specific user, you can use the update command. Use the --help flag to get a full list of available properties to update:

zenml user update --help

If you want to change the password of the current user account:

zenml user change-password --help

Service Accounts

ZenML supports the use of service accounts to authenticate clients to the ZenML server using API keys. This is useful for automating tasks such as running pipelines or deploying models.

To create a new service account, run:

zenml service-account create SERVICE_ACCOUNT_NAME

This command creates a service account and an API key for it. The API key is displayed as part of the command output and cannot be retrieved later. You can then use the issued API key to connect your ZenML client to the server with the CLI:

zenml login https://... --api-key

or by setting the ZENML_STORE_URL and ZENML_STORE_API_KEY environment variables when you set up your ZenML client for the first time:

export ZENML_STORE_URL=https://...
export ZENML_STORE_API_KEY=<API_KEY>

You don't need to run zenml login after setting these two environment variables and can start interacting with your server right away.

To see all the service accounts you've created and their API keys, use the following commands:

zenml service-account list
zenml service-account api-key <SERVICE_ACCOUNT_NAME> list

Additionally, the following commands allow you to inspect an individual service account or API key in more detail:

zenml service-account describe <SERVICE_ACCOUNT_NAME>
zenml service-account api-key <SERVICE_ACCOUNT_NAME> describe <API_KEY_NAME>

API keys don't have an expiration date. For increased security, we recommend that you regularly rotate the API keys to prevent unauthorized access to your ZenML server. You can do this with the ZenML CLI:

zenml service-account api-key <SERVICE_ACCOUNT_NAME> rotate <API_KEY_NAME>

Running this command will create a new API key and invalidate the old one. The new API key is displayed as part of the command output and cannot be retrieved later. You can then use the new API key to connect your ZenML client to the server just as described above.

When rotating an API key, you can also configure a retention period for the old API key. This is useful if you need to keep the old API key for a while to ensure that all your workloads have been updated to use the new API key. You can do this with the --retain flag. For example, to rotate an API key and keep the old one for 60 minutes, you can run the following command:

zenml service-account api-key <SERVICE_ACCOUNT_NAME> rotate <API_KEY_NAME> --retain 60

For increased security, you can deactivate a service account or an API key using one of the following commands:

zenml service-account update <SERVICE_ACCOUNT_NAME> --active false
zenml service-account api-key <SERVICE_ACCOUNT_NAME> update <API_KEY_NAME> --active false

Deactivating a service account or an API key will prevent it from being used to authenticate and has immediate effect on all workloads that use it.

To permanently delete an API key for a service account, use the following command:

zenml service-account api-key <SERVICE_ACCOUNT_NAME> delete <API_KEY_NAME>

Managing Code Repositories

Code repositories enable ZenML to keep track of the code version that you use for your pipeline runs. Additionally, running a pipeline which is tracked in a registered code repository can decrease the time it takes Docker to build images for containerized stack components.

To register a code repository, use the following CLI command:

zenml code-repository register <NAME> --type=<CODE_REPOSITORY_TYPE> [--CODE_REPOSITORY_OPTIONS]

ZenML currently supports code repositories of type github and gitlab, but you can also use a custom code repository implementation by passing the type custom and the source of your repository class:

zenml code-repository register <NAME> --type=custom --source=<CODE_REPOSITORY_SOURCE> [--CODE_REPOSITORY_OPTIONS]

The CODE_REPOSITORY_OPTIONS depend on the configuration necessary for the type of code repository that you're using.
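
For instance, a GitHub code repository registration might look roughly like the following; the option names here are assumptions, so check zenml code-repository register --help for the authoritative list:

zenml code-repository register my_github_repo --type=github \
    --owner=<GITHUB_OWNER> --repository=<REPOSITORY_NAME> --token=<GITHUB_TOKEN>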

If you want to list your registered code repositories, run:

zenml code-repository list

You can delete one of your registered code repositories like this:

zenml code-repository delete <REPOSITORY_NAME_OR_ID>

Building an image without Runs

To build or run a pipeline from the CLI, you need to know the source path of your pipeline. Let's imagine you have defined your pipeline in a Python file called run.py like this:

from zenml import pipeline

@pipeline
def my_pipeline(...):
   # Connect your pipeline steps here
   pass

The source path of your pipeline will be run.my_pipeline. More generally, this is <MODULE_PATH>.<PIPELINE_FUNCTION_NAME>. If the Python file defining the pipeline is not in your current directory, the module path consists of the full path to the file, separated by dots, e.g. some_directory.some_file.my_pipeline.

To build Docker images for your pipeline without actually running the pipeline, use:

zenml pipeline build <PIPELINE_SOURCE_PATH>
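
For example, with the run.py file from above, the command becomes:

zenml pipeline build run.my_pipeline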

To specify settings for the Docker builds, use the --config/-c option of the command. For more information about the structure of this configuration file, check out the zenml.pipelines.base_pipeline.BasePipeline.build(...) method.

zenml pipeline build <PIPELINE_SOURCE_PATH> --config=<PATH_TO_CONFIG_YAML>

If you want to build the pipeline for a stack other than your current active stack, use the --stack option.

zenml pipeline build <PIPELINE_SOURCE_PATH> --stack=<STACK_ID_OR_NAME>

To run a pipeline that was previously registered, use:

zenml pipeline run <PIPELINE_SOURCE_PATH>

To specify settings for the pipeline, use the --config/-c option of the command. For more information about the structure of this configuration file, check out the zenml.pipelines.base_pipeline.BasePipeline.run(...) method.

zenml pipeline run <PIPELINE_SOURCE_PATH> --config=<PATH_TO_CONFIG_YAML>

If you want to run the pipeline on a stack different than your current active stack, use the --stack option.

zenml pipeline run <PIPELINE_SOURCE_PATH> --stack=<STACK_ID_OR_NAME>

If you want to create a run template based on your pipeline that can later be used to trigger a run either from the dashboard or through an HTTP request:

zenml pipeline create-run-template <PIPELINE_SOURCE_PATH> --name=<TEMPLATE_NAME>

To specify a config file, use the `--config/-c` option. If you would like to use a different stack than the active one, use the `--stack` option.

zenml pipeline create-run-template <PIPELINE_SOURCE_PATH> --name=<TEMPLATE_NAME> --config=<PATH_TO_CONFIG_YAML> --stack=<STACK_ID_OR_NAME>

Tagging your resources with ZenML

When you are using ZenML, you can use tags to organize and categorize your assets. This helps you streamline your workflows and makes your resources easier to discover.

Currently, you can use tags with artifacts, models and their versions:

# Tag the artifact
zenml artifact update ARTIFACT_NAME -t TAG_NAME

# Tag the artifact version
zenml artifact version update ARTIFACT_NAME ARTIFACT_VERSION -t TAG_NAME

# Tag an existing model
zenml model update MODEL_NAME --tag TAG_NAME

# Tag a specific model version
zenml model version update MODEL_NAME VERSION_NAME --tag TAG_NAME

Besides these interactions, you can also create a new tag by using the register command:

zenml tag register -n TAG_NAME [-c COLOR]

If you would like to list all the tags that you have, you can use the command:

zenml tag list

To update the properties of a specific tag, you can use the update subcommand:

zenml tag update TAG_NAME_OR_ID [-n NEW_NAME] [-c NEW_COLOR]

Finally, in order to delete a tag, you can execute:

zenml tag delete TAG_NAME_OR_ID

Managing the Global Configuration

The ZenML global configuration CLI commands cover options such as enabling or disabling the collection of anonymous usage statistics and changing the logging verbosity.

In order to help us better understand how the community uses ZenML, the library reports anonymized usage statistics. You can always opt out by using the CLI command:

zenml analytics opt-out

If you want to opt back in, use the following command:

zenml analytics opt-in

The verbosity of the ZenML client output can be configured using the zenml logging command. For example, to set the verbosity to DEBUG, run:

zenml logging set-verbosity DEBUG
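
If you prefer configuring this via the environment (for example in CI), ZenML also reads a corresponding environment variable; the exact variable name below is an assumption based on the client configuration and may differ between versions:

export ZENML_LOGGING_VERBOSITY=DEBUG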

APIKeyFilter

Bases: BaseFilter

Filter model for API keys.

Source code in src/zenml/models/v2/core/api_key.py
class APIKeyFilter(BaseFilter):
    """Filter model for API keys."""

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *BaseFilter.FILTER_EXCLUDE_FIELDS,
        "service_account",
    ]
    CLI_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *BaseFilter.CLI_EXCLUDE_FIELDS,
        "service_account",
    ]

    service_account: Optional[UUID] = Field(
        default=None,
        description="The service account to scope this query to.",
    )
    name: Optional[str] = Field(
        default=None,
        description="Name of the API key",
    )
    description: Optional[str] = Field(
        default=None,
        title="Filter by the API key description.",
    )
    active: Optional[Union[bool, str]] = Field(
        default=None,
        title="Whether the API key is active.",
        union_mode="left_to_right",
    )
    last_login: Optional[Union[datetime, str]] = Field(
        default=None,
        title="Time when the API key was last used to log in.",
        union_mode="left_to_right",
    )
    last_rotated: Optional[Union[datetime, str]] = Field(
        default=None,
        title="Time when the API key was last rotated.",
        union_mode="left_to_right",
    )

    def set_service_account(self, service_account_id: UUID) -> None:
        """Set the service account by which to scope this query.

        Args:
            service_account_id: The service account ID.
        """
        self.service_account = service_account_id

    def apply_filter(
        self,
        query: AnyQuery,
        table: Type["AnySchema"],
    ) -> AnyQuery:
        """Override to apply the service account scope as an additional filter.

        Args:
            query: The query to which to apply the filter.
            table: The query table.

        Returns:
            The query with filter applied.
        """
        query = super().apply_filter(query=query, table=table)

        if self.service_account:
            scope_filter = (
                getattr(table, "service_account_id") == self.service_account
            )
            query = query.where(scope_filter)

        return query

apply_filter(query, table)

Override to apply the service account scope as an additional filter.

Parameters:

- query (AnyQuery): The query to which to apply the filter. (required)
- table (Type[AnySchema]): The query table. (required)

Returns:

- AnyQuery: The query with filter applied.

Source code in src/zenml/models/v2/core/api_key.py
def apply_filter(
    self,
    query: AnyQuery,
    table: Type["AnySchema"],
) -> AnyQuery:
    """Override to apply the service account scope as an additional filter.

    Args:
        query: The query to which to apply the filter.
        table: The query table.

    Returns:
        The query with filter applied.
    """
    query = super().apply_filter(query=query, table=table)

    if self.service_account:
        scope_filter = (
            getattr(table, "service_account_id") == self.service_account
        )
        query = query.where(scope_filter)

    return query

set_service_account(service_account_id)

Set the service account by which to scope this query.

Parameters:

- service_account_id (UUID): The service account ID. (required)
Source code in src/zenml/models/v2/core/api_key.py
def set_service_account(self, service_account_id: UUID) -> None:
    """Set the service account by which to scope this query.

    Args:
        service_account_id: The service account ID.
    """
    self.service_account = service_account_id
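
As a minimal usage sketch (assuming APIKeyFilter is re-exported via zenml.models and that you already know the service account's UUID, which is a placeholder below), a filter for active API keys could be assembled like this before handing it to a store or client listing call:

from uuid import UUID

from zenml.models import APIKeyFilter  # assumed re-export path

# Filter for active API keys whose name contains "ci",
# scoped to a known service account (placeholder UUID).
api_key_filter = APIKeyFilter(name="contains:ci", active=True)
api_key_filter.set_service_account(
    UUID("00000000-0000-0000-0000-000000000000")
)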

AnalyticsEvent

Bases: str, Enum

Enum of events to track in segment.

Source code in src/zenml/analytics/enums.py
class AnalyticsEvent(str, Enum):
    """Enum of events to track in segment."""

    # Login
    DEVICE_VERIFIED = "Device verified"

    # Onboarding
    USER_ENRICHED = "User Enriched"

    # Pipelines
    RUN_PIPELINE = "Pipeline run"
    RUN_PIPELINE_ENDED = "Pipeline run ended"
    CREATE_PIPELINE = "Pipeline created"
    BUILD_PIPELINE = "Pipeline built"

    # Template
    GENERATE_TEMPLATE = "Template generated"

    # Components
    REGISTERED_STACK_COMPONENT = "Stack component registered"

    # Code repository
    REGISTERED_CODE_REPOSITORY = "Code repository registered"

    # Stack
    REGISTERED_STACK = "Stack registered"
    UPDATED_STACK = "Stack updated"

    # Trigger
    CREATED_TRIGGER = "Trigger created"
    UPDATED_TRIGGER = "Trigger updated"

    # Templates
    CREATED_RUN_TEMPLATE = "Run template created"
    EXECUTED_RUN_TEMPLATE = "Run templated executed"

    # Model Control Plane
    MODEL_DEPLOYED = "Model deployed"
    CREATED_MODEL = "Model created"
    CREATED_MODEL_VERSION = "Model Version created"

    # Analytics opt in and out
    OPT_IN_ANALYTICS = "Analytics opt-in"
    OPT_OUT_ANALYTICS = "Analytics opt-out"
    OPT_IN_OUT_EMAIL = "Response for Email prompt"

    # Examples
    RUN_ZENML_GO = "ZenML go"

    # Workspaces
    CREATED_WORKSPACE = "Workspace created"

    # Flavor
    CREATED_FLAVOR = "Flavor created"

    # Secret
    CREATED_SECRET = "Secret created"

    # Service connector
    CREATED_SERVICE_CONNECTOR = "Service connector created"

    # Service account and API keys
    CREATED_SERVICE_ACCOUNT = "Service account created"

    # Full stack infrastructure deployment
    DEPLOY_FULL_STACK = "Full stack deployed"

    # Tag created
    CREATED_TAG = "Tag created"

    # Server Settings
    SERVER_SETTINGS_UPDATED = "Server Settings Updated"

AnalyticsEventSource

Bases: StrEnum

Enum to identify analytics events source.

Source code in src/zenml/enums.py
class AnalyticsEventSource(StrEnum):
    """Enum to identify analytics events source."""

    ZENML_GO = "zenml go"
    ZENML_INIT = "zenml init"
    ZENML_SERVER = "zenml server"

ArtifactFilter

Bases: TaggableFilter

Model to enable advanced filtering of artifacts.

Source code in src/zenml/models/v2/core/artifact.py
class ArtifactFilter(TaggableFilter):
    """Model to enable advanced filtering of artifacts."""

    name: Optional[str] = None
    has_custom_name: Optional[bool] = None

    CUSTOM_SORTING_OPTIONS: ClassVar[List[str]] = [
        *TaggableFilter.CUSTOM_SORTING_OPTIONS,
        SORT_BY_LATEST_VERSION_KEY,
    ]

    def apply_sorting(
        self,
        query: AnyQuery,
        table: Type["AnySchema"],
    ) -> AnyQuery:
        """Apply sorting to the query for Artifacts.

        Args:
            query: The query to which to apply the sorting.
            table: The query table.

        Returns:
            The query with sorting applied.
        """
        from sqlmodel import asc, case, col, desc, func, select

        from zenml.enums import SorterOps
        from zenml.zen_stores.schemas import (
            ArtifactSchema,
            ArtifactVersionSchema,
        )

        sort_by, operand = self.sorting_params

        if sort_by == SORT_BY_LATEST_VERSION_KEY:
            # Subquery to find the latest version per artifact
            latest_version_subquery = (
                select(
                    ArtifactSchema.id,
                    case(
                        (
                            func.max(ArtifactVersionSchema.created).is_(None),
                            ArtifactSchema.created,
                        ),
                        else_=func.max(ArtifactVersionSchema.created),
                    ).label("latest_version_created"),
                )
                .outerjoin(
                    ArtifactVersionSchema,
                    ArtifactSchema.id == ArtifactVersionSchema.artifact_id,  # type: ignore[arg-type]
                )
                .group_by(col(ArtifactSchema.id))
                .subquery()
            )

            query = query.add_columns(
                latest_version_subquery.c.latest_version_created,
            ).where(ArtifactSchema.id == latest_version_subquery.c.id)

            # Apply sorting based on the operand
            if operand == SorterOps.ASCENDING:
                query = query.order_by(
                    asc(latest_version_subquery.c.latest_version_created),
                    asc(ArtifactSchema.id),
                )
            else:
                query = query.order_by(
                    desc(latest_version_subquery.c.latest_version_created),
                    desc(ArtifactSchema.id),
                )
            return query

        # For other sorting cases, delegate to the parent class
        return super().apply_sorting(query=query, table=table)

apply_sorting(query, table)

Apply sorting to the query for Artifacts.

Parameters:

- query (AnyQuery): The query to which to apply the sorting. (required)
- table (Type[AnySchema]): The query table. (required)

Returns:

- AnyQuery: The query with sorting applied.

Source code in src/zenml/models/v2/core/artifact.py
def apply_sorting(
    self,
    query: AnyQuery,
    table: Type["AnySchema"],
) -> AnyQuery:
    """Apply sorting to the query for Artifacts.

    Args:
        query: The query to which to apply the sorting.
        table: The query table.

    Returns:
        The query with sorting applied.
    """
    from sqlmodel import asc, case, col, desc, func, select

    from zenml.enums import SorterOps
    from zenml.zen_stores.schemas import (
        ArtifactSchema,
        ArtifactVersionSchema,
    )

    sort_by, operand = self.sorting_params

    if sort_by == SORT_BY_LATEST_VERSION_KEY:
        # Subquery to find the latest version per artifact
        latest_version_subquery = (
            select(
                ArtifactSchema.id,
                case(
                    (
                        func.max(ArtifactVersionSchema.created).is_(None),
                        ArtifactSchema.created,
                    ),
                    else_=func.max(ArtifactVersionSchema.created),
                ).label("latest_version_created"),
            )
            .outerjoin(
                ArtifactVersionSchema,
                ArtifactSchema.id == ArtifactVersionSchema.artifact_id,  # type: ignore[arg-type]
            )
            .group_by(col(ArtifactSchema.id))
            .subquery()
        )

        query = query.add_columns(
            latest_version_subquery.c.latest_version_created,
        ).where(ArtifactSchema.id == latest_version_subquery.c.id)

        # Apply sorting based on the operand
        if operand == SorterOps.ASCENDING:
            query = query.order_by(
                asc(latest_version_subquery.c.latest_version_created),
                asc(ArtifactSchema.id),
            )
        else:
            query = query.order_by(
                desc(latest_version_subquery.c.latest_version_created),
                desc(ArtifactSchema.id),
            )
        return query

    # For other sorting cases, delegate to the parent class
    return super().apply_sorting(query=query, table=table)

ArtifactResponse

Bases: BaseIdentifiedResponse[ArtifactResponseBody, ArtifactResponseMetadata, ArtifactResponseResources]

Artifact response model.

Source code in src/zenml/models/v2/core/artifact.py
class ArtifactResponse(
    BaseIdentifiedResponse[
        ArtifactResponseBody,
        ArtifactResponseMetadata,
        ArtifactResponseResources,
    ]
):
    """Artifact response model."""

    def get_hydrated_version(self) -> "ArtifactResponse":
        """Get the hydrated version of this artifact.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_artifact(self.id)

    name: str = Field(
        title="Name of the output in the parent step.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    # Body and metadata properties
    @property
    def tags(self) -> List[TagResponse]:
        """The `tags` property.

        Returns:
            the value of the property.
        """
        return self.get_body().tags

    @property
    def latest_version_name(self) -> Optional[str]:
        """The `latest_version_name` property.

        Returns:
            the value of the property.
        """
        return self.get_body().latest_version_name

    @property
    def latest_version_id(self) -> Optional[UUID]:
        """The `latest_version_id` property.

        Returns:
            the value of the property.
        """
        return self.get_body().latest_version_id

    @property
    def has_custom_name(self) -> bool:
        """The `has_custom_name` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().has_custom_name

    # Helper methods
    @property
    def versions(self) -> Dict[str, "ArtifactVersionResponse"]:
        """Get a list of all versions of this artifact.

        Returns:
            A list of all versions of this artifact.
        """
        from zenml.client import Client

        responses = Client().list_artifact_versions(name=self.name)
        return {str(response.version): response for response in responses}

has_custom_name property

The has_custom_name property.

Returns:

- bool: the value of the property.

latest_version_id property

The latest_version_id property.

Returns:

- Optional[UUID]: the value of the property.

latest_version_name property

The latest_version_name property.

Returns:

- Optional[str]: the value of the property.

tags property

The tags property.

Returns:

- List[TagResponse]: the value of the property.

versions property

Get a list of all versions of this artifact.

Returns:

- Dict[str, ArtifactVersionResponse]: A list of all versions of this artifact.

get_hydrated_version()

Get the hydrated version of this artifact.

Returns:

- ArtifactResponse: an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/artifact.py
def get_hydrated_version(self) -> "ArtifactResponse":
    """Get the hydrated version of this artifact.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_artifact(self.id)
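
As a usage sketch (assuming an artifact with the placeholder name my_dataset exists on your active server), these helper properties can be combined with the ZenML client like this:

from zenml.client import Client

# Fetch an artifact by name and inspect its versions via the helper properties.
artifact = Client().get_artifact("my_dataset")
print(artifact.latest_version_name)
for version, artifact_version in artifact.versions.items():
    print(version, artifact_version.id)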

ArtifactVersionFilter

Bases: WorkspaceScopedFilter, TaggableFilter

Model to enable advanced filtering of artifact versions.

Source code in src/zenml/models/v2/core/artifact_version.py
class ArtifactVersionFilter(WorkspaceScopedFilter, TaggableFilter):
    """Model to enable advanced filtering of artifact versions."""

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *WorkspaceScopedFilter.FILTER_EXCLUDE_FIELDS,
        *TaggableFilter.FILTER_EXCLUDE_FIELDS,
        "name",
        "only_unused",
        "has_custom_name",
        "model",
        "pipeline_run",
        "model_version_id",
        "run_metadata",
    ]
    CUSTOM_SORTING_OPTIONS = [
        *WorkspaceScopedFilter.CUSTOM_SORTING_OPTIONS,
        *TaggableFilter.CUSTOM_SORTING_OPTIONS,
    ]
    CLI_EXCLUDE_FIELDS = [
        *WorkspaceScopedFilter.CLI_EXCLUDE_FIELDS,
        *TaggableFilter.CLI_EXCLUDE_FIELDS,
    ]

    artifact_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="ID of the artifact to which this version belongs.",
        union_mode="left_to_right",
    )
    name: Optional[str] = Field(
        default=None,
        description="Name of the artifact to which this version belongs.",
    )
    version: Optional[str] = Field(
        default=None,
        description="Version of the artifact",
    )
    version_number: Optional[Union[int, str]] = Field(
        default=None,
        description="Version of the artifact if it is an integer",
        union_mode="left_to_right",
    )
    uri: Optional[str] = Field(
        default=None,
        description="Uri of the artifact",
    )
    materializer: Optional[str] = Field(
        default=None,
        description="Materializer used to produce the artifact",
    )
    type: Optional[str] = Field(
        default=None,
        description="Type of the artifact",
    )
    data_type: Optional[str] = Field(
        default=None,
        description="Datatype of the artifact",
    )
    artifact_store_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Artifact store for this artifact",
        union_mode="left_to_right",
    )
    model_version_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="ID of the model version that is associated with this "
        "artifact version.",
        union_mode="left_to_right",
    )
    only_unused: Optional[bool] = Field(
        default=False, description="Filter only for unused artifacts"
    )
    has_custom_name: Optional[bool] = Field(
        default=None,
        description="Filter only artifacts with/without custom names.",
    )
    user: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of the user that created the artifact version.",
    )
    model: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of the model that is associated with this "
        "artifact version.",
    )
    pipeline_run: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of a pipeline run that is associated with this "
        "artifact version.",
    )
    run_metadata: Optional[Dict[str, Any]] = Field(
        default=None,
        description="The run_metadata to filter the artifact versions by.",
    )

    model_config = ConfigDict(protected_namespaces=())

    def get_custom_filters(
        self, table: Type["AnySchema"]
    ) -> List[Union["ColumnElement[bool]"]]:
        """Get custom filters.

        Args:
            table: The query table.

        Returns:
            A list of custom filters.
        """
        custom_filters = super().get_custom_filters(table)

        from sqlmodel import and_, or_, select

        from zenml.zen_stores.schemas import (
            ArtifactSchema,
            ArtifactVersionSchema,
            ModelSchema,
            ModelVersionArtifactSchema,
            ModelVersionSchema,
            PipelineRunSchema,
            RunMetadataResourceSchema,
            RunMetadataSchema,
            StepRunInputArtifactSchema,
            StepRunOutputArtifactSchema,
            StepRunSchema,
        )

        if self.name:
            value, filter_operator = self._resolve_operator(self.name)
            filter_ = StrFilter(
                operation=GenericFilterOps(filter_operator),
                column="name",
                value=value,
            )
            artifact_name_filter = and_(
                ArtifactVersionSchema.artifact_id == ArtifactSchema.id,
                filter_.generate_query_conditions(ArtifactSchema),
            )
            custom_filters.append(artifact_name_filter)

        if self.only_unused:
            unused_filter = and_(
                ArtifactVersionSchema.id.notin_(  # type: ignore[attr-defined]
                    select(StepRunOutputArtifactSchema.artifact_id)
                ),
                ArtifactVersionSchema.id.notin_(  # type: ignore[attr-defined]
                    select(StepRunInputArtifactSchema.artifact_id)
                ),
            )
            custom_filters.append(unused_filter)

        if self.model_version_id:
            value, operator = self._resolve_operator(self.model_version_id)

            model_version_filter = and_(
                ArtifactVersionSchema.id
                == ModelVersionArtifactSchema.artifact_version_id,
                ModelVersionArtifactSchema.model_version_id
                == ModelVersionSchema.id,
                FilterGenerator(ModelVersionSchema)
                .define_filter(column="id", value=value, operator=operator)
                .generate_query_conditions(ModelVersionSchema),
            )
            custom_filters.append(model_version_filter)

        if self.has_custom_name is not None:
            custom_name_filter = and_(
                ArtifactVersionSchema.artifact_id == ArtifactSchema.id,
                ArtifactSchema.has_custom_name == self.has_custom_name,
            )
            custom_filters.append(custom_name_filter)

        if self.model:
            model_filter = and_(
                ArtifactVersionSchema.id
                == ModelVersionArtifactSchema.artifact_version_id,
                ModelVersionArtifactSchema.model_version_id
                == ModelVersionSchema.id,
                ModelVersionSchema.model_id == ModelSchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.model, table=ModelSchema
                ),
            )
            custom_filters.append(model_filter)

        if self.pipeline_run:
            pipeline_run_filter = and_(
                or_(
                    and_(
                        ArtifactVersionSchema.id
                        == StepRunOutputArtifactSchema.artifact_id,
                        StepRunOutputArtifactSchema.step_id
                        == StepRunSchema.id,
                    ),
                    and_(
                        ArtifactVersionSchema.id
                        == StepRunInputArtifactSchema.artifact_id,
                        StepRunInputArtifactSchema.step_id == StepRunSchema.id,
                    ),
                ),
                StepRunSchema.pipeline_run_id == PipelineRunSchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.pipeline_run, table=PipelineRunSchema
                ),
            )
            custom_filters.append(pipeline_run_filter)

        if self.run_metadata is not None:
            from zenml.enums import MetadataResourceTypes

            for key, value in self.run_metadata.items():
                additional_filter = and_(
                    RunMetadataResourceSchema.resource_id
                    == ArtifactVersionSchema.id,
                    RunMetadataResourceSchema.resource_type
                    == MetadataResourceTypes.ARTIFACT_VERSION.value,
                    RunMetadataResourceSchema.run_metadata_id
                    == RunMetadataSchema.id,
                    self.generate_custom_query_conditions_for_column(
                        value=key,
                        table=RunMetadataSchema,
                        column="key",
                    ),
                    self.generate_custom_query_conditions_for_column(
                        value=value,
                        table=RunMetadataSchema,
                        column="value",
                        json_encode_value=True,
                    ),
                )
                custom_filters.append(additional_filter)

        return custom_filters

get_custom_filters(table)

Get custom filters.

Parameters:

- table (Type[AnySchema]): The query table. (required)

Returns:

- List[Union[ColumnElement[bool]]]: A list of custom filters.

Source code in src/zenml/models/v2/core/artifact_version.py
def get_custom_filters(
    self, table: Type["AnySchema"]
) -> List[Union["ColumnElement[bool]"]]:
    """Get custom filters.

    Args:
        table: The query table.

    Returns:
        A list of custom filters.
    """
    custom_filters = super().get_custom_filters(table)

    from sqlmodel import and_, or_, select

    from zenml.zen_stores.schemas import (
        ArtifactSchema,
        ArtifactVersionSchema,
        ModelSchema,
        ModelVersionArtifactSchema,
        ModelVersionSchema,
        PipelineRunSchema,
        RunMetadataResourceSchema,
        RunMetadataSchema,
        StepRunInputArtifactSchema,
        StepRunOutputArtifactSchema,
        StepRunSchema,
    )

    if self.name:
        value, filter_operator = self._resolve_operator(self.name)
        filter_ = StrFilter(
            operation=GenericFilterOps(filter_operator),
            column="name",
            value=value,
        )
        artifact_name_filter = and_(
            ArtifactVersionSchema.artifact_id == ArtifactSchema.id,
            filter_.generate_query_conditions(ArtifactSchema),
        )
        custom_filters.append(artifact_name_filter)

    if self.only_unused:
        unused_filter = and_(
            ArtifactVersionSchema.id.notin_(  # type: ignore[attr-defined]
                select(StepRunOutputArtifactSchema.artifact_id)
            ),
            ArtifactVersionSchema.id.notin_(  # type: ignore[attr-defined]
                select(StepRunInputArtifactSchema.artifact_id)
            ),
        )
        custom_filters.append(unused_filter)

    if self.model_version_id:
        value, operator = self._resolve_operator(self.model_version_id)

        model_version_filter = and_(
            ArtifactVersionSchema.id
            == ModelVersionArtifactSchema.artifact_version_id,
            ModelVersionArtifactSchema.model_version_id
            == ModelVersionSchema.id,
            FilterGenerator(ModelVersionSchema)
            .define_filter(column="id", value=value, operator=operator)
            .generate_query_conditions(ModelVersionSchema),
        )
        custom_filters.append(model_version_filter)

    if self.has_custom_name is not None:
        custom_name_filter = and_(
            ArtifactVersionSchema.artifact_id == ArtifactSchema.id,
            ArtifactSchema.has_custom_name == self.has_custom_name,
        )
        custom_filters.append(custom_name_filter)

    if self.model:
        model_filter = and_(
            ArtifactVersionSchema.id
            == ModelVersionArtifactSchema.artifact_version_id,
            ModelVersionArtifactSchema.model_version_id
            == ModelVersionSchema.id,
            ModelVersionSchema.model_id == ModelSchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.model, table=ModelSchema
            ),
        )
        custom_filters.append(model_filter)

    if self.pipeline_run:
        pipeline_run_filter = and_(
            or_(
                and_(
                    ArtifactVersionSchema.id
                    == StepRunOutputArtifactSchema.artifact_id,
                    StepRunOutputArtifactSchema.step_id
                    == StepRunSchema.id,
                ),
                and_(
                    ArtifactVersionSchema.id
                    == StepRunInputArtifactSchema.artifact_id,
                    StepRunInputArtifactSchema.step_id == StepRunSchema.id,
                ),
            ),
            StepRunSchema.pipeline_run_id == PipelineRunSchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.pipeline_run, table=PipelineRunSchema
            ),
        )
        custom_filters.append(pipeline_run_filter)

    if self.run_metadata is not None:
        from zenml.enums import MetadataResourceTypes

        for key, value in self.run_metadata.items():
            additional_filter = and_(
                RunMetadataResourceSchema.resource_id
                == ArtifactVersionSchema.id,
                RunMetadataResourceSchema.resource_type
                == MetadataResourceTypes.ARTIFACT_VERSION.value,
                RunMetadataResourceSchema.run_metadata_id
                == RunMetadataSchema.id,
                self.generate_custom_query_conditions_for_column(
                    value=key,
                    table=RunMetadataSchema,
                    column="key",
                ),
                self.generate_custom_query_conditions_for_column(
                    value=value,
                    table=RunMetadataSchema,
                    column="value",
                    json_encode_value=True,
                ),
            )
            custom_filters.append(additional_filter)

    return custom_filters

ArtifactVersionResponse

Bases: WorkspaceScopedResponse[ArtifactVersionResponseBody, ArtifactVersionResponseMetadata, ArtifactVersionResponseResources]

Response model for artifact versions.

Source code in src/zenml/models/v2/core/artifact_version.py
class ArtifactVersionResponse(
    WorkspaceScopedResponse[
        ArtifactVersionResponseBody,
        ArtifactVersionResponseMetadata,
        ArtifactVersionResponseResources,
    ]
):
    """Response model for artifact versions."""

    def get_hydrated_version(self) -> "ArtifactVersionResponse":
        """Get the hydrated version of this artifact version.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_artifact_version(self.id)

    # Body and metadata properties
    @property
    def artifact(self) -> "ArtifactResponse":
        """The `artifact` property.

        Returns:
            the value of the property.
        """
        return self.get_body().artifact

    @property
    def version(self) -> Union[str, int]:
        """The `version` property.

        Returns:
            the value of the property.
        """
        return self.get_body().version

    @property
    def uri(self) -> str:
        """The `uri` property.

        Returns:
            the value of the property.
        """
        return self.get_body().uri

    @property
    def type(self) -> ArtifactType:
        """The `type` property.

        Returns:
            the value of the property.
        """
        return self.get_body().type

    @property
    def tags(self) -> List[TagResponse]:
        """The `tags` property.

        Returns:
            the value of the property.
        """
        return self.get_body().tags

    @property
    def producer_pipeline_run_id(self) -> Optional[UUID]:
        """The `producer_pipeline_run_id` property.

        Returns:
            the value of the property.
        """
        return self.get_body().producer_pipeline_run_id

    @property
    def save_type(self) -> ArtifactSaveType:
        """The `save_type` property.

        Returns:
            the value of the property.
        """
        return self.get_body().save_type

    @property
    def artifact_store_id(self) -> Optional[UUID]:
        """The `artifact_store_id` property.

        Returns:
            the value of the property.
        """
        return self.get_body().artifact_store_id

    @property
    def producer_step_run_id(self) -> Optional[UUID]:
        """The `producer_step_run_id` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().producer_step_run_id

    @property
    def visualizations(
        self,
    ) -> Optional[List["ArtifactVisualizationResponse"]]:
        """The `visualizations` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().visualizations

    @property
    def run_metadata(self) -> Dict[str, MetadataType]:
        """The `metadata` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().run_metadata

    @property
    def materializer(self) -> Source:
        """The `materializer` property.

        Returns:
            the value of the property.
        """
        return self.get_body().materializer

    @property
    def data_type(self) -> Source:
        """The `data_type` property.

        Returns:
            the value of the property.
        """
        return self.get_body().data_type

    # Helper methods
    @property
    def name(self) -> str:
        """The `name` property.

        Returns:
            the value of the property.
        """
        return self.artifact.name

    @property
    def step(self) -> "StepRunResponse":
        """Get the step that produced this artifact.

        Returns:
            The step that produced this artifact.
        """
        from zenml.artifacts.utils import get_producer_step_of_artifact

        return get_producer_step_of_artifact(self)

    @property
    def run(self) -> "PipelineRunResponse":
        """Get the pipeline run that produced this artifact.

        Returns:
            The pipeline run that produced this artifact.
        """
        from zenml.client import Client

        return Client().get_pipeline_run(self.step.pipeline_run_id)

    def load(self) -> Any:
        """Materializes (loads) the data stored in this artifact.

        Returns:
            The materialized data.
        """
        from zenml.artifacts.utils import load_artifact_from_response

        return load_artifact_from_response(self)

    def download_files(self, path: str, overwrite: bool = False) -> None:
        """Downloads data for an artifact with no materializing.

        Any artifacts will be saved as a zip file to the given path.

        Args:
            path: The path to save the binary data to.
            overwrite: Whether to overwrite the file if it already exists.

        Raises:
            ValueError: If the path does not end with '.zip'.
        """
        if not path.endswith(".zip"):
            raise ValueError(
                "The path should end with '.zip' to save the binary data."
            )
        from zenml.artifacts.utils import (
            download_artifact_files_from_response,
        )

        download_artifact_files_from_response(
            self,
            path=path,
            overwrite=overwrite,
        )

    def visualize(self, title: Optional[str] = None) -> None:
        """Visualize the artifact in notebook environments.

        Args:
            title: Optional title to show before the visualizations.
        """
        from zenml.utils.visualization_utils import visualize_artifact

        visualize_artifact(self, title=title)

artifact property

The artifact property.

Returns:

- ArtifactResponse: the value of the property.

artifact_store_id property

The artifact_store_id property.

Returns:

- Optional[UUID]: the value of the property.

data_type property

The data_type property.

Returns:

- Source: the value of the property.

materializer property

The materializer property.

Returns:

- Source: the value of the property.

name property

The name property.

Returns:

- str: the value of the property.

producer_pipeline_run_id property

The producer_pipeline_run_id property.

Returns:

- Optional[UUID]: the value of the property.

producer_step_run_id property

The producer_step_run_id property.

Returns:

- Optional[UUID]: the value of the property.

run property

Get the pipeline run that produced this artifact.

Returns:

- PipelineRunResponse: The pipeline run that produced this artifact.

run_metadata property

The metadata property.

Returns:

- Dict[str, MetadataType]: the value of the property.

save_type property

The save_type property.

Returns:

- ArtifactSaveType: the value of the property.

step property

Get the step that produced this artifact.

Returns:

- StepRunResponse: The step that produced this artifact.

tags property

The tags property.

Returns:

- List[TagResponse]: the value of the property.

type property

The type property.

Returns:

- ArtifactType: the value of the property.

uri property

The uri property.

Returns:

- str: the value of the property.

version property

The version property.

Returns:

- Union[str, int]: the value of the property.

visualizations property

The visualizations property.

Returns:

- Optional[List[ArtifactVisualizationResponse]]: the value of the property.

download_files(path, overwrite=False)

Downloads data for an artifact with no materializing.

Any artifacts will be saved as a zip file to the given path.

Parameters:

- path (str): The path to save the binary data to. (required)
- overwrite (bool): Whether to overwrite the file if it already exists. (default: False)

Raises:

- ValueError: If the path does not end with '.zip'.

Source code in src/zenml/models/v2/core/artifact_version.py
def download_files(self, path: str, overwrite: bool = False) -> None:
    """Downloads data for an artifact with no materializing.

    Any artifacts will be saved as a zip file to the given path.

    Args:
        path: The path to save the binary data to.
        overwrite: Whether to overwrite the file if it already exists.

    Raises:
        ValueError: If the path does not end with '.zip'.
    """
    if not path.endswith(".zip"):
        raise ValueError(
            "The path should end with '.zip' to save the binary data."
        )
    from zenml.artifacts.utils import (
        download_artifact_files_from_response,
    )

    download_artifact_files_from_response(
        self,
        path=path,
        overwrite=overwrite,
    )

get_hydrated_version()

Get the hydrated version of this artifact version.

Returns:

- ArtifactVersionResponse: an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/artifact_version.py
def get_hydrated_version(self) -> "ArtifactVersionResponse":
    """Get the hydrated version of this artifact version.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_artifact_version(self.id)

load()

Materializes (loads) the data stored in this artifact.

Returns:

- Any: The materialized data.

Source code in src/zenml/models/v2/core/artifact_version.py
def load(self) -> Any:
    """Materializes (loads) the data stored in this artifact.

    Returns:
        The materialized data.
    """
    from zenml.artifacts.utils import load_artifact_from_response

    return load_artifact_from_response(self)

visualize(title=None)

Visualize the artifact in notebook environments.

Parameters:

- title (Optional[str]): Optional title to show before the visualizations. (default: None)
Source code in src/zenml/models/v2/core/artifact_version.py
def visualize(self, title: Optional[str] = None) -> None:
    """Visualize the artifact in notebook environments.

    Args:
        title: Optional title to show before the visualizations.
    """
    from zenml.utils.visualization_utils import visualize_artifact

    visualize_artifact(self, title=title)
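
A short usage sketch for these helpers (assuming an artifact version with the placeholder name my_dataset exists on your active server; visualize only renders output in notebook environments):

from zenml.client import Client

# Fetch the latest version of an artifact, then load, download and visualize it.
artifact_version = Client().get_artifact_version("my_dataset")
data = artifact_version.load()                     # materialize the stored data
artifact_version.download_files("my_dataset.zip")  # raw files; path must end in '.zip'
artifact_version.visualize()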

AuthorizationException

Bases: ZenMLBaseException

Raised when an authorization error occurred while trying to access a ZenML resource.

Source code in src/zenml/exceptions.py
class AuthorizationException(ZenMLBaseException):
    """Raised when an authorization error occurred while trying to access a ZenML resource ."""

BaseCodeRepository

Bases: ABC

Base class for code repositories.

Code repositories are used to connect to a remote code repository and store information about the repository, such as the URL, the owner, the repository name, and the host. They also provide methods to download files from the repository when a pipeline is run remotely.

Source code in src/zenml/code_repositories/base_code_repository.py
class BaseCodeRepository(ABC):
    """Base class for code repositories.

    Code repositories are used to connect to a remote code repository and
    store information about the repository, such as the URL, the owner,
    the repository name, and the host. They also provide methods to
    download files from the repository when a pipeline is run remotely.
    """

    def __init__(
        self,
        id: UUID,
        name: str,
        config: Dict[str, Any],
    ) -> None:
        """Initializes a code repository.

        Args:
            id: The ID of the code repository.
            name: The name of the code repository.
            config: The config of the code repository.
        """
        self._id = id
        self._name = name
        self._config = config
        self.login()

    @property
    def config(self) -> "BaseCodeRepositoryConfig":
        """Config class for Code Repository.

        Returns:
            The config class.
        """
        return BaseCodeRepositoryConfig(**self._config)

    @classmethod
    def from_model(cls, model: CodeRepositoryResponse) -> "BaseCodeRepository":
        """Loads a code repository from a model.

        Args:
            model: The CodeRepositoryResponseModel to load from.

        Returns:
            The loaded code repository object.
        """
        class_: Type[BaseCodeRepository] = (
            source_utils.load_and_validate_class(
                source=model.source, expected_class=BaseCodeRepository
            )
        )
        return class_(id=model.id, name=model.name, config=model.config)

    @classmethod
    def validate_config(cls, config: Dict[str, Any]) -> None:
        """Validate the code repository config.

        This method should check that the config/credentials are valid and
        the configured repository exists.

        Args:
            config: The configuration.
        """
        # The initialization calls the login to verify the credentials
        code_repo = cls(id=uuid4(), name="", config=config)

        # Explicitly access the config for pydantic validation
        _ = code_repo.config

    @property
    def id(self) -> UUID:
        """ID of the code repository.

        Returns:
            The ID of the code repository.
        """
        return self._id

    @property
    def name(self) -> str:
        """Name of the code repository.

        Returns:
            The name of the code repository.
        """
        return self._name

    @property
    def requirements(self) -> Set[str]:
        """Set of PyPI requirements for the repository.

        Returns:
            A set of PyPI requirements for the repository.
        """
        from zenml.integrations.utils import get_requirements_for_module

        return set(get_requirements_for_module(self.__module__))

    @abstractmethod
    def login(self) -> None:
        """Logs into the code repository.

        This method is called when the code repository is initialized.
        It should be used to authenticate with the code repository.

        Raises:
            RuntimeError: If the login fails.
        """
        pass

    @abstractmethod
    def download_files(
        self, commit: str, directory: str, repo_sub_directory: Optional[str]
    ) -> None:
        """Downloads files from the code repository to a local directory.

        Args:
            commit: The commit hash to download files from.
            directory: The directory to download files to.
            repo_sub_directory: The subdirectory in the repository to
                download files from.

        Raises:
            RuntimeError: If the download fails.
        """
        pass

    @abstractmethod
    def get_local_context(
        self, path: str
    ) -> Optional["LocalRepositoryContext"]:
        """Gets a local repository context from a path.

        Args:
            path: The path to the local repository.

        Returns:
            The local repository context object.
        """
        pass
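
To make the abstract interface more concrete, here is a minimal, hypothetical subclass. It is not one of ZenML's built-in integrations: the config key "token" and the stubbed download logic are assumptions purely for illustration, and the import path mirrors the module location shown above.

import os
from typing import Optional

from zenml.code_repositories import BaseCodeRepository, LocalRepositoryContext


class MyCodeRepository(BaseCodeRepository):
    """Hypothetical code repository for a custom Git host."""

    def login(self) -> None:
        # Called from __init__; validate credentials here. The "token" key
        # is an assumed config field for this example.
        if not self._config.get("token"):
            raise RuntimeError("Missing `token` in code repository config.")

    def download_files(
        self, commit: str, directory: str, repo_sub_directory: Optional[str]
    ) -> None:
        # A real implementation would fetch `commit` (optionally only
        # `repo_sub_directory`) from the remote host and write the files
        # into `directory`. This placeholder only ensures the target exists.
        os.makedirs(directory, exist_ok=True)

    def get_local_context(self, path: str) -> Optional[LocalRepositoryContext]:
        # Return a LocalRepositoryContext if `path` belongs to a local clone
        # of this repository, otherwise None.
        return None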

config property

Config class for Code Repository.

Returns:

    BaseCodeRepositoryConfig: The config class.

id property

ID of the code repository.

Returns:

    UUID: The ID of the code repository.

name property

Name of the code repository.

Returns:

    str: The name of the code repository.

requirements property

Set of PyPI requirements for the repository.

Returns:

    Set[str]: A set of PyPI requirements for the repository.

__init__(id, name, config)

Initializes a code repository.

Parameters:

    id (UUID, required): The ID of the code repository.
    name (str, required): The name of the code repository.
    config (Dict[str, Any], required): The config of the code repository.
Source code in src/zenml/code_repositories/base_code_repository.py
def __init__(
    self,
    id: UUID,
    name: str,
    config: Dict[str, Any],
) -> None:
    """Initializes a code repository.

    Args:
        id: The ID of the code repository.
        name: The name of the code repository.
        config: The config of the code repository.
    """
    self._id = id
    self._name = name
    self._config = config
    self.login()

download_files(commit, directory, repo_sub_directory) abstractmethod

Downloads files from the code repository to a local directory.

Parameters:

    commit (str, required): The commit hash to download files from.
    directory (str, required): The directory to download files to.
    repo_sub_directory (Optional[str], required): The subdirectory in the repository to download files from.

Raises:

    RuntimeError: If the download fails.

Source code in src/zenml/code_repositories/base_code_repository.py
@abstractmethod
def download_files(
    self, commit: str, directory: str, repo_sub_directory: Optional[str]
) -> None:
    """Downloads files from the code repository to a local directory.

    Args:
        commit: The commit hash to download files from.
        directory: The directory to download files to.
        repo_sub_directory: The subdirectory in the repository to
            download files from.

    Raises:
        RuntimeError: If the download fails.
    """
    pass

from_model(model) classmethod

Loads a code repository from a model.

Parameters:

    model (CodeRepositoryResponse, required): The CodeRepositoryResponseModel to load from.

Returns:

    BaseCodeRepository: The loaded code repository object.

Source code in src/zenml/code_repositories/base_code_repository.py
@classmethod
def from_model(cls, model: CodeRepositoryResponse) -> "BaseCodeRepository":
    """Loads a code repository from a model.

    Args:
        model: The CodeRepositoryResponseModel to load from.

    Returns:
        The loaded code repository object.
    """
    class_: Type[BaseCodeRepository] = (
        source_utils.load_and_validate_class(
            source=model.source, expected_class=BaseCodeRepository
        )
    )
    return class_(id=model.id, name=model.name, config=model.config)
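
A hedged end-to-end sketch: fetch a registered code repository through the ZenML client, rebuild the concrete repository object from its response model, and download a specific commit into a temporary directory. The repository name "my-repo" and the commit hash are placeholders, and the client lookup method is assumed to be available as in recent ZenML releases.

import tempfile

from zenml.client import Client
from zenml.code_repositories import BaseCodeRepository

# "my-repo" is a placeholder for a code repository registered in ZenML.
repo_model = Client().get_code_repository("my-repo")

# Instantiate the concrete implementation referenced by the model's source.
code_repo = BaseCodeRepository.from_model(repo_model)

# Download the files of the given (placeholder) commit into a temp directory.
with tempfile.TemporaryDirectory() as tmp_dir:
    code_repo.download_files(
        commit="abc1234", directory=tmp_dir, repo_sub_directory=None
    )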

get_local_context(path) abstractmethod

Gets a local repository context from a path.

Parameters:

    path (str, required): The path to the local repository.

Returns:

    Optional[LocalRepositoryContext]: The local repository context object.

Source code in src/zenml/code_repositories/base_code_repository.py
@abstractmethod
def get_local_context(
    self, path: str
) -> Optional["LocalRepositoryContext"]:
    """Gets a local repository context from a path.

    Args:
        path: The path to the local repository.

    Returns:
        The local repository context object.
    """
    pass

login() abstractmethod

Logs into the code repository.

This method is called when the code repository is initialized. It should be used to authenticate with the code repository.

Raises:

    RuntimeError: If the login fails.

Source code in src/zenml/code_repositories/base_code_repository.py
@abstractmethod
def login(self) -> None:
    """Logs into the code repository.

    This method is called when the code repository is initialized.
    It should be used to authenticate with the code repository.

    Raises:
        RuntimeError: If the login fails.
    """
    pass

validate_config(config) classmethod

Validate the code repository config.

This method should check that the config/credentials are valid and the configured repository exists.

Parameters:

    config (Dict[str, Any], required): The configuration.
Source code in src/zenml/code_repositories/base_code_repository.py
@classmethod
def validate_config(cls, config: Dict[str, Any]) -> None:
    """Validate the code repository config.

    This method should check that the config/credentials are valid and
    the configured repository exists.

    Args:
        config: The configuration.
    """
    # The initialization calls the login to verify the credentials
    code_repo = cls(id=uuid4(), name="", config=config)

    # Explicitly access the config for pydantic validation
    _ = code_repo.config
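
As a usage note, validate_config can be called on a concrete subclass before registering a repository; any login or pydantic validation failure surfaces as an exception. The snippet below reuses the hypothetical MyCodeRepository sketched earlier.

try:
    # Instantiates the repository (running `login()`) and parses the config
    # through the pydantic config model.
    MyCodeRepository.validate_config({"token": "<your-token>"})
except Exception as e:
    print(f"Invalid code repository config: {e}")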

CliCategories

Bases: StrEnum

All possible categories for CLI commands.

Note: The order of the categories is important. The same order is used to sort the commands in the CLI help output.

Source code in src/zenml/enums.py
class CliCategories(StrEnum):
    """All possible categories for CLI commands.

    Note: The order of the categories is important. The same
    order is used to sort the commands in the CLI help output.
    """

    STACK_COMPONENTS = "Stack Components"
    MODEL_DEPLOYMENT = "Model Deployment"
    INTEGRATIONS = "Integrations"
    MANAGEMENT_TOOLS = "Management Tools"
    MODEL_CONTROL_PLANE = "Model Control Plane"
    IDENTITY_AND_SECURITY = "Identity and Security"
    OTHER_COMMANDS = "Other Commands"
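
A small illustrative snippet: because StrEnum members are strings, they compare equal to their values, and iterating the enum yields the categories in the declaration order used to sort the CLI help output.

from zenml.enums import CliCategories

# StrEnum members compare equal to their string values.
assert CliCategories.INTEGRATIONS == "Integrations"

# Iterate in declaration order, which is also the CLI help sort order.
for category in CliCategories:
    print(category.value)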

Client

ZenML client class.

The ZenML client manages configuration options for ZenML stacks as well as their components.
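
Before the full source listing, here is a minimal, hedged usage sketch of the client. It only relies on members that appear in the listing below (the singleton constructor, the uses_local_configuration property and the get_settings method) and requires a reachable ZenML server or local store.

from zenml.client import Client

# Client() always returns the same singleton instance.
client = Client()

# True if a local .zen repository configuration is being used.
print(client.uses_local_configuration)

# Fetch the current server settings (see `get_settings` in the listing below).
settings = client.get_settings()
print(settings)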

Source code in src/zenml/client.py
@evaluate_all_lazy_load_args_in_client_methods
class Client(metaclass=ClientMetaClass):
    """ZenML client class.

    The ZenML client manages configuration options for ZenML stacks as well
    as their components.
    """

    _active_user: Optional["UserResponse"] = None
    _active_workspace: Optional["WorkspaceResponse"] = None
    _active_stack: Optional["StackResponse"] = None

    def __init__(
        self,
        root: Optional[Path] = None,
    ) -> None:
        """Initializes the global client instance.

        Client is a singleton class: only one instance can exist. Calling
        this constructor multiple times will always yield the same instance (see
        the exception below).

        The `root` argument is only meant for internal use and testing purposes.
        User code must never pass them to the constructor.
        When a custom `root` value is passed, an anonymous Client instance
        is created and returned independently of the Client singleton and
        that will have no effect as far as the rest of the ZenML core code is
        concerned.

        Instead of creating a new Client instance to reflect a different
        repository root, to change the active root in the global Client,
        call `Client().activate_root(<new-root>)`.

        Args:
            root: (internal use) custom root directory for the client. If
                no path is given, the repository root is determined using the
                environment variable `ZENML_REPOSITORY_PATH` (if set) and by
                recursively searching in the parent directories of the
                current working directory. Only used to initialize new
                clients internally.
        """
        self._root: Optional[Path] = None
        self._config: Optional[ClientConfiguration] = None

        self._set_active_root(root)

    @classmethod
    def get_instance(cls) -> Optional["Client"]:
        """Return the Client singleton instance.

        Returns:
            The Client singleton instance or None, if the Client hasn't
            been initialized yet.
        """
        return cls._global_client

    @classmethod
    def _reset_instance(cls, client: Optional["Client"] = None) -> None:
        """Reset the Client singleton instance.

        This method is only meant for internal use and testing purposes.

        Args:
            client: The Client instance to set as the global singleton.
                If None, the global Client singleton is reset to an empty
                value.
        """
        cls._global_client = client

    def _set_active_root(self, root: Optional[Path] = None) -> None:
        """Set the supplied path as the repository root.

        If a client configuration is found at the given path or the
        path, it is loaded and used to initialize the client.
        If no client configuration is found, the global configuration is
        used instead to manage the active stack, workspace etc.

        Args:
            root: The path to set as the active repository root. If not set,
                the repository root is determined using the environment
                variable `ZENML_REPOSITORY_PATH` (if set) and by recursively
                searching in the parent directories of the current working
                directory.
        """
        enable_warnings = handle_bool_env_var(
            ENV_ZENML_ENABLE_REPO_INIT_WARNINGS, False
        )
        self._root = self.find_repository(
            root, enable_warnings=enable_warnings
        )

        if not self._root:
            self._config = None
            if enable_warnings:
                logger.info("Running without an active repository root.")
        else:
            logger.debug("Using repository root %s.", self._root)
            self._config = self._load_config()

        # Sanitize the client configuration to reflect the current
        # settings
        self._sanitize_config()

    def _config_path(self) -> Optional[str]:
        """Path to the client configuration file.

        Returns:
            Path to the client configuration file or None if the client
            root has not been initialized yet.
        """
        if not self.config_directory:
            return None
        return str(self.config_directory / "config.yaml")

    def _sanitize_config(self) -> None:
        """Sanitize and save the client configuration.

        This method is called to ensure that the client configuration
        doesn't contain outdated information, such as an active stack or
        workspace that no longer exists.
        """
        if not self._config:
            return

        active_workspace, active_stack = self.zen_store.validate_active_config(
            self._config.active_workspace_id,
            self._config.active_stack_id,
            config_name="repo",
        )
        self._config.set_active_stack(active_stack)
        self._config.set_active_workspace(active_workspace)

    def _load_config(self) -> Optional[ClientConfiguration]:
        """Loads the client configuration from disk.

        This happens if the client has an active root and the configuration
        file exists. If the configuration file doesn't exist, an empty
        configuration is returned.

        Returns:
            Loaded client configuration or None if the client does not
            have an active root.
        """
        config_path = self._config_path()
        if not config_path:
            return None

        # load the client configuration file if it exists, otherwise use
        # an empty configuration as default
        if fileio.exists(config_path):
            logger.debug(f"Loading client configuration from {config_path}.")
        else:
            logger.debug(
                "No client configuration file found, creating default "
                "configuration."
            )

        return ClientConfiguration(config_file=config_path)

    @staticmethod
    def initialize(
        root: Optional[Path] = None,
    ) -> None:
        """Initializes a new ZenML repository at the given path.

        Args:
            root: The root directory where the repository should be created.
                If None, the current working directory is used.

        Raises:
            InitializationException: If the root directory already contains a
                ZenML repository.
        """
        root = root or Path.cwd()
        logger.debug("Initializing new repository at path %s.", root)
        if Client.is_repository_directory(root):
            raise InitializationException(
                f"Found existing ZenML repository at path '{root}'."
            )

        config_directory = str(root / REPOSITORY_DIRECTORY_NAME)
        io_utils.create_dir_recursive_if_not_exists(config_directory)
        # Initialize the repository configuration at the custom path
        Client(root=root)

    @property
    def uses_local_configuration(self) -> bool:
        """Check if the client is using a local configuration.

        Returns:
            True if the client is using a local configuration,
            False otherwise.
        """
        return self._config is not None

    @staticmethod
    def is_repository_directory(path: Path) -> bool:
        """Checks whether a ZenML client exists at the given path.

        Args:
            path: The path to check.

        Returns:
            True if a ZenML client exists at the given path,
            False otherwise.
        """
        config_dir = path / REPOSITORY_DIRECTORY_NAME
        return fileio.isdir(str(config_dir))

    @staticmethod
    def find_repository(
        path: Optional[Path] = None, enable_warnings: bool = False
    ) -> Optional[Path]:
        """Search for a ZenML repository directory.

        Args:
            path: Optional path to look for the repository. If no path is
                given, this function tries to find the repository using the
                environment variable `ZENML_REPOSITORY_PATH` (if set) and
                recursively searching in the parent directories of the current
                working directory.
            enable_warnings: If `True`, warnings are printed if the repository
                root cannot be found.

        Returns:
            Absolute path to a ZenML repository directory or None if no
            repository directory was found.
        """
        if not path:
            # try to get path from the environment variable
            env_var_path = os.getenv(ENV_ZENML_REPOSITORY_PATH)
            if env_var_path:
                path = Path(env_var_path)

        if path:
            # explicit path via parameter or environment variable, don't search
            # parent directories
            search_parent_directories = False
            warning_message = (
                f"Unable to find ZenML repository at path '{path}'. Make sure "
                f"to create a ZenML repository by calling `zenml init` when "
                f"specifying an explicit repository path in code or via the "
                f"environment variable '{ENV_ZENML_REPOSITORY_PATH}'."
            )
        else:
            # try to find the repository in the parent directories of the
            # current working directory
            path = Path.cwd()
            search_parent_directories = True
            warning_message = (
                f"Unable to find ZenML repository in your current working "
                f"directory ({path}) or any parent directories. If you "
                f"want to use an existing repository which is in a different "
                f"location, set the environment variable "
                f"'{ENV_ZENML_REPOSITORY_PATH}'. If you want to create a new "
                f"repository, run `zenml init`."
            )

        def _find_repository_helper(path_: Path) -> Optional[Path]:
            """Recursively search parent directories for a ZenML repository.

            Args:
                path_: The path to search.

            Returns:
                Absolute path to a ZenML repository directory or None if no
                repository directory was found.
            """
            if Client.is_repository_directory(path_):
                return path_

            if not search_parent_directories or io_utils.is_root(str(path_)):
                return None

            return _find_repository_helper(path_.parent)

        repository_path = _find_repository_helper(path)

        if repository_path:
            return repository_path.resolve()
        if enable_warnings:
            logger.warning(warning_message)
        return None

    @staticmethod
    def is_inside_repository(file_path: str) -> bool:
        """Returns whether a file is inside the active ZenML repository.

        Args:
            file_path: A file path.

        Returns:
            True if the file is inside the active ZenML repository, False
            otherwise.
        """
        if repo_path := Client.find_repository():
            return repo_path in Path(file_path).resolve().parents
        return False

    @property
    def zen_store(self) -> "BaseZenStore":
        """Shortcut to return the global zen store.

        Returns:
            The global zen store.
        """
        return GlobalConfiguration().zen_store

    @property
    def root(self) -> Optional[Path]:
        """The root directory of this client.

        Returns:
            The root directory of this client, or None, if the client
            has not been initialized.
        """
        return self._root

    @property
    def config_directory(self) -> Optional[Path]:
        """The configuration directory of this client.

        Returns:
            The configuration directory of this client, or None, if the
            client doesn't have an active root.
        """
        return self.root / REPOSITORY_DIRECTORY_NAME if self.root else None

    def activate_root(self, root: Optional[Path] = None) -> None:
        """Set the active repository root directory.

        Args:
            root: The path to set as the active repository root. If not set,
                the repository root is determined using the environment
                variable `ZENML_REPOSITORY_PATH` (if set) and by recursively
                searching in the parent directories of the current working
                directory.
        """
        self._set_active_root(root)

    def set_active_workspace(
        self, workspace_name_or_id: Union[str, UUID]
    ) -> "WorkspaceResponse":
        """Set the workspace for the local client.

        Args:
            workspace_name_or_id: The name or ID of the workspace to set active.

        Returns:
            The model of the active workspace.
        """
        workspace = self.zen_store.get_workspace(
            workspace_name_or_id=workspace_name_or_id
        )  # raises KeyError
        if self._config:
            self._config.set_active_workspace(workspace)
            # Sanitize the client configuration to reflect the current
            # settings
            self._sanitize_config()
        else:
            # set the active workspace globally only if the client doesn't use
            # a local configuration
            GlobalConfiguration().set_active_workspace(workspace)
        return workspace
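
    # Usage sketch (illustrative): switching the workspace used by the local
    # client. The workspace name "ml-team" is a hypothetical placeholder.
    #
    #     from zenml.client import Client
    #
    #     client = Client()
    #     workspace = client.set_active_workspace("ml-team")
    #     print(workspace.id)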

    # ----------------------------- Server Settings ----------------------------

    def get_settings(self, hydrate: bool = True) -> ServerSettingsResponse:
        """Get the server settings.

        Args:
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The server settings.
        """
        return self.zen_store.get_server_settings(hydrate=hydrate)

    def update_server_settings(
        self,
        updated_name: Optional[str] = None,
        updated_logo_url: Optional[str] = None,
        updated_enable_analytics: Optional[bool] = None,
        updated_enable_announcements: Optional[bool] = None,
        updated_enable_updates: Optional[bool] = None,
        updated_onboarding_state: Optional[Dict[str, Any]] = None,
    ) -> ServerSettingsResponse:
        """Update the server settings.

        Args:
            updated_name: Updated name for the server.
            updated_logo_url: Updated logo URL for the server.
            updated_enable_analytics: Updated value for whether to enable
                analytics for the server.
            updated_enable_announcements: Updated value for whether to display
                announcements about ZenML.
            updated_enable_updates: Updated value for whether to display
                updates about ZenML.
            updated_onboarding_state: Updated onboarding state for the server.

        Returns:
            The updated server settings.
        """
        update_model = ServerSettingsUpdate(
            server_name=updated_name,
            logo_url=updated_logo_url,
            enable_analytics=updated_enable_analytics,
            display_announcements=updated_enable_announcements,
            display_updates=updated_enable_updates,
            onboarding_state=updated_onboarding_state,
        )
        return self.zen_store.update_server_settings(update_model)
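
    # Usage sketch (illustrative): renaming the server and switching off
    # announcements. The name below is a hypothetical placeholder.
    #
    #     client = Client()
    #     settings = client.update_server_settings(
    #         updated_name="team-server",
    #         updated_enable_announcements=False,
    #     )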

    # ---------------------------------- Users ---------------------------------

    def create_user(
        self,
        name: str,
        password: Optional[str] = None,
        is_admin: bool = False,
    ) -> UserResponse:
        """Create a new user.

        Args:
            name: The name of the user.
            password: The password of the user. If not provided, the user will
                be created with an empty password.
            is_admin: Whether the user should be an admin.

        Returns:
            The model of the created user.
        """
        user = UserRequest(
            name=name, password=password or None, is_admin=is_admin
        )
        user.active = (
            password != "" if self.zen_store.type != StoreType.REST else True
        )
        created_user = self.zen_store.create_user(user=user)

        return created_user
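
    # Usage sketch (illustrative): creating a regular user. Name and password
    # are hypothetical placeholders.
    #
    #     client = Client()
    #     user = client.create_user(name="alice", password="s3cret")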

    def get_user(
        self,
        name_id_or_prefix: Union[str, UUID],
        allow_name_prefix_match: bool = True,
        hydrate: bool = True,
    ) -> UserResponse:
        """Gets a user.

        Args:
            name_id_or_prefix: The name or ID of the user.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The User
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_user,
            list_method=self.list_users,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
        )

    def list_users(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        external_user_id: Optional[str] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        full_name: Optional[str] = None,
        email: Optional[str] = None,
        active: Optional[bool] = None,
        email_opted_in: Optional[bool] = None,
        hydrate: bool = False,
    ) -> Page[UserResponse]:
        """List all users.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of users to filter by.
            external_user_id: Use the external user id for filtering.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            name: Use the username for filtering
            full_name: Use the user full name for filtering
            email: Use the user email for filtering
            active: Use the user active status for filtering
            email_opted_in: Use the user opt in status for filtering
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of users.
        """
        return self.zen_store.list_users(
            UserFilter(
                sort_by=sort_by,
                page=page,
                size=size,
                logical_operator=logical_operator,
                id=id,
                external_user_id=external_user_id,
                created=created,
                updated=updated,
                name=name,
                full_name=full_name,
                email=email,
                active=active,
                email_opted_in=email_opted_in,
            ),
            hydrate=hydrate,
        )
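
    # Usage sketch (illustrative): paging through active users. The filter
    # values are hypothetical.
    #
    #     client = Client()
    #     users = client.list_users(active=True, size=20, page=1)
    #     for user in users.items:
    #         print(user.name, user.email)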

    def update_user(
        self,
        name_id_or_prefix: Union[str, UUID],
        updated_name: Optional[str] = None,
        updated_full_name: Optional[str] = None,
        updated_email: Optional[str] = None,
        updated_email_opt_in: Optional[bool] = None,
        updated_password: Optional[str] = None,
        old_password: Optional[str] = None,
        updated_is_admin: Optional[bool] = None,
        updated_metadata: Optional[Dict[str, Any]] = None,
        active: Optional[bool] = None,
    ) -> UserResponse:
        """Update a user.

        Args:
            name_id_or_prefix: The name or ID of the user to update.
            updated_name: The new name of the user.
            updated_full_name: The new full name of the user.
            updated_email: The new email of the user.
            updated_email_opt_in: The new email opt-in status of the user.
            updated_password: The new password of the user.
            old_password: The old password of the user. Required for password
                update.
            updated_is_admin: Whether the user should be an admin.
            updated_metadata: The new metadata for the user.
            active: Use to activate or deactivate the user.

        Returns:
            The updated user.

        Raises:
            ValidationError: If the old password is not provided when updating
                the password.
        """
        user = self.get_user(
            name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
        )
        user_update = UserUpdate(name=updated_name or user.name)
        if updated_full_name:
            user_update.full_name = updated_full_name
        if updated_email is not None:
            user_update.email = updated_email
            user_update.email_opted_in = (
                updated_email_opt_in or user.email_opted_in
            )
        if updated_email_opt_in is not None:
            user_update.email_opted_in = updated_email_opt_in
        if updated_password is not None:
            user_update.password = updated_password
            if old_password is None:
                raise ValidationError(
                    "Old password is required to update the password."
                )
            user_update.old_password = old_password
        if updated_is_admin is not None:
            user_update.is_admin = updated_is_admin
        if active is not None:
            user_update.active = active

        if updated_metadata is not None:
            user_update.user_metadata = updated_metadata

        return self.zen_store.update_user(
            user_id=user.id, user_update=user_update
        )
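
    # Usage sketch (illustrative): rotating a password. As documented above,
    # the old password must be supplied alongside the new one, otherwise a
    # ValidationError is raised. Values are hypothetical.
    #
    #     client = Client()
    #     client.update_user(
    #         "alice",
    #         updated_password="new-s3cret",
    #         old_password="s3cret",
    #     )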

    @_fail_for_sql_zen_store
    def deactivate_user(self, name_id_or_prefix: str) -> "UserResponse":
        """Deactivate a user and generate an activation token.

        Args:
            name_id_or_prefix: The name or ID of the user to deactivate.

        Returns:
            The deactivated user.
        """
        from zenml.zen_stores.rest_zen_store import RestZenStore

        user = self.get_user(name_id_or_prefix, allow_name_prefix_match=False)
        assert isinstance(self.zen_store, RestZenStore)
        return self.zen_store.deactivate_user(user_name_or_id=user.name)

    def delete_user(self, name_id_or_prefix: str) -> None:
        """Delete a user.

        Args:
            name_id_or_prefix: The name or ID of the user to delete.
        """
        user = self.get_user(name_id_or_prefix, allow_name_prefix_match=False)
        self.zen_store.delete_user(user_name_or_id=user.name)

    @property
    def active_user(self) -> "UserResponse":
        """Get the user that is currently in use.

        Returns:
            The active user.
        """
        if self._active_user is None:
            self._active_user = self.zen_store.get_user(include_private=True)
        return self._active_user

    # -------------------------------- Workspaces ------------------------------

    def create_workspace(
        self, name: str, description: str
    ) -> WorkspaceResponse:
        """Create a new workspace.

        Args:
            name: Name of the workspace.
            description: Description of the workspace.

        Returns:
            The created workspace.
        """
        return self.zen_store.create_workspace(
            WorkspaceRequest(name=name, description=description)
        )

    def get_workspace(
        self,
        name_id_or_prefix: Optional[Union[UUID, str]],
        allow_name_prefix_match: bool = True,
        hydrate: bool = True,
    ) -> WorkspaceResponse:
        """Gets a workspace.

        Args:
            name_id_or_prefix: The name or ID of the workspace.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The workspace
        """
        if not name_id_or_prefix:
            return self.active_workspace
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_workspace,
            list_method=self.list_workspaces,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
        )

    def list_workspaces(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        hydrate: bool = False,
    ) -> Page[WorkspaceResponse]:
        """List all workspaces.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the workspace ID to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            name: Use the workspace name for filtering
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            Page of workspaces
        """
        return self.zen_store.list_workspaces(
            WorkspaceFilter(
                sort_by=sort_by,
                page=page,
                size=size,
                logical_operator=logical_operator,
                id=id,
                created=created,
                updated=updated,
                name=name,
            ),
            hydrate=hydrate,
        )

    def update_workspace(
        self,
        name_id_or_prefix: Optional[Union[UUID, str]],
        new_name: Optional[str] = None,
        new_description: Optional[str] = None,
    ) -> WorkspaceResponse:
        """Update a workspace.

        Args:
            name_id_or_prefix: Name, ID or prefix of the workspace to update.
            new_name: New name of the workspace.
            new_description: New description of the workspace.

        Returns:
            The updated workspace.
        """
        workspace = self.get_workspace(
            name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
        )
        workspace_update = WorkspaceUpdate(name=new_name or workspace.name)
        if new_description:
            workspace_update.description = new_description
        return self.zen_store.update_workspace(
            workspace_id=workspace.id,
            workspace_update=workspace_update,
        )

    def delete_workspace(self, name_id_or_prefix: str) -> None:
        """Delete a workspace.

        Args:
            name_id_or_prefix: The name or ID of the workspace to delete.

        Raises:
            IllegalOperationError: If the workspace to delete is the active
                workspace.
        """
        workspace = self.get_workspace(
            name_id_or_prefix, allow_name_prefix_match=False
        )
        if self.active_workspace.id == workspace.id:
            raise IllegalOperationError(
                f"Workspace '{name_id_or_prefix}' cannot be deleted since "
                "it is currently active. Please set another workspace as "
                "active first."
            )
        self.zen_store.delete_workspace(workspace_name_or_id=workspace.id)

    @property
    def active_workspace(self) -> WorkspaceResponse:
        """Get the currently active workspace of the local client.

        If no active workspace is configured locally for the client, the
        active workspace in the global configuration is used instead.

        Returns:
            The active workspace.

        Raises:
            RuntimeError: If the active workspace is not set.
        """
        if workspace_id := os.environ.get(ENV_ZENML_ACTIVE_WORKSPACE_ID):
            if not self._active_workspace or self._active_workspace.id != UUID(
                workspace_id
            ):
                self._active_workspace = self.get_workspace(workspace_id)

            return self._active_workspace

        from zenml.constants import DEFAULT_WORKSPACE_NAME

        # If running in a ZenML server environment, the active workspace is
        # not relevant
        if ENV_ZENML_SERVER in os.environ:
            return self.get_workspace(DEFAULT_WORKSPACE_NAME)

        workspace = (
            self._config.active_workspace if self._config else None
        ) or GlobalConfiguration().get_active_workspace()
        if not workspace:
            raise RuntimeError(
                "No active workspace is configured. Run "
                "`zenml workspace set WORKSPACE_NAME` to set the active "
                "workspace."
            )

        if workspace.name != DEFAULT_WORKSPACE_NAME:
            logger.warning(
                f"You are running with a non-default workspace "
                f"'{workspace.name}'. Any stacks, components, "
                f"pipelines and pipeline runs produced in this "
                f"workspace will currently not be accessible through "
                f"the dashboard. However, this will be possible "
                f"in the near future."
            )
        return workspace

    # --------------------------------- Stacks ---------------------------------

    def create_stack(
        self,
        name: str,
        components: Mapping[StackComponentType, Union[str, UUID]],
        stack_spec_file: Optional[str] = None,
        labels: Optional[Dict[str, Any]] = None,
    ) -> StackResponse:
        """Registers a stack and its components.

        Args:
            name: The name of the stack to register.
            components: A dictionary mapping component types to component
                names or IDs.
            stack_spec_file: path to the stack spec file
            labels: The labels of the stack.

        Returns:
            The model of the registered stack.
        """
        stack_components = {}

        for c_type, c_identifier in components.items():
            # Skip non-existent components.
            if not c_identifier:
                continue

            # Get the component.
            component = self.get_stack_component(
                name_id_or_prefix=c_identifier,
                component_type=c_type,
            )
            stack_components[c_type] = [component.id]

        stack = StackRequest(
            name=name,
            components=stack_components,
            stack_spec_path=stack_spec_file,
            workspace=self.active_workspace.id,
            user=self.active_user.id,
            labels=labels,
        )

        self._validate_stack_configuration(stack=stack)

        return self.zen_store.create_stack(stack=stack)
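
    # Usage sketch (illustrative): registering a stack from components that
    # are already registered. Component names are hypothetical placeholders.
    #
    #     from zenml.enums import StackComponentType
    #
    #     client = Client()
    #     stack = client.create_stack(
    #         name="local-stack",
    #         components={
    #             StackComponentType.ORCHESTRATOR: "default",
    #             StackComponentType.ARTIFACT_STORE: "default",
    #         },
    #     )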

    def get_stack(
        self,
        name_id_or_prefix: Optional[Union[UUID, str]] = None,
        allow_name_prefix_match: bool = True,
        hydrate: bool = True,
    ) -> StackResponse:
        """Get a stack by name, ID or prefix.

        If no name, ID or prefix is provided, the active stack is returned.

        Args:
            name_id_or_prefix: The name, ID or prefix of the stack.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The stack.
        """
        if name_id_or_prefix is not None:
            return self._get_entity_by_id_or_name_or_prefix(
                get_method=self.zen_store.get_stack,
                list_method=self.list_stacks,
                name_id_or_prefix=name_id_or_prefix,
                allow_name_prefix_match=allow_name_prefix_match,
                hydrate=hydrate,
            )
        else:
            return self.active_stack_model

    def list_stacks(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        description: Optional[str] = None,
        workspace_id: Optional[Union[str, UUID]] = None,
        user_id: Optional[Union[str, UUID]] = None,
        component_id: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        component: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[StackResponse]:
        """Lists all stacks.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of stacks to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            description: Use the stack description for filtering
            workspace_id: The id of the workspace to filter by.
            user_id: The id of the user to filter by.
            component_id: The id of the component to filter by.
            user: The name/ID of the user to filter by.
            component: The name/ID of the component to filter by.
            name: The name of the stack to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of stacks.
        """
        stack_filter_model = StackFilter(
            page=page,
            size=size,
            sort_by=sort_by,
            logical_operator=logical_operator,
            workspace_id=workspace_id,
            user_id=user_id,
            component_id=component_id,
            user=user,
            component=component,
            name=name,
            description=description,
            id=id,
            created=created,
            updated=updated,
        )
        stack_filter_model.set_scope_workspace(self.active_workspace.id)
        return self.zen_store.list_stacks(stack_filter_model, hydrate=hydrate)

    def update_stack(
        self,
        name_id_or_prefix: Optional[Union[UUID, str]] = None,
        name: Optional[str] = None,
        stack_spec_file: Optional[str] = None,
        labels: Optional[Dict[str, Any]] = None,
        description: Optional[str] = None,
        component_updates: Optional[
            Dict[StackComponentType, List[Union[UUID, str]]]
        ] = None,
    ) -> StackResponse:
        """Updates a stack and its components.

        Args:
            name_id_or_prefix: The name, id or prefix of the stack to update.
            name: the new name of the stack.
            stack_spec_file: path to the stack spec file.
            labels: The new labels of the stack component.
            description: the new description of the stack.
            component_updates: dictionary which maps stack component types to
                lists of new stack component names or ids.

        Returns:
            The model of the updated stack.

        Raises:
            EntityExistsError: If the stack name is already taken.
        """
        # First, get the stack
        stack = self.get_stack(
            name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
        )

        # Create the update model
        update_model = StackUpdate(
            workspace=self.active_workspace.id,
            user=self.active_user.id,
            stack_spec_path=stack_spec_file,
        )

        if name:
            if self.list_stacks(name=name):
                raise EntityExistsError(
                    "There are already existing stacks with the name "
                    f"'{name}'."
                )

            update_model.name = name

        if description:
            update_model.description = description

        # Get the current components
        if component_updates:
            components_dict = stack.components.copy()

            for component_type, component_id_list in component_updates.items():
                if component_id_list is not None:
                    components_dict[component_type] = [
                        self.get_stack_component(
                            name_id_or_prefix=component_id,
                            component_type=component_type,
                        )
                        for component_id in component_id_list
                    ]

            update_model.components = {
                c_type: [c.id for c in c_list]
                for c_type, c_list in components_dict.items()
            }

        if labels is not None:
            existing_labels = stack.labels or {}
            existing_labels.update(labels)

            existing_labels = {
                k: v for k, v in existing_labels.items() if v is not None
            }
            update_model.labels = existing_labels

        updated_stack = self.zen_store.update_stack(
            stack_id=stack.id,
            stack_update=update_model,
        )
        if updated_stack.id == self.active_stack_model.id:
            if self._config:
                self._config.set_active_stack(updated_stack)
            else:
                GlobalConfiguration().set_active_stack(updated_stack)
        return updated_stack

    def delete_stack(
        self, name_id_or_prefix: Union[str, UUID], recursive: bool = False
    ) -> None:
        """Deregisters a stack.

        Args:
            name_id_or_prefix: The name, id or prefix of the stack
                to deregister.
            recursive: If `True`, all components of the stack which are not
                associated with any other stack will also be deleted.

        Raises:
            ValueError: If the stack is the currently active stack for this
                client.
        """
        stack = self.get_stack(
            name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
        )

        if stack.id == self.active_stack_model.id:
            raise ValueError(
                f"Unable to deregister active stack '{stack.name}'. Make "
                f"sure to designate a new active stack before deleting this "
                f"one."
            )

        cfg = GlobalConfiguration()
        if stack.id == cfg.active_stack_id:
            raise ValueError(
                f"Unable to deregister '{stack.name}' as it is the active "
                f"stack within your global configuration. Make "
                f"sure to designate a new active stack before deleting this "
                f"one."
            )

        if recursive:
            stack_components_free_for_deletion = []

            # Get all stack components associated with this stack
            for component_type, component_model in stack.components.items():
                # Get stack associated with the stack component

                stacks = self.list_stacks(
                    component_id=component_model[0].id, size=2, page=1
                )

                # Only mark the component for deletion if it is used
                # exclusively by this stack
                if len(stacks) == 1 and stack.id == stacks[0].id:
                    stack_components_free_for_deletion.append(
                        (component_type, component_model)
                    )

            self.delete_stack(stack.id)

            for (
                stack_component_type,
                stack_component_model,
            ) in stack_components_free_for_deletion:
                self.delete_stack_component(
                    stack_component_model[0].name, stack_component_type
                )

            logger.info("Deregistered stack with name '%s'.", stack.name)
            return

        self.zen_store.delete_stack(stack_id=stack.id)
        logger.info("Deregistered stack with name '%s'.", stack.name)
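
    # Usage sketch (illustrative): deregistering a stack and, with
    # `recursive=True`, any of its components that are not used by another
    # stack. The stack name is a hypothetical placeholder.
    #
    #     client = Client()
    #     client.delete_stack("old-stack", recursive=True)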

    @property
    def active_stack(self) -> "Stack":
        """The active stack for this client.

        Returns:
            The active stack for this client.
        """
        from zenml.stack.stack import Stack

        return Stack.from_model(self.active_stack_model)

    @property
    def active_stack_model(self) -> StackResponse:
        """The model of the active stack for this client.

        If no active stack is configured locally for the client, the active
        stack in the global configuration is used instead.

        Returns:
            The model of the active stack for this client.

        Raises:
            RuntimeError: If the active stack is not set.
        """
        if env_stack_id := os.environ.get(ENV_ZENML_ACTIVE_STACK_ID):
            if not self._active_stack or self._active_stack.id != UUID(
                env_stack_id
            ):
                self._active_stack = self.get_stack(env_stack_id)

            return self._active_stack

        stack_id: Optional[UUID] = None

        if self._config:
            if self._config._active_stack:
                return self._config._active_stack

            stack_id = self._config.active_stack_id

        if not stack_id:
            # Initialize the zen store so the global config loads the active
            # stack
            _ = GlobalConfiguration().zen_store
            if active_stack := GlobalConfiguration()._active_stack:
                return active_stack

            stack_id = GlobalConfiguration().get_active_stack_id()

        if not stack_id:
            raise RuntimeError(
                "No active stack is configured. Run "
                "`zenml stack set STACK_NAME` to set the active stack."
            )

        return self.get_stack(stack_id)
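
    # Usage sketch (illustrative): pinning the active stack via the
    # environment, which takes precedence over the local and global
    # configuration as shown above. This assumes `ENV_ZENML_ACTIVE_STACK_ID`
    # resolves to the variable name "ZENML_ACTIVE_STACK_ID"; the UUID is a
    # hypothetical placeholder.
    #
    #     import os
    #
    #     os.environ["ZENML_ACTIVE_STACK_ID"] = "11111111-2222-3333-4444-555555555555"
    #     stack_model = Client().active_stack_model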

    def activate_stack(
        self, stack_name_id_or_prefix: Union[str, UUID]
    ) -> None:
        """Sets the stack as active.

        Args:
            stack_name_id_or_prefix: The name, ID or prefix of the stack to
                activate.

        Raises:
            KeyError: If the stack is not registered.
        """
        # Make sure the stack is registered
        try:
            stack = self.get_stack(name_id_or_prefix=stack_name_id_or_prefix)
        except KeyError as e:
            raise KeyError(
                f"Stack '{stack_name_id_or_prefix}' cannot be activated since "
                f"it is not registered yet. Please register it first."
            ) from e

        if self._config:
            self._config.set_active_stack(stack=stack)

        else:
            # set the active stack globally only if the client doesn't use
            # a local configuration
            GlobalConfiguration().set_active_stack(stack=stack)

    def _validate_stack_configuration(self, stack: StackRequest) -> None:
        """Validates the configuration of a stack.

        Args:
            stack: The stack to validate.

        Raises:
            ValidationError: If the stack configuration is invalid.
        """
        local_components: List[str] = []
        remote_components: List[str] = []
        assert stack.components is not None
        for component_type, components in stack.components.items():
            component_flavor: Union[FlavorResponse, str]

            for component in components:
                if isinstance(component, UUID):
                    component_response = self.get_stack_component(
                        name_id_or_prefix=component,
                        component_type=component_type,
                    )
                    component_config = component_response.configuration
                    component_flavor = component_response.flavor
                else:
                    component_config = component.configuration
                    component_flavor = component.flavor

                # Create and validate the configuration
                from zenml.stack.utils import (
                    validate_stack_component_config,
                    warn_if_config_server_mismatch,
                )

                configuration = validate_stack_component_config(
                    configuration_dict=component_config,
                    flavor=component_flavor,
                    component_type=component_type,
                    # Always enforce validation of custom flavors
                    validate_custom_flavors=True,
                )
                # Guaranteed to not be None by setting
                # `validate_custom_flavors=True` above
                assert configuration is not None
                warn_if_config_server_mismatch(configuration)
                flavor_name = (
                    component_flavor.name
                    if isinstance(component_flavor, FlavorResponse)
                    else component_flavor
                )
                if configuration.is_local:
                    local_components.append(
                        f"{component_type.value}: {flavor_name}"
                    )
                elif configuration.is_remote:
                    remote_components.append(
                        f"{component_type.value}: {flavor_name}"
                    )

        if local_components and remote_components:
            logger.warning(
                f"You are configuring a stack that is composed of components "
                f"that are relying on local resources "
                f"({', '.join(local_components)}) as well as "
                f"components that are running remotely "
                f"({', '.join(remote_components)}). This is not recommended as "
                f"it can lead to unexpected behavior, especially if the remote "
                f"components need to access the local resources. Please make "
                f"sure that your stack is configured correctly, or try to use "
                f"component flavors or configurations that do not require "
                f"local resources."
            )

        if not stack.is_valid:
            raise ValidationError(
                "Stack configuration is invalid. A valid"
                "stack must contain an Artifact Store and "
                "an Orchestrator."
            )

    # ----------------------------- Services -----------------------------------

    def create_service(
        self,
        config: ServiceConfig,
        service_type: ServiceType,
        model_version_id: Optional[UUID] = None,
    ) -> ServiceResponse:
        """Registers a service.

        Args:
            config: The configuration of the service.
            service_type: The type of the service.
            model_version_id: The ID of the model version to associate with the
                service.

        Returns:
            The registered service.
        """
        service_request = ServiceRequest(
            name=config.service_name,
            service_type=service_type,
            config=config.model_dump(),
            workspace=self.active_workspace.id,
            user=self.active_user.id,
            model_version_id=model_version_id,
        )
        # Register the service
        return self.zen_store.create_service(service_request)
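
    # Usage sketch (illustrative): registering a service from an existing
    # service configuration. `my_config` and `my_service_type` stand in for a
    # concrete ServiceConfig and ServiceType instance; imports are omitted.
    #
    #     client = Client()
    #     service = client.create_service(
    #         config=my_config,
    #         service_type=my_service_type,
    #     )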

    def get_service(
        self,
        name_id_or_prefix: Union[str, UUID],
        allow_name_prefix_match: bool = True,
        hydrate: bool = True,
        type: Optional[str] = None,
    ) -> ServiceResponse:
        """Gets a service.

        Args:
            name_id_or_prefix: The name or ID of the service.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.
            type: The type of the service.

        Returns:
            The Service
        """

        def type_scoped_list_method(
            hydrate: bool = True,
            **kwargs: Any,
        ) -> Page[ServiceResponse]:
            """Call `zen_store.list_services` with type scoping.

            Args:
                hydrate: Flag deciding whether to hydrate the output model(s)
                    by including metadata fields in the response.
                **kwargs: Keyword arguments to pass to `ServiceFilterModel`.

            Returns:
                The type-scoped list of services.
            """
            service_filter_model = ServiceFilter(**kwargs)
            if type:
                service_filter_model.set_type(type=type)
            service_filter_model.set_scope_workspace(self.active_workspace.id)
            return self.zen_store.list_services(
                filter_model=service_filter_model,
                hydrate=hydrate,
            )

        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_service,
            list_method=type_scoped_list_method,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
        )

    def list_services(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[datetime] = None,
        updated: Optional[datetime] = None,
        type: Optional[str] = None,
        flavor: Optional[str] = None,
        user: Optional[Union[UUID, str]] = None,
        workspace_id: Optional[Union[str, UUID]] = None,
        user_id: Optional[Union[str, UUID]] = None,
        hydrate: bool = False,
        running: Optional[bool] = None,
        service_name: Optional[str] = None,
        pipeline_name: Optional[str] = None,
        pipeline_run_id: Optional[str] = None,
        pipeline_step_name: Optional[str] = None,
        model_version_id: Optional[Union[str, UUID]] = None,
        config: Optional[Dict[str, Any]] = None,
    ) -> Page[ServiceResponse]:
        """List all services.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of services to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            type: Use the service type for filtering
            flavor: Use the service flavor for filtering
            workspace_id: The id of the workspace to filter by.
            user_id: The id of the user to filter by.
            user: Filter by user name/ID.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.
            running: Use the running status for filtering
            pipeline_name: Use the pipeline name for filtering
            service_name: Use the service name or model name
                for filtering
            pipeline_step_name: Use the pipeline step name for filtering
            model_version_id: Use the model version id for filtering
            config: Use the config for filtering
            pipeline_run_id: Use the pipeline run id for filtering

        Returns:
            The Service response page.
        """
        service_filter_model = ServiceFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            type=type,
            flavor=flavor,
            workspace_id=workspace_id,
            user_id=user_id,
            user=user,
            running=running,
            name=service_name,
            pipeline_name=pipeline_name,
            pipeline_step_name=pipeline_step_name,
            model_version_id=model_version_id,
            pipeline_run_id=pipeline_run_id,
            config=dict_to_bytes(config) if config else None,
        )
        service_filter_model.set_scope_workspace(self.active_workspace.id)
        return self.zen_store.list_services(
            filter_model=service_filter_model, hydrate=hydrate
        )

    def update_service(
        self,
        id: UUID,
        name: Optional[str] = None,
        service_source: Optional[str] = None,
        admin_state: Optional[ServiceState] = None,
        status: Optional[Dict[str, Any]] = None,
        endpoint: Optional[Dict[str, Any]] = None,
        labels: Optional[Dict[str, str]] = None,
        prediction_url: Optional[str] = None,
        health_check_url: Optional[str] = None,
        model_version_id: Optional[UUID] = None,
    ) -> ServiceResponse:
        """Update a service.

        Args:
            id: The ID of the service to update.
            name: The new name of the service.
            admin_state: The new admin state of the service.
            status: The new status of the service.
            endpoint: The new endpoint of the service.
            service_source: The new service source of the service.
            labels: The new labels of the service.
            prediction_url: The new prediction url of the service.
            health_check_url: The new health check url of the service.
            model_version_id: The new model version id of the service.

        Returns:
            The updated service.
        """
        service_update = ServiceUpdate()
        if name:
            service_update.name = name
        if service_source:
            service_update.service_source = service_source
        if admin_state:
            service_update.admin_state = admin_state
        if status:
            service_update.status = status
        if endpoint:
            service_update.endpoint = endpoint
        if labels:
            service_update.labels = labels
        if prediction_url:
            service_update.prediction_url = prediction_url
        if health_check_url:
            service_update.health_check_url = health_check_url
        if model_version_id:
            service_update.model_version_id = model_version_id
        return self.zen_store.update_service(
            service_id=id, update=service_update
        )

    def delete_service(self, name_id_or_prefix: UUID) -> None:
        """Delete a service.

        Args:
            name_id_or_prefix: The name or ID of the service to delete.
        """
        service = self.get_service(
            name_id_or_prefix,
            allow_name_prefix_match=False,
        )
        self.zen_store.delete_service(service_id=service.id)

    # -------------------------------- Components ------------------------------

    def get_stack_component(
        self,
        component_type: StackComponentType,
        name_id_or_prefix: Optional[Union[str, UUID]] = None,
        allow_name_prefix_match: bool = True,
        hydrate: bool = True,
    ) -> ComponentResponse:
        """Fetches a registered stack component.

        If the name_id_or_prefix is provided, it will try to fetch the component
        with the corresponding identifier. If not, it will try to fetch the
        active component of the given type.

        Args:
            component_type: The type of the component to fetch
            name_id_or_prefix: The name, ID or prefix of the component to
                fetch.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The registered stack component.

        Raises:
            KeyError: If no name_id_or_prefix is provided and no such component
                is part of the active stack.
        """
        # If no `name_id_or_prefix` provided, try to get the active component.
        if not name_id_or_prefix:
            components = self.active_stack_model.components.get(
                component_type, None
            )
            if components:
                return components[0]
            raise KeyError(
                "No name_id_or_prefix provided and there is no active "
                f"{component_type} in the current active stack."
            )

        # Else, try to fetch the component with an explicit type filter
        def type_scoped_list_method(
            hydrate: bool = False,
            **kwargs: Any,
        ) -> Page[ComponentResponse]:
            """Call `zen_store.list_stack_components` with type scoping.

            Args:
                hydrate: Flag deciding whether to hydrate the output model(s)
                    by including metadata fields in the response.
                **kwargs: Keyword arguments to pass to `ComponentFilterModel`.

            Returns:
                The type-scoped list of components.
            """
            component_filter_model = ComponentFilter(**kwargs)
            component_filter_model.set_scope_type(
                component_type=component_type
            )
            component_filter_model.set_scope_workspace(
                self.active_workspace.id
            )
            return self.zen_store.list_stack_components(
                component_filter_model=component_filter_model,
                hydrate=hydrate,
            )

        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_stack_component,
            list_method=type_scoped_list_method,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
        )
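
    # Usage sketch (illustrative): fetching the active orchestrator (no
    # identifier given) versus a named artifact store. Names are hypothetical.
    #
    #     from zenml.enums import StackComponentType
    #
    #     client = Client()
    #     orchestrator = client.get_stack_component(StackComponentType.ORCHESTRATOR)
    #     store = client.get_stack_component(
    #         StackComponentType.ARTIFACT_STORE, name_id_or_prefix="s3-store"
    #     )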

    def list_stack_components(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[datetime] = None,
        updated: Optional[datetime] = None,
        name: Optional[str] = None,
        flavor: Optional[str] = None,
        type: Optional[str] = None,
        workspace_id: Optional[Union[str, UUID]] = None,
        user_id: Optional[Union[str, UUID]] = None,
        connector_id: Optional[Union[str, UUID]] = None,
        stack_id: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[ComponentResponse]:
        """Lists all registered stack components.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of the component to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            flavor: Use the component flavor for filtering
            type: Use the component type for filtering
            workspace_id: The id of the workspace to filter by.
            user_id: The id of the user to filter by.
            connector_id: The id of the connector to filter by.
            stack_id: The id of the stack to filter by.
            name: The name of the component to filter by.
            user: The ID or name of the user to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of stack components.
        """
        component_filter_model = ComponentFilter(
            page=page,
            size=size,
            sort_by=sort_by,
            logical_operator=logical_operator,
            workspace_id=workspace_id or self.active_workspace.id,
            user_id=user_id,
            connector_id=connector_id,
            stack_id=stack_id,
            name=name,
            flavor=flavor,
            type=type,
            id=id,
            created=created,
            updated=updated,
            user=user,
        )
        component_filter_model.set_scope_workspace(self.active_workspace.id)

        return self.zen_store.list_stack_components(
            component_filter_model=component_filter_model, hydrate=hydrate
        )

    def create_stack_component(
        self,
        name: str,
        flavor: str,
        component_type: StackComponentType,
        configuration: Dict[str, str],
        labels: Optional[Dict[str, Any]] = None,
    ) -> "ComponentResponse":
        """Registers a stack component.

        Args:
            name: The name of the stack component.
            flavor: The flavor of the stack component.
            component_type: The type of the stack component.
            configuration: The configuration of the stack component.
            labels: The labels of the stack component.

        Returns:
            The model of the registered component.
        """
        from zenml.stack.utils import (
            validate_stack_component_config,
            warn_if_config_server_mismatch,
        )

        validated_config = validate_stack_component_config(
            configuration_dict=configuration,
            flavor=flavor,
            component_type=component_type,
            # Always enforce validation of custom flavors
            validate_custom_flavors=True,
        )
        # Guaranteed to not be None by setting
        # `validate_custom_flavors=True` above
        assert validated_config is not None
        warn_if_config_server_mismatch(validated_config)

        create_component_model = ComponentRequest(
            name=name,
            type=component_type,
            flavor=flavor,
            configuration=configuration,
            user=self.active_user.id,
            workspace=self.active_workspace.id,
            labels=labels,
        )

        # Register the new model
        return self.zen_store.create_stack_component(
            component=create_component_model
        )
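
    # Usage sketch (illustrative): registering a local artifact store
    # component. The flavor name and configuration keys are hypothetical.
    #
    #     from zenml.enums import StackComponentType
    #
    #     client = Client()
    #     component = client.create_stack_component(
    #         name="local-store",
    #         flavor="local",
    #         component_type=StackComponentType.ARTIFACT_STORE,
    #         configuration={"path": "/tmp/zenml-artifacts"},
    #     )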

    def update_stack_component(
        self,
        name_id_or_prefix: Optional[Union[UUID, str]],
        component_type: StackComponentType,
        name: Optional[str] = None,
        configuration: Optional[Dict[str, Any]] = None,
        labels: Optional[Dict[str, Any]] = None,
        disconnect: Optional[bool] = None,
        connector_id: Optional[UUID] = None,
        connector_resource_id: Optional[str] = None,
    ) -> ComponentResponse:
        """Updates a stack component.

        Args:
            name_id_or_prefix: The name, id or prefix of the stack component to
                update.
            component_type: The type of the stack component to update.
            name: The new name of the stack component.
            configuration: The new configuration of the stack component.
            labels: The new labels of the stack component.
            disconnect: Whether to disconnect the stack component from its
                service connector.
            connector_id: The new connector id of the stack component.
            connector_resource_id: The new connector resource id of the
                stack component.

        Returns:
            The updated stack component.

        Raises:
            EntityExistsError: If the new name is already taken.
        """
        # Get the existing component model
        component = self.get_stack_component(
            name_id_or_prefix=name_id_or_prefix,
            component_type=component_type,
            allow_name_prefix_match=False,
        )

        update_model = ComponentUpdate(
            workspace=self.active_workspace.id,
            user=self.active_user.id,
        )

        if name is not None:
            existing_components = self.list_stack_components(
                name=name,
                type=component_type,
            )
            if existing_components.total > 0:
                raise EntityExistsError(
                    f"There are already existing components with the "
                    f"name '{name}'."
                )
            update_model.name = name

        if configuration is not None:
            existing_configuration = component.configuration
            existing_configuration.update(configuration)
            existing_configuration = {
                k: v
                for k, v in existing_configuration.items()
                if v is not None
            }

            from zenml.stack.utils import (
                validate_stack_component_config,
                warn_if_config_server_mismatch,
            )

            validated_config = validate_stack_component_config(
                configuration_dict=existing_configuration,
                flavor=component.flavor,
                component_type=component.type,
                # Always enforce validation of custom flavors
                validate_custom_flavors=True,
            )
            # Guaranteed to not be None by setting
            # `validate_custom_flavors=True` above
            assert validated_config is not None
            warn_if_config_server_mismatch(validated_config)

            update_model.configuration = existing_configuration

        if labels is not None:
            existing_labels = component.labels or {}
            existing_labels.update(labels)

            existing_labels = {
                k: v for k, v in existing_labels.items() if v is not None
            }
            update_model.labels = existing_labels

        if disconnect:
            update_model.connector = None
            update_model.connector_resource_id = None
        else:
            existing_component = self.get_stack_component(
                name_id_or_prefix=name_id_or_prefix,
                component_type=component_type,
                allow_name_prefix_match=False,
            )
            update_model.connector = connector_id
            update_model.connector_resource_id = connector_resource_id
            if connector_id is None and existing_component.connector:
                update_model.connector = existing_component.connector.id
                update_model.connector_resource_id = (
                    existing_component.connector_resource_id
                )

        # Send the updated component to the ZenStore
        return self.zen_store.update_stack_component(
            component_id=component.id,
            component_update=update_model,
        )

    def delete_stack_component(
        self,
        name_id_or_prefix: Union[str, UUID],
        component_type: StackComponentType,
    ) -> None:
        """Deletes a registered stack component.

        Args:
            name_id_or_prefix: The name, id or prefix of the component
                to delete.
            component_type: The type of the component to delete.
        """
        component = self.get_stack_component(
            name_id_or_prefix=name_id_or_prefix,
            component_type=component_type,
            allow_name_prefix_match=False,
        )

        self.zen_store.delete_stack_component(component_id=component.id)
        logger.info(
            "Deregistered stack component (type: %s) with name '%s'.",
            component.type,
            component.name,
        )

    # --------------------------------- Flavors --------------------------------

    def create_flavor(
        self,
        source: str,
        component_type: StackComponentType,
    ) -> FlavorResponse:
        """Creates a new flavor.

        Args:
            source: The flavor to create.
            component_type: The type of the flavor.

        Returns:
            The created flavor (in model form).

        Raises:
            ValueError: in case the config_schema of the flavor is too large.
        """
        from zenml.stack.flavor import validate_flavor_source

        flavor = validate_flavor_source(
            source=source, component_type=component_type
        )()

        if len(flavor.config_schema) > TEXT_FIELD_MAX_LENGTH:
            raise ValueError(
                "Json representation of configuration schema"
                "exceeds max length. This could be caused by an"
                "overly long docstring on the flavors "
                "configuration class' docstring."
            )

        flavor_request = flavor.to_model(integration="custom", is_custom=True)
        return self.zen_store.create_flavor(flavor=flavor_request)
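
A minimal sketch of registering a custom flavor via the client; the source path below is a placeholder for wherever your flavor class lives:

```python
from zenml.client import Client
from zenml.enums import StackComponentType

# The source string is a placeholder module path pointing at a custom
# flavor class in your own code base.
flavor = Client().create_flavor(
    source="my_flavors.my_orchestrator_flavor.MyOrchestratorFlavor",
    component_type=StackComponentType.ORCHESTRATOR,
)
print(flavor.name)
```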

    def get_flavor(
        self,
        name_id_or_prefix: str,
        allow_name_prefix_match: bool = True,
        hydrate: bool = True,
    ) -> FlavorResponse:
        """Get a stack component flavor.

        Args:
            name_id_or_prefix: The name, ID or prefix to the id of the flavor
                to get.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The stack component flavor.
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_flavor,
            list_method=self.list_flavors,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
        )

    def list_flavors(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[datetime] = None,
        updated: Optional[datetime] = None,
        name: Optional[str] = None,
        type: Optional[str] = None,
        integration: Optional[str] = None,
        user_id: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[FlavorResponse]:
        """Fetches all the flavor models.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of flavors to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            user_id: The id of the user to filter by.
            user: Filter by user name/ID.
            name: The name of the flavor to filter by.
            type: The type of the flavor to filter by.
            integration: The integration of the flavor to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A list of all the flavor models.
        """
        flavor_filter_model = FlavorFilter(
            page=page,
            size=size,
            sort_by=sort_by,
            logical_operator=logical_operator,
            user_id=user_id,
            user=user,
            name=name,
            type=type,
            integration=integration,
            id=id,
            created=created,
            updated=updated,
        )
        flavor_filter_model.set_scope_workspace(self.active_workspace.id)
        return self.zen_store.list_flavors(
            flavor_filter_model=flavor_filter_model, hydrate=hydrate
        )
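
For example, listing the newest orchestrator flavors might look like this sketch (the filter values are illustrative):

```python
from zenml.client import Client

# List the first page of orchestrator flavors, newest first.
flavors = Client().list_flavors(
    type="orchestrator",
    sort_by="desc:created",
    size=20,
)
for flavor in flavors.items:
    print(flavor.name, flavor.integration)
```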

    def delete_flavor(self, name_id_or_prefix: str) -> None:
        """Deletes a flavor.

        Args:
            name_id_or_prefix: The name, id or prefix of the id for the
                flavor to delete.
        """
        flavor = self.get_flavor(
            name_id_or_prefix, allow_name_prefix_match=False
        )
        self.zen_store.delete_flavor(flavor_id=flavor.id)

        logger.info(f"Deleted flavor '{flavor.name}' of type '{flavor.type}'.")

    def get_flavors_by_type(
        self, component_type: "StackComponentType"
    ) -> Page[FlavorResponse]:
        """Fetches the list of flavor for a stack component type.

        Args:
            component_type: The type of the component to fetch.

        Returns:
            The list of flavors.
        """
        logger.debug(f"Fetching the flavors of type {component_type}.")

        return self.list_flavors(
            type=component_type,
        )

    def get_flavor_by_name_and_type(
        self, name: str, component_type: "StackComponentType"
    ) -> FlavorResponse:
        """Fetches a registered flavor.

        Args:
            name: The name of the flavor to fetch.
            component_type: The type of the component to fetch.

        Returns:
            The registered flavor.

        Raises:
            KeyError: If no flavor exists for the given type and name.
        """
        logger.debug(
            f"Fetching the flavor of type {component_type} with name {name}."
        )

        if not (
            flavors := self.list_flavors(
                type=component_type, name=name, hydrate=True
            ).items
        ):
            raise KeyError(
                f"No flavor with name '{name}' and type '{component_type}' "
                "exists."
            )
        if len(flavors) > 1:
            raise KeyError(
                f"More than one flavor with name {name} and type "
                f"{component_type} exists."
            )

        return flavors[0]
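
A short sketch of looking up a flavor by name and type while handling the `KeyError` raised when nothing (or more than one flavor) matches; the flavor name is illustrative:

```python
from zenml.client import Client
from zenml.enums import StackComponentType

client = Client()
try:
    flavor = client.get_flavor_by_name_and_type(
        name="local",
        component_type=StackComponentType.ORCHESTRATOR,
    )
except KeyError:
    flavor = None  # no single flavor matched the query
print(flavor)
```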

    # ------------------------------- Pipelines --------------------------------

    def list_pipelines(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        latest_run_status: Optional[str] = None,
        workspace_id: Optional[Union[str, UUID]] = None,
        user_id: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        tag: Optional[str] = None,
        hydrate: bool = False,
    ) -> Page[PipelineResponse]:
        """List all pipelines.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of pipeline to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            name: The name of the pipeline to filter by.
            latest_run_status: Filter by the status of the latest run of a
                pipeline.
            workspace_id: The id of the workspace to filter by.
            user_id: The id of the user to filter by.
            user: The name/ID of the user to filter by.
            tag: Tag to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page with pipelines fitting the filter description
        """
        pipeline_filter_model = PipelineFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            name=name,
            latest_run_status=latest_run_status,
            workspace_id=workspace_id,
            user_id=user_id,
            user=user,
            tag=tag,
        )
        pipeline_filter_model.set_scope_workspace(self.active_workspace.id)
        return self.zen_store.list_pipelines(
            pipeline_filter_model=pipeline_filter_model,
            hydrate=hydrate,
        )

    def get_pipeline(
        self,
        name_id_or_prefix: Union[str, UUID],
        hydrate: bool = True,
    ) -> PipelineResponse:
        """Get a pipeline by name, id or prefix.

        Args:
            name_id_or_prefix: The name, ID or ID prefix of the pipeline.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The pipeline.
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_pipeline,
            list_method=self.list_pipelines,
            name_id_or_prefix=name_id_or_prefix,
            hydrate=hydrate,
        )

    def delete_pipeline(
        self,
        name_id_or_prefix: Union[str, UUID],
    ) -> None:
        """Delete a pipeline.

        Args:
            name_id_or_prefix: The name, ID or ID prefix of the pipeline.
        """
        pipeline = self.get_pipeline(name_id_or_prefix=name_id_or_prefix)
        self.zen_store.delete_pipeline(pipeline_id=pipeline.id)
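
Fetching and deleting a pipeline can be combined as in the following sketch; the pipeline name is a placeholder and any name, ID or ID prefix works:

```python
from zenml.client import Client

client = Client()

# The pipeline name is a placeholder; an ID or ID prefix also works.
pipeline = client.get_pipeline("training_pipeline")
print(pipeline.id)

client.delete_pipeline("training_pipeline")
```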

    @_fail_for_sql_zen_store
    def trigger_pipeline(
        self,
        pipeline_name_or_id: Union[str, UUID, None] = None,
        run_configuration: Union[
            PipelineRunConfiguration, Dict[str, Any], None
        ] = None,
        config_path: Optional[str] = None,
        template_id: Optional[UUID] = None,
        stack_name_or_id: Union[str, UUID, None] = None,
        synchronous: bool = False,
    ) -> PipelineRunResponse:
        """Trigger a pipeline from the server.

        Usage examples:
        * Run the latest runnable template for a pipeline:
        ```python
        Client().trigger_pipeline(pipeline_name_or_id=<NAME>)
        ```
        * Run the latest runnable template for a pipeline on a specific stack:
        ```python
        Client().trigger_pipeline(
            pipeline_name_or_id=<NAME>,
            stack_name_or_id=<STACK_NAME_OR_ID>
        )
        ```
        * Run a specific template:
        ```python
        Client().trigger_pipeline(template_id=<ID>)
        ```

        Args:
            pipeline_name_or_id: Name or ID of the pipeline. If this is
                specified, the latest runnable template for this pipeline will
                be used for the run (Runnable here means that the build
                associated with the template is for a remote stack without any
                custom flavor stack components). If not given, a template ID
                that should be run needs to be specified.
            run_configuration: Configuration for the run. Either this or a
                path to a config file can be specified.
            config_path: Path to a YAML configuration file. This file will be
                parsed as a `PipelineRunConfiguration` object. Either this or
                the configuration in code can be specified.
            template_id: ID of the template to run. Either this or a pipeline
                can be specified.
            stack_name_or_id: Name or ID of the stack on which to run the
                pipeline. If not specified, this method will try to find a
                runnable template on any stack.
            synchronous: If `True`, this method will wait until the triggered
                run is finished.

        Raises:
            RuntimeError: If triggering the pipeline failed.

        Returns:
            Model of the pipeline run.
        """
        from zenml.pipelines.run_utils import (
            validate_run_config_is_runnable_from_server,
            validate_stack_is_runnable_from_server,
            wait_for_pipeline_run_to_finish,
        )

        if Counter([template_id, pipeline_name_or_id])[None] != 1:
            raise RuntimeError(
                "You need to specify exactly one of pipeline or template "
                "to trigger."
            )

        if run_configuration and config_path:
            raise RuntimeError(
                "Only config path or runtime configuration can be specified."
            )

        if config_path:
            run_configuration = PipelineRunConfiguration.from_yaml(config_path)

        if isinstance(run_configuration, Dict):
            run_configuration = PipelineRunConfiguration.model_validate(
                run_configuration
            )

        if run_configuration:
            validate_run_config_is_runnable_from_server(run_configuration)

        if template_id:
            if stack_name_or_id:
                logger.warning(
                    "Template ID and stack specified, ignoring the stack and "
                    "using stack associated with the template instead."
                )

            run = self.zen_store.run_template(
                template_id=template_id,
                run_configuration=run_configuration,
            )
        else:
            assert pipeline_name_or_id
            pipeline = self.get_pipeline(name_id_or_prefix=pipeline_name_or_id)

            stack = None
            if stack_name_or_id:
                stack = self.get_stack(
                    stack_name_or_id, allow_name_prefix_match=False
                )
                validate_stack_is_runnable_from_server(
                    zen_store=self.zen_store, stack=stack
                )

            templates = depaginate(
                self.list_run_templates,
                pipeline_id=pipeline.id,
                stack_id=stack.id if stack else None,
            )

            for template in templates:
                if not template.build:
                    continue

                stack = template.build.stack
                if not stack:
                    continue

                try:
                    validate_stack_is_runnable_from_server(
                        zen_store=self.zen_store, stack=stack
                    )
                except ValueError:
                    continue

                run = self.zen_store.run_template(
                    template_id=template.id,
                    run_configuration=run_configuration,
                )
                break
            else:
                raise RuntimeError(
                    "Unable to find a runnable template for the given stack "
                    "and pipeline."
                )

        if synchronous:
            run = wait_for_pipeline_run_to_finish(run_id=run.id)

        return run
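
In addition to the docstring examples above, a run configuration can be passed inline as a dictionary, which is validated as a `PipelineRunConfiguration`. The pipeline name and the `run_name` key below are assumptions used only for illustration:

```python
from zenml.client import Client

# Trigger the latest runnable template of a pipeline and block until
# the run has finished.
run = Client().trigger_pipeline(
    pipeline_name_or_id="training_pipeline",
    run_configuration={"run_name": "triggered_from_client"},
    synchronous=True,
)
print(run.status)
```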

    # -------------------------------- Builds ----------------------------------

    def get_build(
        self,
        id_or_prefix: Union[str, UUID],
        hydrate: bool = True,
    ) -> PipelineBuildResponse:
        """Get a build by id or prefix.

        Args:
            id_or_prefix: The id or id prefix of the build.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The build.

        Raises:
            KeyError: If no build was found for the given id or prefix.
            ZenKeyError: If multiple builds were found that match the given
                id or prefix.
        """
        from zenml.utils.uuid_utils import is_valid_uuid

        # First interpret as full UUID
        if is_valid_uuid(id_or_prefix):
            if not isinstance(id_or_prefix, UUID):
                id_or_prefix = UUID(id_or_prefix, version=4)

            return self.zen_store.get_build(
                id_or_prefix,
                hydrate=hydrate,
            )

        entity = self.list_builds(
            id=f"startswith:{id_or_prefix}", hydrate=hydrate
        )

        # If only a single entity is found, return it.
        if entity.total == 1:
            return entity.items[0]

        # If no entity is found, raise an error.
        if entity.total == 0:
            raise KeyError(
                f"No builds have been found that have either an id or prefix "
                f"that matches the provided string '{id_or_prefix}'."
            )

        raise ZenKeyError(
            f"{entity.total} builds have been found that have "
            f"an ID that matches the provided "
            f"string '{id_or_prefix}':\n"
            f"{[entity.items]}.\n"
            f"Please use the id to uniquely identify "
            f"only one of the builds."
        )
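
Because both `KeyError` and `ZenKeyError` can be raised, callers may want to handle them explicitly, as in this sketch (the ID prefix is a placeholder, and `ZenKeyError` is assumed to be importable from `zenml.exceptions`):

```python
from zenml.client import Client
from zenml.exceptions import ZenKeyError

client = Client()
try:
    # A short ID prefix is usually enough to identify a build.
    build = client.get_build("3a7c9f12")
except KeyError:
    build = None  # no build matches the prefix
except ZenKeyError:
    build = None  # the prefix matches more than one build
```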

    def list_builds(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        workspace_id: Optional[Union[str, UUID]] = None,
        user_id: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        pipeline_id: Optional[Union[str, UUID]] = None,
        stack_id: Optional[Union[str, UUID]] = None,
        container_registry_id: Optional[Union[UUID, str]] = None,
        is_local: Optional[bool] = None,
        contains_code: Optional[bool] = None,
        zenml_version: Optional[str] = None,
        python_version: Optional[str] = None,
        checksum: Optional[str] = None,
        stack_checksum: Optional[str] = None,
        hydrate: bool = False,
    ) -> Page[PipelineBuildResponse]:
        """List all builds.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of build to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            workspace_id: The id of the workspace to filter by.
            user_id: The id of the user to filter by.
            user: Filter by user name/ID.
            pipeline_id: The id of the pipeline to filter by.
            stack_id: The id of the stack to filter by.
            container_registry_id: The id of the container registry to
                filter by.
            is_local: Use to filter local builds.
            contains_code: Use to filter builds that contain code.
            zenml_version: The version of ZenML to filter by.
            python_version: The Python version to filter by.
            checksum: The build checksum to filter by.
            stack_checksum: The stack checksum to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page with builds fitting the filter description
        """
        build_filter_model = PipelineBuildFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            workspace_id=workspace_id,
            user_id=user_id,
            user=user,
            pipeline_id=pipeline_id,
            stack_id=stack_id,
            container_registry_id=container_registry_id,
            is_local=is_local,
            contains_code=contains_code,
            zenml_version=zenml_version,
            python_version=python_version,
            checksum=checksum,
            stack_checksum=stack_checksum,
        )
        build_filter_model.set_scope_workspace(self.active_workspace.id)
        return self.zen_store.list_builds(
            build_filter_model=build_filter_model,
            hydrate=hydrate,
        )

    def delete_build(self, id_or_prefix: str) -> None:
        """Delete a build.

        Args:
            id_or_prefix: The id or id prefix of the build.
        """
        build = self.get_build(id_or_prefix=id_or_prefix)
        self.zen_store.delete_build(build_id=build.id)
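
A sketch of combining `list_builds` and `delete_build` to clean up the local builds of one pipeline; the pipeline ID is a placeholder:

```python
from zenml.client import Client

client = Client()

# The pipeline ID below is a placeholder.
local_builds = client.list_builds(
    pipeline_id="3a7c9f12-0000-0000-0000-000000000000",
    is_local=True,
)
for build in local_builds.items:
    client.delete_build(str(build.id))
```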

    # --------------------------------- Event Sources -------------------------

    @_fail_for_sql_zen_store
    def create_event_source(
        self,
        name: str,
        configuration: Dict[str, Any],
        flavor: str,
        event_source_subtype: PluginSubType,
        description: str = "",
    ) -> EventSourceResponse:
        """Registers an event source.

        Args:
            name: The name of the event source to create.
            configuration: Configuration for this event source.
            flavor: The flavor of event source.
            event_source_subtype: The event source subtype.
            description: The description of the event source.

        Returns:
            The model of the registered event source.
        """
        event_source = EventSourceRequest(
            name=name,
            configuration=configuration,
            description=description,
            flavor=flavor,
            plugin_type=PluginType.EVENT_SOURCE,
            plugin_subtype=event_source_subtype,
            user=self.active_user.id,
            workspace=self.active_workspace.id,
        )

        return self.zen_store.create_event_source(event_source=event_source)
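
A hedged sketch of registering an event source; the flavor, the `PluginSubType.WEBHOOK_EVENT_SOURCE` member and the configuration keys are assumptions that depend on which event source plugins your deployment has installed:

```python
from zenml.client import Client
from zenml.enums import PluginSubType

# Flavor, subtype and configuration keys are placeholders that depend
# on the installed event source plugins.
event_source = Client().create_event_source(
    name="repo_events",
    configuration={"some_config_key": "some_value"},
    flavor="github",
    event_source_subtype=PluginSubType.WEBHOOK_EVENT_SOURCE,
    description="Example event source",
)
print(event_source.id)
```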

    @_fail_for_sql_zen_store
    def get_event_source(
        self,
        name_id_or_prefix: Union[UUID, str],
        allow_name_prefix_match: bool = True,
        hydrate: bool = True,
    ) -> EventSourceResponse:
        """Get an event source by name, ID or prefix.

        Args:
            name_id_or_prefix: The name, ID or prefix of the event source.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The event_source.
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_event_source,
            list_method=self.list_event_sources,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
        )

    def list_event_sources(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[datetime] = None,
        updated: Optional[datetime] = None,
        name: Optional[str] = None,
        flavor: Optional[str] = None,
        event_source_type: Optional[str] = None,
        workspace_id: Optional[Union[str, UUID]] = None,
        user_id: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[EventSourceResponse]:
        """Lists all event_sources.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of event_sources to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            workspace_id: The id of the workspace to filter by.
            user_id: The id of the user to filter by.
            user: Filter by user name/ID.
            name: The name of the event_source to filter by.
            flavor: The flavor of the event_source to filter by.
            event_source_type: The subtype of the event_source to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of event_sources.
        """
        event_source_filter_model = EventSourceFilter(
            page=page,
            size=size,
            sort_by=sort_by,
            logical_operator=logical_operator,
            workspace_id=workspace_id,
            user_id=user_id,
            user=user,
            name=name,
            flavor=flavor,
            plugin_subtype=event_source_type,
            id=id,
            created=created,
            updated=updated,
        )
        event_source_filter_model.set_scope_workspace(self.active_workspace.id)
        return self.zen_store.list_event_sources(
            event_source_filter_model, hydrate=hydrate
        )

    @_fail_for_sql_zen_store
    def update_event_source(
        self,
        name_id_or_prefix: Union[UUID, str],
        name: Optional[str] = None,
        description: Optional[str] = None,
        configuration: Optional[Dict[str, Any]] = None,
        rotate_secret: Optional[bool] = None,
        is_active: Optional[bool] = None,
    ) -> EventSourceResponse:
        """Updates a event_source.

        Args:
            name_id_or_prefix: The name, id or prefix of the event_source to update.
            name: the new name of the event_source.
            description: the new description of the event_source.
            configuration: The event source configuration.
            rotate_secret: Whether to rotate the secret. If True, the response
                will contain the new secret value.
            is_active: Allows activating/deactivating the event source.

        Returns:
            The model of the updated event_source.

        Raises:
            EntityExistsError: If the event_source name is already taken.
        """
        # First, get the event_source to update
        event_source = self.get_event_source(
            name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
        )

        # Create the update model
        update_model = EventSourceUpdate(
            name=name,
            description=description,
            configuration=configuration,
            rotate_secret=rotate_secret,
            is_active=is_active,
        )

        if name:
            if self.list_event_sources(name=name):
                raise EntityExistsError(
                    "There are already existing event_sources with the name "
                    f"'{name}'."
                )

        updated_event_source = self.zen_store.update_event_source(
            event_source_id=event_source.id,
            event_source_update=update_model,
        )
        return updated_event_source

    @_fail_for_sql_zen_store
    def delete_event_source(self, name_id_or_prefix: Union[str, UUID]) -> None:
        """Deletes an event_source.

        Args:
            name_id_or_prefix: The name, id or prefix id of the event_source
                to deregister.
        """
        event_source = self.get_event_source(
            name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
        )

        self.zen_store.delete_event_source(event_source_id=event_source.id)
        logger.info("Deleted event_source with name '%s'.", event_source.name)

    # --------------------------------- Actions -------------------------

    @_fail_for_sql_zen_store
    def create_action(
        self,
        name: str,
        flavor: str,
        action_type: PluginSubType,
        configuration: Dict[str, Any],
        service_account_id: UUID,
        auth_window: Optional[int] = None,
        description: str = "",
    ) -> ActionResponse:
        """Create an action.

        Args:
            name: The name of the action.
            flavor: The flavor of the action.
            action_type: The action subtype.
            configuration: The action configuration.
            service_account_id: The service account that is used to execute the
                action.
            auth_window: The time window in minutes for which the service
                account is authorized to execute the action. Set this to 0 to
                authorize the service account indefinitely (not recommended).
            description: The description of the action.

        Returns:
            The created action
        """
        action = ActionRequest(
            name=name,
            description=description,
            flavor=flavor,
            plugin_subtype=action_type,
            configuration=configuration,
            service_account_id=service_account_id,
            auth_window=auth_window,
            user=self.active_user.id,
            workspace=self.active_workspace.id,
        )

        return self.zen_store.create_action(action=action)

    @_fail_for_sql_zen_store
    def get_action(
        self,
        name_id_or_prefix: Union[UUID, str],
        allow_name_prefix_match: bool = True,
        hydrate: bool = True,
    ) -> ActionResponse:
        """Get an action by name, ID or prefix.

        Args:
            name_id_or_prefix: The name, ID or prefix of the action.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The action.
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_action,
            list_method=self.list_actions,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
        )

    @_fail_for_sql_zen_store
    def list_actions(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[datetime] = None,
        updated: Optional[datetime] = None,
        name: Optional[str] = None,
        flavor: Optional[str] = None,
        action_type: Optional[str] = None,
        workspace_id: Optional[Union[str, UUID]] = None,
        user_id: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[ActionResponse]:
        """List actions.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of the action to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            workspace_id: The id of the workspace to filter by.
            user_id: The id of the user to filter by.
            user: Filter by user name/ID.
            name: The name of the action to filter by.
            flavor: The flavor of the action to filter by.
            action_type: The type of the action to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of actions.
        """
        filter_model = ActionFilter(
            page=page,
            size=size,
            sort_by=sort_by,
            logical_operator=logical_operator,
            workspace_id=workspace_id,
            user_id=user_id,
            user=user,
            name=name,
            id=id,
            flavor=flavor,
            plugin_subtype=action_type,
            created=created,
            updated=updated,
        )
        filter_model.set_scope_workspace(self.active_workspace.id)
        return self.zen_store.list_actions(filter_model, hydrate=hydrate)

    @_fail_for_sql_zen_store
    def update_action(
        self,
        name_id_or_prefix: Union[UUID, str],
        name: Optional[str] = None,
        description: Optional[str] = None,
        configuration: Optional[Dict[str, Any]] = None,
        service_account_id: Optional[UUID] = None,
        auth_window: Optional[int] = None,
    ) -> ActionResponse:
        """Update an action.

        Args:
            name_id_or_prefix: The name, id or prefix of the action to update.
            name: The new name of the action.
            description: The new description of the action.
            configuration: The new configuration of the action.
            service_account_id: The new service account that is used to execute
                the action.
            auth_window: The new time window in minutes for which the service
                account is authorized to execute the action. Set this to 0 to
                authorize the service account indefinitely (not recommended).

        Returns:
            The updated action.
        """
        action = self.get_action(
            name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
        )

        update_model = ActionUpdate(
            name=name,
            description=description,
            configuration=configuration,
            service_account_id=service_account_id,
            auth_window=auth_window,
        )

        return self.zen_store.update_action(
            action_id=action.id,
            action_update=update_model,
        )

    @_fail_for_sql_zen_store
    def delete_action(self, name_id_or_prefix: Union[str, UUID]) -> None:
        """Delete an action.

        Args:
            name_id_or_prefix: The name, id or prefix id of the action
                to delete.
        """
        action = self.get_action(
            name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
        )

        self.zen_store.delete_action(action_id=action.id)
        logger.info("Deleted action with name '%s'.", action.name)

    # --------------------------------- Triggers -------------------------

    @_fail_for_sql_zen_store
    def create_trigger(
        self,
        name: str,
        event_source_id: UUID,
        event_filter: Dict[str, Any],
        action_id: UUID,
        description: str = "",
    ) -> TriggerResponse:
        """Registers a trigger.

        Args:
            name: The name of the trigger to create.
            event_source_id: The ID of the event source.
            event_filter: The event filter configuration.
            action_id: The ID of the action that should be triggered.
            description: The description of the trigger.

        Returns:
            The created trigger.
        """
        trigger = TriggerRequest(
            name=name,
            description=description,
            event_source_id=event_source_id,
            event_filter=event_filter,
            action_id=action_id,
            user=self.active_user.id,
            workspace=self.active_workspace.id,
        )

        return self.zen_store.create_trigger(trigger=trigger)
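
A sketch of wiring an existing event source and action into a trigger; the names and the event filter keys are placeholders:

```python
from zenml.client import Client

client = Client()

# The event source and action are assumed to exist already; all names
# and filter keys below are placeholders.
event_source = client.get_event_source("repo_events")
action = client.get_action("run_training_template")

trigger = client.create_trigger(
    name="train_on_push",
    event_source_id=event_source.id,
    event_filter={"event_type": "push"},
    action_id=action.id,
    description="Run training whenever the repository receives a push",
)
```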

    @_fail_for_sql_zen_store
    def get_trigger(
        self,
        name_id_or_prefix: Union[UUID, str],
        allow_name_prefix_match: bool = True,
        hydrate: bool = True,
    ) -> TriggerResponse:
        """Get a trigger by name, ID or prefix.

        Args:
            name_id_or_prefix: The name, ID or prefix of the trigger.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The trigger.
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_trigger,
            list_method=self.list_triggers,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
        )

    @_fail_for_sql_zen_store
    def list_triggers(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[datetime] = None,
        updated: Optional[datetime] = None,
        name: Optional[str] = None,
        event_source_id: Optional[UUID] = None,
        action_id: Optional[UUID] = None,
        event_source_flavor: Optional[str] = None,
        event_source_subtype: Optional[str] = None,
        action_flavor: Optional[str] = None,
        action_subtype: Optional[str] = None,
        workspace_id: Optional[Union[str, UUID]] = None,
        user_id: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[TriggerResponse]:
        """Lists all triggers.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of triggers to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            workspace_id: The id of the workspace to filter by.
            user_id: The id of the user to filter by.
            user: Filter by user name/ID.
            name: The name of the trigger to filter by.
            event_source_id: The event source associated with the trigger.
            action_id: The action associated with the trigger.
            event_source_flavor: Flavor of the event source associated with the
                trigger.
            event_source_subtype: Type of the event source associated with the
                trigger.
            action_flavor: Flavor of the action associated with the trigger.
            action_subtype: Type of the action associated with the trigger.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of triggers.
        """
        trigger_filter_model = TriggerFilter(
            page=page,
            size=size,
            sort_by=sort_by,
            logical_operator=logical_operator,
            workspace_id=workspace_id,
            user_id=user_id,
            user=user,
            name=name,
            event_source_id=event_source_id,
            action_id=action_id,
            event_source_flavor=event_source_flavor,
            event_source_subtype=event_source_subtype,
            action_flavor=action_flavor,
            action_subtype=action_subtype,
            id=id,
            created=created,
            updated=updated,
        )
        trigger_filter_model.set_scope_workspace(self.active_workspace.id)
        return self.zen_store.list_triggers(
            trigger_filter_model, hydrate=hydrate
        )

    @_fail_for_sql_zen_store
    def update_trigger(
        self,
        name_id_or_prefix: Union[UUID, str],
        name: Optional[str] = None,
        description: Optional[str] = None,
        event_filter: Optional[Dict[str, Any]] = None,
        is_active: Optional[bool] = None,
    ) -> TriggerResponse:
        """Updates a trigger.

        Args:
            name_id_or_prefix: The name, id or prefix of the trigger to update.
            name: the new name of the trigger.
            description: the new description of the trigger.
            event_filter: The event filter configuration.
            is_active: Whether the trigger is active or not.

        Returns:
            The model of the updated trigger.

        Raises:
            EntityExistsError: If the trigger name is already taken.
        """
        # First, get the trigger to update
        trigger = self.get_trigger(
            name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
        )

        # Create the update model
        update_model = TriggerUpdate(
            name=name,
            description=description,
            event_filter=event_filter,
            is_active=is_active,
        )

        if name:
            if self.list_triggers(name=name):
                raise EntityExistsError(
                    "There are already is an existing trigger with the name "
                    f"'{name}'."
                )

        updated_trigger = self.zen_store.update_trigger(
            trigger_id=trigger.id,
            trigger_update=update_model,
        )
        return updated_trigger

    @_fail_for_sql_zen_store
    def delete_trigger(self, name_id_or_prefix: Union[str, UUID]) -> None:
        """Deletes an trigger.

        Args:
            name_id_or_prefix: The name, id or prefix id of the trigger
                to deregister.
        """
        trigger = self.get_trigger(
            name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
        )

        self.zen_store.delete_trigger(trigger_id=trigger.id)
        logger.info("Deleted trigger with name '%s'.", trigger.name)

    # ------------------------------ Deployments -------------------------------

    def get_deployment(
        self,
        id_or_prefix: Union[str, UUID],
        hydrate: bool = True,
    ) -> PipelineDeploymentResponse:
        """Get a deployment by id or prefix.

        Args:
            id_or_prefix: The id or id prefix of the deployment.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The deployment.

        Raises:
            KeyError: If no deployment was found for the given id or prefix.
            ZenKeyError: If multiple deployments were found that match the given
                id or prefix.
        """
        from zenml.utils.uuid_utils import is_valid_uuid

        # First interpret as full UUID
        if is_valid_uuid(id_or_prefix):
            id_ = (
                UUID(id_or_prefix)
                if isinstance(id_or_prefix, str)
                else id_or_prefix
            )
            return self.zen_store.get_deployment(id_, hydrate=hydrate)

        entity = self.list_deployments(
            id=f"startswith:{id_or_prefix}",
            hydrate=hydrate,
        )

        # If only a single entity is found, return it.
        if entity.total == 1:
            return entity.items[0]

        # If no entity is found, raise an error.
        if entity.total == 0:
            raise KeyError(
                f"No deployment have been found that have either an id or "
                f"prefix that matches the provided string '{id_or_prefix}'."
            )

        raise ZenKeyError(
            f"{entity.total} deployments have been found that have "
            f"an ID that matches the provided "
            f"string '{id_or_prefix}':\n"
            f"{[entity.items]}.\n"
            f"Please use the id to uniquely identify "
            f"only one of the deployments."
        )

    def list_deployments(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        workspace_id: Optional[Union[str, UUID]] = None,
        user_id: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        pipeline_id: Optional[Union[str, UUID]] = None,
        stack_id: Optional[Union[str, UUID]] = None,
        build_id: Optional[Union[str, UUID]] = None,
        template_id: Optional[Union[str, UUID]] = None,
        hydrate: bool = False,
    ) -> Page[PipelineDeploymentResponse]:
        """List all deployments.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of the deployment to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            workspace_id: The id of the workspace to filter by.
            user_id: The id of the user to filter by.
            user: Filter by user name/ID.
            pipeline_id: The id of the pipeline to filter by.
            stack_id: The id of the stack to filter by.
            build_id: The id of the build to filter by.
            template_id: The ID of the template to filter by.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page with deployments fitting the filter description
        """
        deployment_filter_model = PipelineDeploymentFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            workspace_id=workspace_id,
            user_id=user_id,
            user=user,
            pipeline_id=pipeline_id,
            stack_id=stack_id,
            build_id=build_id,
            template_id=template_id,
        )
        deployment_filter_model.set_scope_workspace(self.active_workspace.id)
        return self.zen_store.list_deployments(
            deployment_filter_model=deployment_filter_model,
            hydrate=hydrate,
        )

    def delete_deployment(self, id_or_prefix: str) -> None:
        """Delete a deployment.

        Args:
            id_or_prefix: The id or id prefix of the deployment.
        """
        deployment = self.get_deployment(
            id_or_prefix=id_or_prefix, hydrate=False
        )
        self.zen_store.delete_deployment(deployment_id=deployment.id)

    # ------------------------------ Run templates -----------------------------

    def create_run_template(
        self,
        name: str,
        deployment_id: UUID,
        description: Optional[str] = None,
        tags: Optional[List[str]] = None,
    ) -> RunTemplateResponse:
        """Create a run template.

        Args:
            name: The name of the run template.
            deployment_id: ID of the deployment which this template should be
                based off of.
            description: The description of the run template.
            tags: Tags associated with the run template.

        Returns:
            The created run template.
        """
        return self.zen_store.create_run_template(
            template=RunTemplateRequest(
                name=name,
                description=description,
                source_deployment_id=deployment_id,
                tags=tags,
                user=self.active_user.id,
                workspace=self.active_workspace.id,
            )
        )
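
A sketch of creating a run template from an existing deployment and then running it from the server; the deployment ID prefix is a placeholder, and triggering requires a ZenML server rather than the local SQL store:

```python
from zenml.client import Client

client = Client()

# The deployment ID prefix is a placeholder; it typically comes from a
# previous run of the pipeline on a remote stack.
deployment = client.get_deployment("3a7c9f12")
template = client.create_run_template(
    name="nightly-training",
    deployment_id=deployment.id,
    tags=["nightly"],
)

# The template can later be run server-side.
run = client.trigger_pipeline(template_id=template.id)
```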

    def get_run_template(
        self,
        name_id_or_prefix: Union[str, UUID],
        hydrate: bool = True,
    ) -> RunTemplateResponse:
        """Get a run template.

        Args:
            name_id_or_prefix: Name/ID/ID prefix of the template to get.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The run template.
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_run_template,
            list_method=self.list_run_templates,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
            hydrate=hydrate,
        )

    def list_run_templates(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        id: Optional[Union[UUID, str]] = None,
        name: Optional[str] = None,
        tag: Optional[str] = None,
        workspace_id: Optional[Union[str, UUID]] = None,
        user_id: Optional[Union[str, UUID]] = None,
        pipeline_id: Optional[Union[str, UUID]] = None,
        build_id: Optional[Union[str, UUID]] = None,
        stack_id: Optional[Union[str, UUID]] = None,
        code_repository_id: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        pipeline: Optional[Union[UUID, str]] = None,
        stack: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[RunTemplateResponse]:
        """Get a page of run templates.

        Args:
            sort_by: The column to sort by.
            page: The page of items.
            size: The maximum size of all pages.
            logical_operator: Which logical operator to use [and, or].
            created: Filter by the creation date.
            updated: Filter by the last updated date.
            id: Filter by run template ID.
            name: Filter by run template name.
            tag: Filter by run template tags.
            workspace_id: Filter by workspace ID.
            user_id: Filter by user ID.
            pipeline_id: Filter by pipeline ID.
            build_id: Filter by build ID.
            stack_id: Filter by stack ID.
            code_repository_id: Filter by code repository ID.
            user: Filter by user name/ID.
            pipeline: Filter by pipeline name/ID.
            stack: Filter by stack name/ID.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of run templates.
        """
        filter = RunTemplateFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            created=created,
            updated=updated,
            id=id,
            name=name,
            tag=tag,
            workspace_id=workspace_id,
            user_id=user_id,
            pipeline_id=pipeline_id,
            build_id=build_id,
            stack_id=stack_id,
            code_repository_id=code_repository_id,
            user=user,
            pipeline=pipeline,
            stack=stack,
        )

        return self.zen_store.list_run_templates(
            template_filter_model=filter, hydrate=hydrate
        )

    def update_run_template(
        self,
        name_id_or_prefix: Union[str, UUID],
        name: Optional[str] = None,
        description: Optional[str] = None,
        add_tags: Optional[List[str]] = None,
        remove_tags: Optional[List[str]] = None,
    ) -> RunTemplateResponse:
        """Update a run template.

        Args:
            name_id_or_prefix: Name/ID/ID prefix of the template to update.
            name: The new name of the run template.
            description: The new description of the run template.
            add_tags: Tags to add to the run template.
            remove_tags: Tags to remove from the run template.

        Returns:
            The updated run template.
        """
        if is_valid_uuid(name_id_or_prefix):
            template_id = (
                UUID(name_id_or_prefix)
                if isinstance(name_id_or_prefix, str)
                else name_id_or_prefix
            )
        else:
            template_id = self.get_run_template(
                name_id_or_prefix, hydrate=False
            ).id

        return self.zen_store.update_run_template(
            template_id=template_id,
            template_update=RunTemplateUpdate(
                name=name,
                description=description,
                add_tags=add_tags,
                remove_tags=remove_tags,
            ),
        )

    def delete_run_template(self, name_id_or_prefix: Union[str, UUID]) -> None:
        """Delete a run template.

        Args:
            name_id_or_prefix: Name/ID/ID prefix of the template to delete.
        """
        if is_valid_uuid(name_id_or_prefix):
            template_id = (
                UUID(name_id_or_prefix)
                if isinstance(name_id_or_prefix, str)
                else name_id_or_prefix
            )
        else:
            template_id = self.get_run_template(
                name_id_or_prefix, hydrate=False
            ).id

        self.zen_store.delete_run_template(template_id=template_id)

    # ------------------------------- Schedules --------------------------------

    def get_schedule(
        self,
        name_id_or_prefix: Union[str, UUID],
        allow_name_prefix_match: bool = True,
        hydrate: bool = True,
    ) -> ScheduleResponse:
        """Get a schedule by name, id or prefix.

        Args:
            name_id_or_prefix: The name, id or prefix of the schedule.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The schedule.
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_schedule,
            list_method=self.list_schedules,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
        )

    def list_schedules(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        workspace_id: Optional[Union[str, UUID]] = None,
        user_id: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        pipeline_id: Optional[Union[str, UUID]] = None,
        orchestrator_id: Optional[Union[str, UUID]] = None,
        active: Optional[Union[str, bool]] = None,
        cron_expression: Optional[str] = None,
        start_time: Optional[Union[datetime, str]] = None,
        end_time: Optional[Union[datetime, str]] = None,
        interval_second: Optional[int] = None,
        catchup: Optional[Union[str, bool]] = None,
        hydrate: bool = False,
        run_once_start_time: Optional[Union[datetime, str]] = None,
    ) -> Page[ScheduleResponse]:
        """List schedules.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of schedules to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            name: The name of the schedule to filter by.
            workspace_id: The id of the workspace to filter by.
            user_id: The id of the user to filter by.
            user: Filter by user name/ID.
            pipeline_id: The id of the pipeline to filter by.
            orchestrator_id: The id of the orchestrator to filter by.
            active: Use to filter by active status.
            cron_expression: Use to filter by cron expression.
            start_time: Use to filter by start time.
            end_time: Use to filter by end time.
            interval_second: Use to filter by interval second.
            catchup: Use to filter by catchup.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.
            run_once_start_time: Use to filter by run once start time.

        Returns:
            A list of schedules.
        """
        schedule_filter_model = ScheduleFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            name=name,
            workspace_id=workspace_id,
            user_id=user_id,
            user=user,
            pipeline_id=pipeline_id,
            orchestrator_id=orchestrator_id,
            active=active,
            cron_expression=cron_expression,
            start_time=start_time,
            end_time=end_time,
            interval_second=interval_second,
            catchup=catchup,
            run_once_start_time=run_once_start_time,
        )
        schedule_filter_model.set_scope_workspace(self.active_workspace.id)
        return self.zen_store.list_schedules(
            schedule_filter_model=schedule_filter_model,
            hydrate=hydrate,
        )
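
For instance, listing the active schedules of a single pipeline might look like the following sketch; the pipeline ID is a placeholder:

```python
from zenml.client import Client

client = Client()

# The pipeline ID below is a placeholder.
schedules = client.list_schedules(
    pipeline_id="3a7c9f12-0000-0000-0000-000000000000",
    active=True,
)
for schedule in schedules.items:
    print(schedule.name, schedule.cron_expression)
```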

    def delete_schedule(self, name_id_or_prefix: Union[str, UUID]) -> None:
        """Delete a schedule.

        Args:
            name_id_or_prefix: The name, id or prefix id of the schedule
                to delete.
        """
        schedule = self.get_schedule(
            name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
        )
        logger.warning(
            f"Deleting schedule '{name_id_or_prefix}'... This will only delete "
            "the reference of the schedule from ZenML. Please make sure to "
            "manually stop/delete this schedule in your orchestrator as well!"
        )
        self.zen_store.delete_schedule(schedule_id=schedule.id)
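
    # Usage sketch (illustrative): deleting a schedule by name only removes the
    # ZenML reference; the corresponding orchestrator schedule has to be
    # stopped or deleted manually. The schedule name is hypothetical.
    #
    #     from zenml.client import Client
    #
    #     Client().delete_schedule("nightly-training")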

    # ----------------------------- Pipeline runs ------------------------------

    def get_pipeline_run(
        self,
        name_id_or_prefix: Union[str, UUID],
        allow_name_prefix_match: bool = True,
        hydrate: bool = True,
    ) -> PipelineRunResponse:
        """Gets a pipeline run by name, ID, or prefix.

        Args:
            name_id_or_prefix: Name, ID, or prefix of the pipeline run.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The pipeline run.
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_run,
            list_method=self.list_pipeline_runs,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
        )
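
    # Usage sketch (illustrative): a pipeline run can be fetched by full name,
    # ID, or an ID prefix; the run name below is hypothetical.
    #
    #     from zenml.client import Client
    #
    #     run = Client().get_pipeline_run("training_pipeline-2024_06_01-12_00_00")
    #     print(run.status)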

    def list_pipeline_runs(
        self,
        sort_by: str = "desc:created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        workspace_id: Optional[Union[str, UUID]] = None,
        pipeline_id: Optional[Union[str, UUID]] = None,
        pipeline_name: Optional[str] = None,
        user_id: Optional[Union[str, UUID]] = None,
        stack_id: Optional[Union[str, UUID]] = None,
        schedule_id: Optional[Union[str, UUID]] = None,
        build_id: Optional[Union[str, UUID]] = None,
        deployment_id: Optional[Union[str, UUID]] = None,
        code_repository_id: Optional[Union[str, UUID]] = None,
        template_id: Optional[Union[str, UUID]] = None,
        model_version_id: Optional[Union[str, UUID]] = None,
        orchestrator_run_id: Optional[str] = None,
        status: Optional[str] = None,
        start_time: Optional[Union[datetime, str]] = None,
        end_time: Optional[Union[datetime, str]] = None,
        num_steps: Optional[Union[int, str]] = None,
        unlisted: Optional[bool] = None,
        templatable: Optional[bool] = None,
        tag: Optional[str] = None,
        user: Optional[Union[UUID, str]] = None,
        run_metadata: Optional[Dict[str, Any]] = None,
        pipeline: Optional[Union[UUID, str]] = None,
        code_repository: Optional[Union[UUID, str]] = None,
        model: Optional[Union[UUID, str]] = None,
        stack: Optional[Union[UUID, str]] = None,
        stack_component: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[PipelineRunResponse]:
        """List all pipeline runs.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: The id of the runs to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            workspace_id: The id of the workspace to filter by.
            pipeline_id: The id of the pipeline to filter by.
            pipeline_name: DEPRECATED. Use `pipeline` instead to filter by
                pipeline name.
            user_id: The id of the user to filter by.
            stack_id: The id of the stack to filter by.
            schedule_id: The id of the schedule to filter by.
            build_id: The id of the build to filter by.
            deployment_id: The id of the deployment to filter by.
            code_repository_id: The id of the code repository to filter by.
            template_id: The ID of the template to filter by.
            model_version_id: The ID of the model version to filter by.
            orchestrator_run_id: The run id of the orchestrator to filter by.
            name: The name of the run to filter by.
            status: The status of the pipeline run
            start_time: The start_time for the pipeline run
            end_time: The end_time for the pipeline run
            num_steps: The number of steps for the pipeline run
            unlisted: If the runs should be unlisted or not.
            templatable: If the runs should be templatable or not.
            tag: Tag to filter by.
            user: The name/ID of the user to filter by.
            run_metadata: The run_metadata of the run to filter by.
            pipeline: The name/ID of the pipeline to filter by.
            code_repository: Filter by code repository name/ID.
            model: Filter by model name/ID.
            stack: Filter by stack name/ID.
            stack_component: Filter by stack component name/ID.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of pipeline runs fitting the filter description
        """
        runs_filter_model = PipelineRunFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            name=name,
            workspace_id=workspace_id,
            pipeline_id=pipeline_id,
            pipeline_name=pipeline_name,
            schedule_id=schedule_id,
            build_id=build_id,
            deployment_id=deployment_id,
            code_repository_id=code_repository_id,
            template_id=template_id,
            model_version_id=model_version_id,
            orchestrator_run_id=orchestrator_run_id,
            user_id=user_id,
            stack_id=stack_id,
            status=status,
            start_time=start_time,
            end_time=end_time,
            num_steps=num_steps,
            tag=tag,
            unlisted=unlisted,
            user=user,
            run_metadata=run_metadata,
            pipeline=pipeline,
            code_repository=code_repository,
            stack=stack,
            model=model,
            stack_component=stack_component,
            templatable=templatable,
        )
        runs_filter_model.set_scope_workspace(self.active_workspace.id)
        return self.zen_store.list_runs(
            runs_filter_model=runs_filter_model,
            hydrate=hydrate,
        )
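
    # Usage sketch (illustrative): listing the ten most recent completed runs
    # of a hypothetical pipeline, newest first.
    #
    #     from zenml.client import Client
    #
    #     runs = Client().list_pipeline_runs(
    #         pipeline="training_pipeline",
    #         status="completed",
    #         sort_by="desc:created",
    #         size=10,
    #     )
    #     for run in runs.items:
    #         print(run.name, run.status)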

    def delete_pipeline_run(
        self,
        name_id_or_prefix: Union[str, UUID],
    ) -> None:
        """Deletes a pipeline run.

        Args:
            name_id_or_prefix: Name, ID, or prefix of the pipeline run.
        """
        run = self.get_pipeline_run(
            name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
        )
        self.zen_store.delete_run(run_id=run.id)

    # -------------------------------- Step run --------------------------------

    def get_run_step(
        self,
        step_run_id: UUID,
        hydrate: bool = True,
    ) -> StepRunResponse:
        """Get a step run by ID.

        Args:
            step_run_id: The ID of the step run to get.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The step run.
        """
        return self.zen_store.get_run_step(
            step_run_id,
            hydrate=hydrate,
        )

    def list_run_steps(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        cache_key: Optional[str] = None,
        code_hash: Optional[str] = None,
        status: Optional[str] = None,
        start_time: Optional[Union[datetime, str]] = None,
        end_time: Optional[Union[datetime, str]] = None,
        pipeline_run_id: Optional[Union[str, UUID]] = None,
        deployment_id: Optional[Union[str, UUID]] = None,
        original_step_run_id: Optional[Union[str, UUID]] = None,
        workspace_id: Optional[Union[str, UUID]] = None,
        user_id: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        model_version_id: Optional[Union[str, UUID]] = None,
        model: Optional[Union[UUID, str]] = None,
        run_metadata: Optional[Dict[str, Any]] = None,
        hydrate: bool = False,
    ) -> Page[StepRunResponse]:
        """List all pipelines.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of runs to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            start_time: Use to filter by the time when the step started running
            end_time: Use to filter by the time when the step finished running
            workspace_id: The id of the workspace to filter by.
            user_id: The id of the user to filter by.
            user: Filter by user name/ID.
            pipeline_run_id: The id of the pipeline run to filter by.
            deployment_id: The id of the deployment to filter by.
            original_step_run_id: The id of the original step run to filter by.
            model_version_id: The ID of the model version to filter by.
            model: Filter by model name/ID.
            name: The name of the step run to filter by.
            cache_key: The cache key of the step run to filter by.
            code_hash: The code hash of the step run to filter by.
            status: The status of the step run to filter by.
            run_metadata: Filter by run metadata.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of step runs fitting the filter description
        """
        step_run_filter_model = StepRunFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            cache_key=cache_key,
            code_hash=code_hash,
            pipeline_run_id=pipeline_run_id,
            deployment_id=deployment_id,
            original_step_run_id=original_step_run_id,
            status=status,
            created=created,
            updated=updated,
            start_time=start_time,
            end_time=end_time,
            name=name,
            workspace_id=workspace_id,
            user_id=user_id,
            user=user,
            model_version_id=model_version_id,
            model=model,
            run_metadata=run_metadata,
        )
        step_run_filter_model.set_scope_workspace(self.active_workspace.id)
        return self.zen_store.list_run_steps(
            step_run_filter_model=step_run_filter_model,
            hydrate=hydrate,
        )
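
    # Usage sketch (illustrative): listing the failed step runs of a single
    # pipeline run, assuming `run` is a previously fetched PipelineRunResponse.
    #
    #     from zenml.client import Client
    #
    #     steps = Client().list_run_steps(pipeline_run_id=run.id, status="failed")
    #     for step in steps.items:
    #         print(step.name, step.status)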

    # ------------------------------- Artifacts -------------------------------

    def get_artifact(
        self,
        name_id_or_prefix: Union[str, UUID],
        hydrate: bool = False,
    ) -> ArtifactResponse:
        """Get an artifact by name, id or prefix.

        Args:
            name_id_or_prefix: The name, ID or prefix of the artifact to get.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The artifact.
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_artifact,
            list_method=self.list_artifacts,
            name_id_or_prefix=name_id_or_prefix,
            hydrate=hydrate,
        )

    def list_artifacts(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        has_custom_name: Optional[bool] = None,
        hydrate: bool = False,
        tag: Optional[str] = None,
    ) -> Page[ArtifactResponse]:
        """Get a list of artifacts.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of artifact to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            name: The name of the artifact to filter by.
            has_custom_name: Filter artifacts with/without custom names.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.
            tag: Filter artifacts by tag.

        Returns:
            A page of artifacts.
        """
        artifact_filter_model = ArtifactFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            name=name,
            has_custom_name=has_custom_name,
            tag=tag,
        )
        return self.zen_store.list_artifacts(
            artifact_filter_model,
            hydrate=hydrate,
        )
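
    # Usage sketch (illustrative): listing artifacts by a hypothetical tag.
    #
    #     from zenml.client import Client
    #
    #     artifacts = Client().list_artifacts(tag="production")
    #     for artifact in artifacts.items:
    #         print(artifact.name)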

    def update_artifact(
        self,
        name_id_or_prefix: Union[str, UUID],
        new_name: Optional[str] = None,
        add_tags: Optional[List[str]] = None,
        remove_tags: Optional[List[str]] = None,
        has_custom_name: Optional[bool] = None,
    ) -> ArtifactResponse:
        """Update an artifact.

        Args:
            name_id_or_prefix: The name, ID or prefix of the artifact to update.
            new_name: The new name of the artifact.
            add_tags: Tags to add to the artifact.
            remove_tags: Tags to remove from the artifact.
            has_custom_name: Whether the artifact has a custom name.

        Returns:
            The updated artifact.
        """
        artifact = self.get_artifact(name_id_or_prefix=name_id_or_prefix)
        artifact_update = ArtifactUpdate(
            name=new_name,
            add_tags=add_tags,
            remove_tags=remove_tags,
            has_custom_name=has_custom_name,
        )
        return self.zen_store.update_artifact(
            artifact_id=artifact.id, artifact_update=artifact_update
        )
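
    # Usage sketch (illustrative): renaming an artifact and adjusting its tags;
    # all names and tags below are hypothetical.
    #
    #     from zenml.client import Client
    #
    #     Client().update_artifact(
    #         name_id_or_prefix="raw_dataset",
    #         new_name="training_dataset",
    #         add_tags=["curated"],
    #         remove_tags=["raw"],
    #     )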

    def delete_artifact(
        self,
        name_id_or_prefix: Union[str, UUID],
    ) -> None:
        """Delete an artifact.

        Args:
            name_id_or_prefix: The name, ID or prefix of the artifact to delete.
        """
        artifact = self.get_artifact(name_id_or_prefix=name_id_or_prefix)
        self.zen_store.delete_artifact(artifact_id=artifact.id)
        logger.info(f"Deleted artifact '{artifact.name}'.")

    def prune_artifacts(
        self,
        only_versions: bool = True,
        delete_from_artifact_store: bool = False,
    ) -> None:
        """Delete all unused artifacts and artifact versions.

        Args:
            only_versions: Only delete artifact versions, keeping the artifacts
                themselves.
            delete_from_artifact_store: Also delete the underlying artifact
                objects from the artifact store.
        """
        if delete_from_artifact_store:
            unused_artifact_versions = depaginate(
                self.list_artifact_versions, only_unused=True
            )
            for unused_artifact_version in unused_artifact_versions:
                self._delete_artifact_from_artifact_store(
                    unused_artifact_version
                )

        self.zen_store.prune_artifact_versions(only_versions)
        logger.info("All unused artifacts and artifact versions deleted.")
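
    # Usage sketch (illustrative): pruning all unused artifact versions and
    # also removing the underlying objects from the artifact store.
    #
    #     from zenml.client import Client
    #
    #     Client().prune_artifacts(
    #         only_versions=True, delete_from_artifact_store=True
    #     )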

    # --------------------------- Artifact Versions ---------------------------

    def get_artifact_version(
        self,
        name_id_or_prefix: Union[str, UUID],
        version: Optional[str] = None,
        hydrate: bool = True,
    ) -> ArtifactVersionResponse:
        """Get an artifact version by ID or artifact name.

        Args:
            name_id_or_prefix: Either the ID of the artifact version or the
                name of the artifact.
            version: The version of the artifact to get. Only used if
                `name_id_or_prefix` is the name of the artifact. If not
                specified, the latest version is returned.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The artifact version.
        """
        from zenml import get_step_context

        if cll := client_lazy_loader(
            method_name="get_artifact_version",
            name_id_or_prefix=name_id_or_prefix,
            version=version,
            hydrate=hydrate,
        ):
            return cll  # type: ignore[return-value]

        artifact = self._get_entity_version_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_artifact_version,
            list_method=self.list_artifact_versions,
            name_id_or_prefix=name_id_or_prefix,
            version=version,
            hydrate=hydrate,
        )
        try:
            step_run = get_step_context().step_run
            client = Client()
            client.zen_store.update_run_step(
                step_run_id=step_run.id,
                step_run_update=StepRunUpdate(
                    loaded_artifact_versions={artifact.name: artifact.id}
                ),
            )
        except RuntimeError:
            pass  # Cannot link to step run if called outside a step
        return artifact
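
    # Usage sketch (illustrative): fetching a specific version of an artifact
    # by name; the artifact name is hypothetical, and the `load()` call assumes
    # the response exposes a loader for the stored object.
    #
    #     from zenml.client import Client
    #
    #     artifact_version = Client().get_artifact_version(
    #         "training_dataset", version="3"
    #     )
    #     data = artifact_version.load()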

    def list_artifact_versions(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        artifact_id: Optional[Union[str, UUID]] = None,
        name: Optional[str] = None,
        version: Optional[Union[str, int]] = None,
        version_number: Optional[int] = None,
        artifact_store_id: Optional[Union[str, UUID]] = None,
        type: Optional[ArtifactType] = None,
        data_type: Optional[str] = None,
        uri: Optional[str] = None,
        materializer: Optional[str] = None,
        workspace_id: Optional[Union[str, UUID]] = None,
        user_id: Optional[Union[str, UUID]] = None,
        model_version_id: Optional[Union[str, UUID]] = None,
        only_unused: Optional[bool] = False,
        has_custom_name: Optional[bool] = None,
        user: Optional[Union[UUID, str]] = None,
        model: Optional[Union[UUID, str]] = None,
        pipeline_run: Optional[Union[UUID, str]] = None,
        run_metadata: Optional[Dict[str, Any]] = None,
        tag: Optional[str] = None,
        hydrate: bool = False,
    ) -> Page[ArtifactVersionResponse]:
        """Get a list of artifact versions.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of artifact version to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            artifact_id: The id of the artifact to filter by.
            name: The name of the artifact to filter by.
            version: The version of the artifact to filter by.
            version_number: The version number of the artifact to filter by.
            artifact_store_id: The id of the artifact store to filter by.
            type: The type of the artifact to filter by.
            data_type: The data type of the artifact to filter by.
            uri: The uri of the artifact to filter by.
            materializer: The materializer of the artifact to filter by.
            workspace_id: The id of the workspace to filter by.
            user_id: The id of the user to filter by.
            model_version_id: Filter by model version ID.
            only_unused: Only return artifact versions that are not used in
                any pipeline runs.
            has_custom_name: Filter artifacts with/without custom names.
            tag: A tag to filter by.
            user: Filter by user name or ID.
            model: Filter by model name or ID.
            pipeline_run: Filter by pipeline run name or ID.
            run_metadata: Filter by run metadata.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of artifact versions.
        """
        artifact_version_filter_model = ArtifactVersionFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            artifact_id=artifact_id,
            name=name,
            version=str(version) if version else None,
            version_number=version_number,
            artifact_store_id=artifact_store_id,
            type=type,
            data_type=data_type,
            uri=uri,
            materializer=materializer,
            workspace_id=workspace_id,
            user_id=user_id,
            model_version_id=model_version_id,
            only_unused=only_unused,
            has_custom_name=has_custom_name,
            tag=tag,
            user=user,
            model=model,
            pipeline_run=pipeline_run,
            run_metadata=run_metadata,
        )
        artifact_version_filter_model.set_scope_workspace(
            self.active_workspace.id
        )
        return self.zen_store.list_artifact_versions(
            artifact_version_filter_model,
            hydrate=hydrate,
        )
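
    # Usage sketch (illustrative): listing artifact versions that are not used
    # by any pipeline run, e.g. as candidates for cleanup.
    #
    #     from zenml.client import Client
    #
    #     unused = Client().list_artifact_versions(only_unused=True)
    #     for version in unused.items:
    #         print(version.artifact.name, version.version)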

    def update_artifact_version(
        self,
        name_id_or_prefix: Union[str, UUID],
        version: Optional[str] = None,
        add_tags: Optional[List[str]] = None,
        remove_tags: Optional[List[str]] = None,
    ) -> ArtifactVersionResponse:
        """Update an artifact version.

        Args:
            name_id_or_prefix: The name, ID or prefix of the artifact to update.
            version: The version of the artifact to update. Only used if
                `name_id_or_prefix` is the name of the artifact. If not
                specified, the latest version is updated.
            add_tags: Tags to add to the artifact version.
            remove_tags: Tags to remove from the artifact version.

        Returns:
            The updated artifact version.
        """
        artifact_version = self.get_artifact_version(
            name_id_or_prefix=name_id_or_prefix,
            version=version,
        )
        artifact_version_update = ArtifactVersionUpdate(
            add_tags=add_tags, remove_tags=remove_tags
        )
        return self.zen_store.update_artifact_version(
            artifact_version_id=artifact_version.id,
            artifact_version_update=artifact_version_update,
        )

    def delete_artifact_version(
        self,
        name_id_or_prefix: Union[str, UUID],
        version: Optional[str] = None,
        delete_metadata: bool = True,
        delete_from_artifact_store: bool = False,
    ) -> None:
        """Delete an artifact version.

        By default, this will delete only the metadata of the artifact from the
        database, not the actual object stored in the artifact store.

        Args:
            name_id_or_prefix: The ID of the artifact version, or the name or
                prefix of the artifact whose version should be deleted.
            version: The version of the artifact to delete.
            delete_metadata: If True, delete the metadata of the artifact
                version from the database.
            delete_from_artifact_store: If True, delete the artifact object
                itself from the artifact store.
        """
        artifact_version = self.get_artifact_version(
            name_id_or_prefix=name_id_or_prefix, version=version
        )
        if delete_from_artifact_store:
            self._delete_artifact_from_artifact_store(
                artifact_version=artifact_version
            )
        if delete_metadata:
            self._delete_artifact_version(artifact_version=artifact_version)
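
    # Usage sketch (illustrative): deleting both the metadata and the stored
    # object of a hypothetical artifact version.
    #
    #     from zenml.client import Client
    #
    #     Client().delete_artifact_version(
    #         name_id_or_prefix="training_dataset",
    #         version="2",
    #         delete_metadata=True,
    #         delete_from_artifact_store=True,
    #     )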

    def _delete_artifact_version(
        self, artifact_version: ArtifactVersionResponse
    ) -> None:
        """Delete the metadata of an artifact version from the database.

        Args:
            artifact_version: The artifact version to delete.

        Raises:
            ValueError: If the artifact version is still used in any runs.
        """
        if artifact_version not in depaginate(
            self.list_artifact_versions, only_unused=True
        ):
            raise ValueError(
                "The metadata of artifact versions that are used in runs "
                "cannot be deleted. Please delete all runs that use this "
                "artifact first."
            )
        self.zen_store.delete_artifact_version(artifact_version.id)
        logger.info(
            f"Deleted version '{artifact_version.version}' of artifact "
            f"'{artifact_version.artifact.name}'."
        )

    def _delete_artifact_from_artifact_store(
        self, artifact_version: ArtifactVersionResponse
    ) -> None:
        """Delete an artifact object from the artifact store.

        Args:
            artifact_version: The artifact version to delete.

        Raises:
            Exception: If the artifact store is inaccessible.
        """
        from zenml.artifact_stores.base_artifact_store import BaseArtifactStore
        from zenml.stack.stack_component import StackComponent

        if not artifact_version.artifact_store_id:
            logger.warning(
                f"Artifact '{artifact_version.uri}' does not have an artifact "
                "store associated with it. Skipping deletion from artifact "
                "store."
            )
            return
        try:
            artifact_store_model = self.get_stack_component(
                component_type=StackComponentType.ARTIFACT_STORE,
                name_id_or_prefix=artifact_version.artifact_store_id,
            )
            artifact_store = StackComponent.from_model(artifact_store_model)
            assert isinstance(artifact_store, BaseArtifactStore)
            artifact_store.rmtree(artifact_version.uri)
        except Exception as e:
            logger.error(
                f"Failed to delete artifact '{artifact_version.uri}' from the "
                "artifact store. This might happen if your local client "
                "does not have access to the artifact store or does not "
                "have the required integrations installed. Full error: "
                f"{e}"
            )
            raise e
        else:
            logger.info(
                f"Deleted artifact '{artifact_version.uri}' from the artifact "
                "store."
            )

    # ------------------------------ Run Metadata ------------------------------

    def create_run_metadata(
        self,
        metadata: Dict[str, "MetadataType"],
        resources: List[RunMetadataResource],
        stack_component_id: Optional[UUID] = None,
        publisher_step_id: Optional[UUID] = None,
    ) -> None:
        """Create run metadata.

        Args:
            metadata: The metadata to create as a dictionary of key-value pairs.
            resources: The list of IDs and types of the resources for which the
                metadata was produced.
            stack_component_id: The ID of the stack component that produced
                the metadata.
            publisher_step_id: The ID of the step execution that publishes
                this metadata automatically.
        """
        from zenml.metadata.metadata_types import get_metadata_type

        values: Dict[str, "MetadataType"] = {}
        types: Dict[str, "MetadataTypeEnum"] = {}
        for key, value in metadata.items():
            # Skip metadata that is too large to be stored in the database.
            if len(json.dumps(value)) > TEXT_FIELD_MAX_LENGTH:
                logger.warning(
                    f"Metadata value for key '{key}' is too large to be "
                    "stored in the database. Skipping."
                )
                continue
            # Skip metadata that is not of a supported type.
            try:
                metadata_type = get_metadata_type(value)
            except ValueError as e:
                logger.warning(
                    f"Metadata value for key '{key}' is not of a supported "
                    f"type. Skipping. Full error: {e}"
                )
                continue
            values[key] = value
            types[key] = metadata_type

        run_metadata = RunMetadataRequest(
            workspace=self.active_workspace.id,
            user=self.active_user.id,
            resources=resources,
            stack_component_id=stack_component_id,
            publisher_step_id=publisher_step_id,
            values=values,
            types=types,
        )
        self.zen_store.create_run_metadata(run_metadata)
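
    # Usage sketch (illustrative): attaching custom metadata to a pipeline run.
    # The RunMetadataResource fields and the MetadataResourceTypes enum used
    # below are assumptions for illustration; `run` is a previously fetched
    # PipelineRunResponse.
    #
    #     from zenml.client import Client
    #     from zenml.enums import MetadataResourceTypes
    #     from zenml.models import RunMetadataResource
    #
    #     Client().create_run_metadata(
    #         metadata={"accuracy": 0.93, "dataset": "v2"},
    #         resources=[
    #             RunMetadataResource(
    #                 id=run.id, type=MetadataResourceTypes.PIPELINE_RUN
    #             )
    #         ],
    #     )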

    # -------------------------------- Secrets ---------------------------------

    def create_secret(
        self,
        name: str,
        values: Dict[str, str],
        scope: SecretScope = SecretScope.WORKSPACE,
    ) -> SecretResponse:
        """Creates a new secret.

        Args:
            name: The name of the secret.
            values: The values of the secret.
            scope: The scope of the secret.

        Returns:
            The created secret (in model form).

        Raises:
            NotImplementedError: If centralized secrets management is not
                enabled.
        """
        create_secret_request = SecretRequest(
            name=name,
            values=values,
            scope=scope,
            user=self.active_user.id,
            workspace=self.active_workspace.id,
        )
        try:
            return self.zen_store.create_secret(secret=create_secret_request)
        except NotImplementedError:
            raise NotImplementedError(
                "centralized secrets management is not supported or explicitly "
                "disabled in the target ZenML deployment."
            )
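
    # Usage sketch (illustrative): creating a workspace-scoped secret; the
    # secret name and values are hypothetical.
    #
    #     from zenml.client import Client
    #     from zenml.enums import SecretScope
    #
    #     Client().create_secret(
    #         name="aws_credentials",
    #         values={"aws_access_key_id": "...", "aws_secret_access_key": "..."},
    #         scope=SecretScope.WORKSPACE,
    #     )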

    def get_secret(
        self,
        name_id_or_prefix: Union[str, UUID],
        scope: Optional[SecretScope] = None,
        allow_partial_name_match: bool = True,
        allow_partial_id_match: bool = True,
        hydrate: bool = True,
    ) -> SecretResponse:
        """Get a secret.

        Get a secret identified by its name, its ID, or a prefix of either,
        optionally restricted to a scope.

        If a scope is not provided, the secret is searched for in all scopes,
        starting with the innermost scope (user) and moving to the outermost
        scope (workspace). When a name or prefix is used instead of a UUID
        value, each scope is first searched for an exact match, then for an ID
        prefix or name substring match, before moving on to the next scope.

        Args:
            name_id_or_prefix: The name, ID or prefix to the id of the secret
                to get.
            scope: The scope of the secret. If not set, all scopes will be
                searched, starting with the innermost scope (user) and moving
                to the outermost scope (workspace), until a secret is found.
            allow_partial_name_match: If True, allow partial name matches.
            allow_partial_id_match: If True, allow partial ID matches.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The secret.

        Raises:
            KeyError: If no secret is found.
            ZenKeyError: If multiple secrets are found.
            NotImplementedError: If centralized secrets management is not
                enabled.
        """
        from zenml.utils.uuid_utils import is_valid_uuid

        try:
            # First interpret as full UUID
            if is_valid_uuid(name_id_or_prefix):
                # Fetch by ID; filter by scope if provided
                secret = self.zen_store.get_secret(
                    secret_id=UUID(name_id_or_prefix)
                    if isinstance(name_id_or_prefix, str)
                    else name_id_or_prefix,
                    hydrate=hydrate,
                )
                if scope is not None and secret.scope != scope:
                    raise KeyError(
                        f"No secret found with ID {str(name_id_or_prefix)}"
                    )

                return secret
        except NotImplementedError:
            raise NotImplementedError(
                "centralized secrets management is not supported or explicitly "
                "disabled in the target ZenML deployment."
            )

        # If not a UUID, try to find by name and then by prefix
        assert not isinstance(name_id_or_prefix, UUID)

        # Scopes to search in order of priority
        search_scopes = (
            [SecretScope.USER, SecretScope.WORKSPACE]
            if scope is None
            else [scope]
        )

        secrets = self.list_secrets(
            logical_operator=LogicalOperators.OR,
            name=f"contains:{name_id_or_prefix}"
            if allow_partial_name_match
            else f"equals:{name_id_or_prefix}",
            id=f"startswith:{name_id_or_prefix}"
            if allow_partial_id_match
            else None,
            hydrate=hydrate,
        )

        for search_scope in search_scopes:
            partial_matches: List[SecretResponse] = []
            for secret in secrets.items:
                if secret.scope != search_scope:
                    continue
                # Exact match
                if secret.name == name_id_or_prefix:
                    # Need to fetch the secret again to get the secret values
                    return self.zen_store.get_secret(
                        secret_id=secret.id,
                        hydrate=hydrate,
                    )
                # Partial match
                partial_matches.append(secret)

            if len(partial_matches) > 1:
                match_summary = "\n".join(
                    [
                        f"[{secret.id}]: name = {secret.name}"
                        for secret in partial_matches
                    ]
                )
                raise ZenKeyError(
                    f"{len(partial_matches)} secrets have been found that have "
                    f"a name or ID that matches the provided "
                    f"string '{name_id_or_prefix}':\n"
                    f"{match_summary}.\n"
                    f"Please use the id to uniquely identify "
                    f"only one of the secrets."
                )

            # If only a single secret is found, return it
            if len(partial_matches) == 1:
                # Need to fetch the secret again to get the secret values
                return self.zen_store.get_secret(
                    secret_id=partial_matches[0].id,
                    hydrate=hydrate,
                )

        msg = f"No secret found with name, ID or prefix '{name_id_or_prefix}'"
        if scope is not None:
            msg += f" in scope '{scope}'"

        raise KeyError(msg)
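
    # Usage sketch (illustrative): fetching a secret by name and reading one of
    # its values; the name and key are hypothetical, and the `secret_values`
    # accessor is an assumption about the response model.
    #
    #     from zenml.client import Client
    #
    #     secret = Client().get_secret("aws_credentials")
    #     access_key = secret.secret_values["aws_access_key_id"]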

    def list_secrets(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[datetime] = None,
        updated: Optional[datetime] = None,
        name: Optional[str] = None,
        scope: Optional[SecretScope] = None,
        workspace_id: Optional[Union[str, UUID]] = None,
        user_id: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[SecretResponse]:
        """Fetches all the secret models.

        The returned secrets do not contain the secret values. To get the
        secret values, use `get_secret` individually for each secret.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of secrets to filter by.
            created: Use to filter secrets by time of creation
            updated: Use the last updated date for filtering
            name: The name of the secret to filter by.
            scope: The scope of the secret to filter by.
            workspace_id: The id of the workspace to filter by.
            user_id: The id of the user to filter by.
            user: Filter by user name/ID.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of secret models, without the secret values.

        Raises:
            NotImplementedError: If centralized secrets management is not
                enabled.
        """
        secret_filter_model = SecretFilter(
            page=page,
            size=size,
            sort_by=sort_by,
            logical_operator=logical_operator,
            user_id=user_id,
            user=user,
            workspace_id=workspace_id,
            name=name,
            scope=scope,
            id=id,
            created=created,
            updated=updated,
        )
        secret_filter_model.set_scope_workspace(self.active_workspace.id)
        try:
            return self.zen_store.list_secrets(
                secret_filter_model=secret_filter_model,
                hydrate=hydrate,
            )
        except NotImplementedError:
            raise NotImplementedError(
                "centralized secrets management is not supported or explicitly "
                "disabled in the target ZenML deployment."
            )

    def update_secret(
        self,
        name_id_or_prefix: Union[str, UUID],
        scope: Optional[SecretScope] = None,
        new_name: Optional[str] = None,
        new_scope: Optional[SecretScope] = None,
        add_or_update_values: Optional[Dict[str, str]] = None,
        remove_values: Optional[List[str]] = None,
    ) -> SecretResponse:
        """Updates a secret.

        Args:
            name_id_or_prefix: The name, id or prefix of the id for the
                secret to update.
            scope: The scope of the secret to update.
            new_name: The new name of the secret.
            new_scope: The new scope of the secret.
            add_or_update_values: The values to add or update.
            remove_values: The values to remove.

        Returns:
            The updated secret.

        Raises:
            KeyError: If trying to remove a value that doesn't exist.
            ValueError: If a key is provided in both add_or_update_values and
                remove_values.
        """
        secret = self.get_secret(
            name_id_or_prefix=name_id_or_prefix,
            scope=scope,
            # Don't allow partial name matches, but allow partial ID matches
            allow_partial_name_match=False,
            allow_partial_id_match=True,
            hydrate=True,
        )

        secret_update = SecretUpdate(name=new_name or secret.name)

        if new_scope:
            secret_update.scope = new_scope
        values: Dict[str, Optional[SecretStr]] = {}
        if add_or_update_values:
            values.update(
                {
                    key: SecretStr(value)
                    for key, value in add_or_update_values.items()
                }
            )
        if remove_values:
            for key in remove_values:
                if key not in secret.values:
                    raise KeyError(
                        f"Cannot remove value '{key}' from secret "
                        f"'{secret.name}' because it does not exist."
                    )
                if key in values:
                    raise ValueError(
                        f"Key '{key}' is supplied both in the values to add or "
                        f"update and the values to be removed."
                    )
                values[key] = None
        if values:
            secret_update.values = values

        return Client().zen_store.update_secret(
            secret_id=secret.id, secret_update=secret_update
        )
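
    # Usage sketch (illustrative): rotating one value of a secret and removing
    # another; all keys and names are hypothetical.
    #
    #     from zenml.client import Client
    #
    #     Client().update_secret(
    #         name_id_or_prefix="aws_credentials",
    #         add_or_update_values={"aws_secret_access_key": "new-value"},
    #         remove_values=["aws_session_token"],
    #     )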

    def delete_secret(
        self, name_id_or_prefix: str, scope: Optional[SecretScope] = None
    ) -> None:
        """Deletes a secret.

        Args:
            name_id_or_prefix: The name or ID of the secret.
            scope: The scope of the secret to delete.
        """
        secret = self.get_secret(
            name_id_or_prefix=name_id_or_prefix,
            scope=scope,
            # Don't allow partial name matches, but allow partial ID matches
            allow_partial_name_match=False,
            allow_partial_id_match=True,
        )

        self.zen_store.delete_secret(secret_id=secret.id)

    def get_secret_by_name_and_scope(
        self,
        name: str,
        scope: Optional[SecretScope] = None,
        hydrate: bool = True,
    ) -> SecretResponse:
        """Fetches a registered secret with a given name and optional scope.

        This is a version of get_secret that restricts the search to a given
        name and an optional scope, without doing any prefix or UUID matching.

        If no scope is provided, the search will be done first in the user
        scope, then in the workspace scope.

        Args:
            name: The name of the secret to get.
            scope: The scope of the secret to get.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The registered secret.

        Raises:
            KeyError: If no secret exists for the given name in the given scope.
        """
        logger.debug(
            f"Fetching the secret with name '{name}' and scope '{scope}'."
        )

        # Scopes to search in order of priority
        search_scopes = (
            [SecretScope.USER, SecretScope.WORKSPACE]
            if scope is None
            else [scope]
        )

        for search_scope in search_scopes:
            secrets = self.list_secrets(
                logical_operator=LogicalOperators.AND,
                name=f"equals:{name}",
                scope=search_scope,
                hydrate=hydrate,
            )

            if len(secrets.items) >= 1:
                # Need to fetch the secret again to get the secret values
                return self.zen_store.get_secret(
                    secret_id=secrets.items[0].id, hydrate=hydrate
                )

        msg = f"No secret with name '{name}' was found"
        if scope is not None:
            msg += f" in scope '{scope.value}'"

        raise KeyError(msg)

    def list_secrets_in_scope(
        self,
        scope: SecretScope,
        hydrate: bool = False,
    ) -> Page[SecretResponse]:
        """Fetches the list of secret in a given scope.

        The returned secrets do not contain the secret values. To get the
        secret values, use `get_secret` individually for each secret.

        Args:
            scope: The secrets scope to search for.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The list of secrets in the given scope without the secret values.
        """
        logger.debug(f"Fetching the secrets in scope {scope.value}.")

        return self.list_secrets(scope=scope, hydrate=hydrate)

    def backup_secrets(
        self,
        ignore_errors: bool = True,
        delete_secrets: bool = False,
    ) -> None:
        """Backs up all secrets to the configured backup secrets store.

        Args:
            ignore_errors: Whether to ignore individual errors during the backup
                process and attempt to back up all secrets.
            delete_secrets: Whether to delete the secrets that have been
                successfully backed up from the primary secrets store. Setting
                this flag effectively moves all secrets from the primary secrets
                store to the backup secrets store.
        """
        self.zen_store.backup_secrets(
            ignore_errors=ignore_errors, delete_secrets=delete_secrets
        )

    def restore_secrets(
        self,
        ignore_errors: bool = False,
        delete_secrets: bool = False,
    ) -> None:
        """Restore all secrets from the configured backup secrets store.

        Args:
            ignore_errors: Whether to ignore individual errors during the
                restore process and attempt to restore all secrets.
            delete_secrets: Whether to delete the secrets that have been
                successfully restored from the backup secrets store. Setting
                this flag effectively moves all secrets from the backup secrets
                store to the primary secrets store.
        """
        self.zen_store.restore_secrets(
            ignore_errors=ignore_errors, delete_secrets=delete_secrets
        )
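
    # Usage sketch (illustrative): copying all secrets to the configured backup
    # secrets store and later restoring them from it.
    #
    #     from zenml.client import Client
    #
    #     client = Client()
    #     client.backup_secrets(ignore_errors=True, delete_secrets=False)
    #     # ...later:
    #     client.restore_secrets(ignore_errors=False, delete_secrets=False)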

    # --------------------------- Code repositories ---------------------------

    @staticmethod
    def _validate_code_repository_config(
        source: Source, config: Dict[str, Any]
    ) -> None:
        """Validate a code repository config.

        Args:
            source: The code repository source.
            config: The code repository config.

        Raises:
            RuntimeError: If the provided config is invalid.
        """
        from zenml.code_repositories import BaseCodeRepository

        code_repo_class: Type[BaseCodeRepository] = (
            source_utils.load_and_validate_class(
                source=source, expected_class=BaseCodeRepository
            )
        )
        try:
            code_repo_class.validate_config(config)
        except Exception as e:
            raise RuntimeError(
                "Failed to validate code repository config."
            ) from e

    def create_code_repository(
        self,
        name: str,
        config: Dict[str, Any],
        source: Source,
        description: Optional[str] = None,
        logo_url: Optional[str] = None,
    ) -> CodeRepositoryResponse:
        """Create a new code repository.

        Args:
            name: Name of the code repository.
            config: The configuration for the code repository.
            source: The code repository implementation source.
            description: The code repository description.
            logo_url: URL of a logo (png, jpg or svg) for the code repository.

        Returns:
            The created code repository.
        """
        self._validate_code_repository_config(source=source, config=config)
        repo_request = CodeRepositoryRequest(
            user=self.active_user.id,
            workspace=self.active_workspace.id,
            name=name,
            config=config,
            source=source,
            description=description,
            logo_url=logo_url,
        )
        return self.zen_store.create_code_repository(
            code_repository=repo_request
        )
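
    # Usage sketch (illustrative): registering a code repository. The config
    # keys below are hypothetical, and `github_source` is assumed to be a
    # `Source` pointing at a `BaseCodeRepository` implementation (e.g. ZenML's
    # GitHub integration).
    #
    #     from zenml.client import Client
    #
    #     Client().create_code_repository(
    #         name="my-repo",
    #         config={"owner": "my-org", "repository": "my-repo", "token": "..."},
    #         source=github_source,
    #     )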

    def get_code_repository(
        self,
        name_id_or_prefix: Union[str, UUID],
        allow_name_prefix_match: bool = True,
        hydrate: bool = True,
    ) -> CodeRepositoryResponse:
        """Get a code repository by name, id or prefix.

        Args:
            name_id_or_prefix: The name, ID or ID prefix of the code repository.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The code repository.
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_code_repository,
            list_method=self.list_code_repositories,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
        )

    def list_code_repositories(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        workspace_id: Optional[Union[str, UUID]] = None,
        user_id: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[CodeRepositoryResponse]:
        """List all code repositories.

        Args:
            sort_by: The column to sort by.
            page: The page of items.
            size: The maximum size of all pages.
            logical_operator: Which logical operator to use [and, or].
            id: Use the id of the code repository to filter by.
            created: Use to filter by time of creation.
            updated: Use the last updated date for filtering.
            name: The name of the code repository to filter by.
            workspace_id: The id of the workspace to filter by.
            user_id: The id of the user to filter by.
            user: Filter by user name/ID.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of code repositories matching the filter description.
        """
        filter_model = CodeRepositoryFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            name=name,
            workspace_id=workspace_id,
            user_id=user_id,
            user=user,
        )
        filter_model.set_scope_workspace(self.active_workspace.id)
        return self.zen_store.list_code_repositories(
            filter_model=filter_model,
            hydrate=hydrate,
        )

    def update_code_repository(
        self,
        name_id_or_prefix: Union[UUID, str],
        name: Optional[str] = None,
        description: Optional[str] = None,
        logo_url: Optional[str] = None,
        config: Optional[Dict[str, Any]] = None,
    ) -> CodeRepositoryResponse:
        """Update a code repository.

        Args:
            name_id_or_prefix: Name, ID or prefix of the code repository to
                update.
            name: New name of the code repository.
            description: New description of the code repository.
            logo_url: New logo URL of the code repository.
            config: New configuration options for the code repository. Will
                be used to update the existing configuration values. To remove
                values from the existing configuration, set the value for that
                key to `None`.

        Returns:
            The updated code repository.
        """
        repo = self.get_code_repository(
            name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
        )
        update = CodeRepositoryUpdate(
            name=name, description=description, logo_url=logo_url
        )
        if config is not None:
            combined_config = repo.config
            combined_config.update(config)
            combined_config = {
                k: v for k, v in combined_config.items() if v is not None
            }

            self._validate_code_repository_config(
                source=repo.source, config=combined_config
            )
            update.config = combined_config

        return self.zen_store.update_code_repository(
            code_repository_id=repo.id, update=update
        )
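
    # Usage sketch (illustrative): updating a repository's description and
    # removing one of its configuration values by setting it to None.
    #
    #     from zenml.client import Client
    #
    #     Client().update_code_repository(
    #         name_id_or_prefix="my-repo",
    #         description="Main training repository",
    #         config={"token": None},
    #     )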

    def delete_code_repository(
        self,
        name_id_or_prefix: Union[str, UUID],
    ) -> None:
        """Delete a code repository.

        Args:
            name_id_or_prefix: The name, ID or prefix of the code repository.
        """
        repo = self.get_code_repository(
            name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
        )
        self.zen_store.delete_code_repository(code_repository_id=repo.id)

    # --------------------------- Service Connectors ---------------------------

    def create_service_connector(
        self,
        name: str,
        connector_type: str,
        resource_type: Optional[str] = None,
        auth_method: Optional[str] = None,
        configuration: Optional[Dict[str, str]] = None,
        resource_id: Optional[str] = None,
        description: str = "",
        expiration_seconds: Optional[int] = None,
        expires_at: Optional[datetime] = None,
        expires_skew_tolerance: Optional[int] = None,
        labels: Optional[Dict[str, str]] = None,
        auto_configure: bool = False,
        verify: bool = True,
        list_resources: bool = True,
        register: bool = True,
    ) -> Tuple[
        Optional[
            Union[
                ServiceConnectorResponse,
                ServiceConnectorRequest,
            ]
        ],
        Optional[ServiceConnectorResourcesModel],
    ]:
        """Create, validate and/or register a service connector.

        Args:
            name: The name of the service connector.
            connector_type: The service connector type.
            resource_type: The resource type for the service connector.
            auth_method: The authentication method of the service connector.
                May be omitted if auto-configuration is used.
            configuration: The configuration of the service connector.
            resource_id: The resource id of the service connector.
            description: The description of the service connector.
            expiration_seconds: The expiration time of the service connector.
            expires_at: The expiration time of the service connector.
            expires_skew_tolerance: The allowed expiration skew for the service
                connector credentials.
            labels: The labels of the service connector.
            auto_configure: Whether to automatically configure the service
                connector from the local environment.
            verify: Whether to verify that the service connector configuration
                and credentials can be used to gain access to the resource.
            list_resources: Whether to also list the resources that the service
                connector can give access to (if verify is True).
            register: Whether to register the service connector or not.

        Returns:
            The model of the registered service connector and the resources
            that the service connector can give access to (if verify is True).

        Raises:
            ValueError: If the arguments are invalid.
            KeyError: If the service connector type is not found.
            NotImplementedError: If auto-configuration is not supported or
                not implemented for the service connector type.
            AuthorizationException: If the connector verification failed due
                to authorization issues.
        """
        from zenml.service_connectors.service_connector_registry import (
            service_connector_registry,
        )

        connector_instance: Optional[ServiceConnector] = None
        connector_resources: Optional[ServiceConnectorResourcesModel] = None

        # Get the service connector type class
        try:
            connector = self.zen_store.get_service_connector_type(
                connector_type=connector_type,
            )
        except KeyError:
            raise KeyError(
                f"Service connector type {connector_type} not found."
                "Please check that you have installed all required "
                "Python packages and ZenML integrations and try again."
            )

        if not resource_type and len(connector.resource_types) == 1:
            resource_type = connector.resource_types[0].resource_type

        # If auto_configure is set, we will try to automatically configure the
        # service connector from the local environment
        if auto_configure:
            if not connector.supports_auto_configuration:
                raise NotImplementedError(
                    f"The {connector.name} service connector type "
                    "does not support auto-configuration."
                )
            if not connector.local:
                raise NotImplementedError(
                    f"The {connector.name} service connector type "
                    "implementation is not available locally. Please "
                    "check that you have installed all required Python "
                    "packages and ZenML integrations and try again, or "
                    "skip auto-configuration."
                )

            assert connector.connector_class is not None

            connector_instance = connector.connector_class.auto_configure(
                resource_type=resource_type,
                auth_method=auth_method,
                resource_id=resource_id,
            )
            assert connector_instance is not None
            connector_request = connector_instance.to_model(
                name=name,
                user=self.active_user.id,
                workspace=self.active_workspace.id,
                description=description or "",
                labels=labels,
            )

            if verify:
                # Prefer to verify the connector config server-side if the
                # implementation is available there, because it ensures
                # that the connector can be shared with other users or used
                # from other machines and because some auth methods rely on the
                # server-side authentication environment
                if connector.remote:
                    connector_resources = (
                        self.zen_store.verify_service_connector_config(
                            connector_request,
                            list_resources=list_resources,
                        )
                    )
                else:
                    connector_resources = connector_instance.verify(
                        list_resources=list_resources,
                    )

                if connector_resources.error:
                    # Raise an exception if the connector verification failed
                    raise AuthorizationException(connector_resources.error)

        else:
            if not auth_method:
                if len(connector.auth_methods) == 1:
                    auth_method = connector.auth_methods[0].auth_method
                else:
                    raise ValueError(
                        f"Multiple authentication methods are available for "
                        f"the {connector.name} service connector type. Please "
                        f"specify one of the following: "
                        f"{list(connector.auth_method_dict.keys())}."
                    )

            connector_request = ServiceConnectorRequest(
                name=name,
                connector_type=connector_type,
                description=description,
                auth_method=auth_method,
                expiration_seconds=expiration_seconds,
                expires_at=expires_at,
                expires_skew_tolerance=expires_skew_tolerance,
                user=self.active_user.id,
                workspace=self.active_workspace.id,
                labels=labels or {},
            )
            # Validate and configure the resources
            connector_request.validate_and_configure_resources(
                connector_type=connector,
                resource_types=resource_type,
                resource_id=resource_id,
                configuration=configuration,
            )
            if verify:
                # Prefer to verify the connector config server-side if the
                # implementation is available there, because it ensures
                # that the connector can be shared with other users or used
                # from other machines and because some auth methods rely on the
                # server-side authentication environment
                if connector.remote:
                    connector_resources = (
                        self.zen_store.verify_service_connector_config(
                            connector_request,
                            list_resources=list_resources,
                        )
                    )
                else:
                    connector_instance = (
                        service_connector_registry.instantiate_connector(
                            model=connector_request
                        )
                    )
                    connector_resources = connector_instance.verify(
                        list_resources=list_resources,
                    )

                if connector_resources.error:
                    # Raise an exception if the connector verification failed
                    raise AuthorizationException(connector_resources.error)

                # For resource types that don't support multi-instances, it's
                # better to save the default resource ID in the connector, if
                # available. Otherwise, we'll need to instantiate the connector
                # again to get the default resource ID.
                connector_request.resource_id = (
                    connector_request.resource_id
                    or connector_resources.get_default_resource_id()
                )

        if not register:
            return connector_request, connector_resources

        # Register the new model
        connector_response = self.zen_store.create_service_connector(
            service_connector=connector_request
        )

        if connector_resources:
            connector_resources.id = connector_response.id
            connector_resources.name = connector_response.name
            connector_resources.connector_type = (
                connector_response.connector_type
            )

        return connector_response, connector_resources
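
    # Usage sketch (illustrative): registering and verifying an AWS service
    # connector. The connector type, auth method and configuration keys shown
    # here depend on the installed connector implementations and are
    # assumptions, not guaranteed names.
    #
    #     from zenml.client import Client
    #
    #     connector, resources = Client().create_service_connector(
    #         name="aws-dev",
    #         connector_type="aws",
    #         auth_method="secret-key",
    #         configuration={
    #             "aws_access_key_id": "...",
    #             "aws_secret_access_key": "...",
    #         },
    #         verify=True,
    #     )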

    def get_service_connector(
        self,
        name_id_or_prefix: Union[str, UUID],
        allow_name_prefix_match: bool = True,
        load_secrets: bool = False,
        hydrate: bool = True,
    ) -> ServiceConnectorResponse:
        """Fetches a registered service connector.

        Args:
            name_id_or_prefix: The name, ID or prefix of the service
                connector to fetch.
            allow_name_prefix_match: If True, allow matching by name prefix.
            load_secrets: If True, load the secrets for the service connector.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The registered service connector.
        """

        def scoped_list_method(
            hydrate: bool = False,
            **kwargs: Any,
        ) -> Page[ServiceConnectorResponse]:
            """Call `zen_store.list_service_connectors` with workspace scoping.

            Args:
                hydrate: Flag deciding whether to hydrate the output model(s)
                    by including metadata fields in the response.
                **kwargs: Keyword arguments to pass to
                    `ServiceConnectorFilterModel`.

            Returns:
                The list of service connectors.
            """
            filter_model = ServiceConnectorFilter(**kwargs)
            filter_model.set_scope_workspace(self.active_workspace.id)
            return self.zen_store.list_service_connectors(
                filter_model=filter_model,
                hydrate=hydrate,
            )

        connector = self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_service_connector,
            list_method=scoped_list_method,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
        )

        if load_secrets and connector.secret_id:
            client = Client()
            try:
                secret = client.get_secret(
                    name_id_or_prefix=connector.secret_id,
                    allow_partial_id_match=False,
                    allow_partial_name_match=False,
                )
            except KeyError as err:
                logger.error(
                    "Unable to retrieve secret values associated with "
                    f"service connector '{connector.name}': {err}"
                )
            else:
                # Add secret values to connector configuration
                connector.secrets.update(secret.values)

        return connector
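
    # Usage sketch (illustrative): fetching a registered service connector by
    # name, including its secret values. The connector name "aws-dev" is a
    # placeholder.
    #
    #     connector = Client().get_service_connector(
    #         "aws-dev", load_secrets=True
    #     )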

    def list_service_connectors(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[datetime] = None,
        updated: Optional[datetime] = None,
        name: Optional[str] = None,
        connector_type: Optional[str] = None,
        auth_method: Optional[str] = None,
        resource_type: Optional[str] = None,
        resource_id: Optional[str] = None,
        workspace_id: Optional[Union[str, UUID]] = None,
        user_id: Optional[Union[str, UUID]] = None,
        user: Optional[Union[UUID, str]] = None,
        labels: Optional[Dict[str, Optional[str]]] = None,
        secret_id: Optional[Union[str, UUID]] = None,
        hydrate: bool = False,
    ) -> Page[ServiceConnectorResponse]:
        """Lists all registered service connectors.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: The id of the service connector to filter by.
            created: Filter service connectors by time of creation
            updated: Use the last updated date for filtering
            connector_type: Use the service connector type for filtering
            auth_method: Use the service connector auth method for filtering
            resource_type: Filter service connectors by the resource type that
                they can give access to.
            resource_id: Filter service connectors by the resource id that
                they can give access to.
            workspace_id: The id of the workspace to filter by.
            user_id: The id of the user to filter by.
            user: Filter by user name/ID.
            name: The name of the service connector to filter by.
            labels: The labels of the service connector to filter by.
            secret_id: Filter by the id of the secret that is referenced by the
                service connector.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of service connectors.
        """
        connector_filter_model = ServiceConnectorFilter(
            page=page,
            size=size,
            sort_by=sort_by,
            logical_operator=logical_operator,
            workspace_id=workspace_id or self.active_workspace.id,
            user_id=user_id,
            user=user,
            name=name,
            connector_type=connector_type,
            auth_method=auth_method,
            resource_type=resource_type,
            resource_id=resource_id,
            id=id,
            created=created,
            updated=updated,
            labels=labels,
            secret_id=secret_id,
        )
        connector_filter_model.set_scope_workspace(self.active_workspace.id)
        return self.zen_store.list_service_connectors(
            filter_model=connector_filter_model,
            hydrate=hydrate,
        )
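
    # Usage sketch (illustrative): listing service connectors of a given type,
    # filtered by a label. The filter values are placeholders.
    #
    #     page = Client().list_service_connectors(
    #         connector_type="aws",
    #         labels={"team": "ml"},
    #         size=20,
    #     )
    #     for connector in page.items:
    #         print(connector.name)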

    def update_service_connector(
        self,
        name_id_or_prefix: Union[UUID, str],
        name: Optional[str] = None,
        auth_method: Optional[str] = None,
        resource_type: Optional[str] = None,
        configuration: Optional[Dict[str, str]] = None,
        resource_id: Optional[str] = None,
        description: Optional[str] = None,
        expires_at: Optional[datetime] = None,
        expires_skew_tolerance: Optional[int] = None,
        expiration_seconds: Optional[int] = None,
        labels: Optional[Dict[str, Optional[str]]] = None,
        verify: bool = True,
        list_resources: bool = True,
        update: bool = True,
    ) -> Tuple[
        Optional[
            Union[
                ServiceConnectorResponse,
                ServiceConnectorUpdate,
            ]
        ],
        Optional[ServiceConnectorResourcesModel],
    ]:
        """Validate and/or register an updated service connector.

        If the `resource_type`, `resource_id` and `expiration_seconds`
        parameters are set to their "empty" values (empty string for resource
        type and resource ID, 0 for expiration seconds), the existing values
        will be removed from the service connector. Setting them to None or
        omitting them will not affect the existing values.

        If supplied, the `configuration` parameter is a full replacement of the
        existing configuration rather than a partial update.

        Labels can be updated or removed by setting the label value to None.

        Args:
            name_id_or_prefix: The name, id or prefix of the service connector
                to update.
            name: The new name of the service connector.
            auth_method: The new authentication method of the service connector.
            resource_type: The new resource type for the service connector.
                If set to the empty string, the existing resource type will be
                removed.
            configuration: The new configuration of the service connector. If
                set, this needs to be a full replacement of the existing
                configuration rather than a partial update.
            resource_id: The new resource id of the service connector.
                If set to the empty string, the existing resource ID will be
                removed.
            description: The description of the service connector.
            expires_at: The new UTC expiration time of the service connector.
            expires_skew_tolerance: The allowed expiration skew for the service
                connector credentials.
            expiration_seconds: The expiration time of the service connector.
                If set to 0, the existing expiration time will be removed.
            labels: The labels to update or remove. If a label value
                is set to None, the label will be removed.
            verify: Whether to verify that the service connector configuration
                and credentials can be used to gain access to the resource.
            list_resources: Whether to also list the resources that the service
                connector can give access to (if verify is True).
            update: Whether to update the service connector or not.

        Returns:
            The model of the registered service connector and the resources
            that the service connector can give access to (if verify is True).

        Raises:
            AuthorizationException: If the service connector verification
                fails due to invalid credentials or insufficient permissions.
        """
        from zenml.service_connectors.service_connector_registry import (
            service_connector_registry,
        )

        connector_model = self.get_service_connector(
            name_id_or_prefix,
            allow_name_prefix_match=False,
            load_secrets=True,
        )

        connector_instance: Optional[ServiceConnector] = None
        connector_resources: Optional[ServiceConnectorResourcesModel] = None

        if isinstance(connector_model.connector_type, str):
            connector = self.get_service_connector_type(
                connector_model.connector_type
            )
        else:
            connector = connector_model.connector_type

        resource_types: Optional[Union[str, List[str]]] = None
        if resource_type == "":
            resource_types = None
        elif resource_type is None:
            resource_types = connector_model.resource_types
        else:
            resource_types = resource_type

        if not resource_type and len(connector.resource_types) == 1:
            resource_types = connector.resource_types[0].resource_type

        if resource_id == "":
            resource_id = None
        elif resource_id is None:
            resource_id = connector_model.resource_id

        if expiration_seconds == 0:
            expiration_seconds = None
        elif expiration_seconds is None:
            expiration_seconds = connector_model.expiration_seconds

        connector_update = ServiceConnectorUpdate(
            name=name or connector_model.name,
            connector_type=connector.connector_type,
            description=description or connector_model.description,
            auth_method=auth_method or connector_model.auth_method,
            expires_at=expires_at,
            expires_skew_tolerance=expires_skew_tolerance,
            expiration_seconds=expiration_seconds,
        )

        # Validate and configure the resources
        if configuration is not None:
            # The supplied configuration is a drop-in replacement for the
            # existing configuration and secrets
            connector_update.validate_and_configure_resources(
                connector_type=connector,
                resource_types=resource_types,
                resource_id=resource_id,
                configuration=configuration,
            )
        else:
            connector_update.validate_and_configure_resources(
                connector_type=connector,
                resource_types=resource_types,
                resource_id=resource_id,
                configuration=connector_model.configuration,
                secrets=connector_model.secrets,
            )

        # Add the labels
        if labels is not None:
            # Apply the new label values, but don't keep any labels that
            # have been set to None in the update
            connector_update.labels = {
                **{
                    label: value
                    for label, value in connector_model.labels.items()
                    if label not in labels
                },
                **{
                    label: value
                    for label, value in labels.items()
                    if value is not None
                },
            }
        else:
            connector_update.labels = connector_model.labels

        if verify:
            # Prefer to verify the connector config server-side if the
            # implementation is available there, because it ensures
            # that the connector can be shared with other users or used
            # from other machines and because some auth methods rely on the
            # server-side authentication environment

            # Convert the update model to a request model for validation
            connector_request_dict = connector_update.model_dump()
            connector_request_dict.update(
                user=self.active_user.id,
                workspace=self.active_workspace.id,
            )
            connector_request = ServiceConnectorRequest.model_validate(
                connector_request_dict
            )

            if connector.remote:
                connector_resources = (
                    self.zen_store.verify_service_connector_config(
                        service_connector=connector_request,
                        list_resources=list_resources,
                    )
                )
            else:
                connector_instance = (
                    service_connector_registry.instantiate_connector(
                        model=connector_request,
                    )
                )
                connector_resources = connector_instance.verify(
                    list_resources=list_resources
                )

            if connector_resources.error:
                raise AuthorizationException(connector_resources.error)

            # For resource types that don't support multi-instances, it's
            # better to save the default resource ID in the connector, if
            # available. Otherwise, we'll need to instantiate the connector
            # again to get the default resource ID.
            connector_update.resource_id = (
                connector_update.resource_id
                or connector_resources.get_default_resource_id()
            )

        if not update:
            return connector_update, connector_resources

        # Update the model
        connector_response = self.zen_store.update_service_connector(
            service_connector_id=connector_model.id,
            update=connector_update,
        )

        if connector_resources:
            connector_resources.id = connector_response.id
            connector_resources.name = connector_response.name
            connector_resources.connector_type = (
                connector_response.connector_type
            )

        return connector_response, connector_resources
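
    # Usage sketch (illustrative): updating a connector's description and
    # labels. Setting a label value to None removes that label; the names
    # used here are placeholders.
    #
    #     connector, resources = Client().update_service_connector(
    #         "aws-dev",
    #         description="Shared dev credentials",
    #         labels={"team": "ml", "deprecated": None},
    #     )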

    def delete_service_connector(
        self,
        name_id_or_prefix: Union[str, UUID],
    ) -> None:
        """Deletes a registered service connector.

        Args:
            name_id_or_prefix: The name, ID or prefix of the service
                connector to delete.
        """
        service_connector = self.get_service_connector(
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
        )

        self.zen_store.delete_service_connector(
            service_connector_id=service_connector.id
        )
        logger.info(
            "Removed service connector (type: %s) with name '%s'.",
            service_connector.type,
            service_connector.name,
        )

    def verify_service_connector(
        self,
        name_id_or_prefix: Union[UUID, str],
        resource_type: Optional[str] = None,
        resource_id: Optional[str] = None,
        list_resources: bool = True,
    ) -> "ServiceConnectorResourcesModel":
        """Verifies if a service connector has access to one or more resources.

        Args:
            name_id_or_prefix: The name, id or prefix of the service connector
                to verify.
            resource_type: The type of the resource for which to verify access.
                If not provided, the resource type from the service connector
                configuration will be used.
            resource_id: The ID of the resource for which to verify access. If
                not provided, the resource ID from the service connector
                configuration will be used.
            list_resources: Whether to list the resources that the service
                connector has access to.

        Returns:
            The list of resources that the service connector has access to,
            scoped to the supplied resource type and ID, if provided.

        Raises:
            AuthorizationException: If the service connector does not have
                access to the resources.
        """
        from zenml.service_connectors.service_connector_registry import (
            service_connector_registry,
        )

        # Get the service connector model
        service_connector = self.get_service_connector(
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
        )

        connector_type = self.get_service_connector_type(
            service_connector.type
        )

        # Prefer to verify the connector config server-side if the
        # implementation is available there, because it ensures
        # that the connector can be shared with other users or used
        # from other machines and because some auth methods rely on the
        # server-side authentication environment
        if connector_type.remote:
            connector_resources = self.zen_store.verify_service_connector(
                service_connector_id=service_connector.id,
                resource_type=resource_type,
                resource_id=resource_id,
                list_resources=list_resources,
            )
        else:
            connector_instance = (
                service_connector_registry.instantiate_connector(
                    model=service_connector
                )
            )
            connector_resources = connector_instance.verify(
                resource_type=resource_type,
                resource_id=resource_id,
                list_resources=list_resources,
            )

        if connector_resources.error:
            raise AuthorizationException(connector_resources.error)

        return connector_resources
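
    # Usage sketch (illustrative): verifying that a connector can reach a
    # particular resource type. The resource type identifier "s3-bucket" is an
    # assumption that depends on the connector implementation.
    #
    #     resources = Client().verify_service_connector(
    #         "aws-dev", resource_type="s3-bucket"
    #     )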

    def login_service_connector(
        self,
        name_id_or_prefix: Union[UUID, str],
        resource_type: Optional[str] = None,
        resource_id: Optional[str] = None,
        **kwargs: Any,
    ) -> "ServiceConnector":
        """Use a service connector to authenticate a local client/SDK.

        Args:
            name_id_or_prefix: The name, id or prefix of the service connector
                to use.
            resource_type: The type of the resource to connect to. If not
                provided, the resource type from the service connector
                configuration will be used.
            resource_id: The ID of a particular resource instance to configure
                the local client to connect to. If the connector instance is
                already configured with a resource ID that is not the same or
                equivalent to the one requested, a `ValueError` exception is
                raised. May be omitted for connectors and resource types that do
                not support multiple resource instances.
            kwargs: Additional implementation specific keyword arguments to use
                to configure the client.

        Returns:
            The service connector client instance that was used to configure the
            local client.
        """
        connector_client = self.get_service_connector_client(
            name_id_or_prefix=name_id_or_prefix,
            resource_type=resource_type,
            resource_id=resource_id,
            verify=False,
        )

        connector_client.configure_local_client(
            **kwargs,
        )

        return connector_client
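
    # Usage sketch (illustrative): configuring the local client/SDK (e.g. a
    # locally installed CLI) with credentials from a registered connector.
    # The connector name and resource ID are placeholders.
    #
    #     Client().login_service_connector(
    #         "aws-dev", resource_id="my-bucket"
    #     )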

    def get_service_connector_client(
        self,
        name_id_or_prefix: Union[UUID, str],
        resource_type: Optional[str] = None,
        resource_id: Optional[str] = None,
        verify: bool = False,
    ) -> "ServiceConnector":
        """Get the client side of a service connector instance to use with a local client.

        Args:
            name_id_or_prefix: The name, id or prefix of the service connector
                to use.
            resource_type: The type of the resource to connect to. If not
                provided, the resource type from the service connector
                configuration will be used.
            resource_id: The ID of a particular resource instance to configure
                the local client to connect to. If the connector instance is
                already configured with a resource ID that is not the same or
                equivalent to the one requested, a `ValueError` exception is
                raised. May be omitted for connectors and resource types that do
                not support multiple resource instances.
            verify: Whether to verify that the service connector configuration
                and credentials can be used to gain access to the resource.

        Returns:
            The client side of the indicated service connector instance that can
            be used to connect to the resource locally.
        """
        from zenml.service_connectors.service_connector_registry import (
            service_connector_registry,
        )

        # Get the service connector model
        service_connector = self.get_service_connector(
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
        )

        connector_type = self.get_service_connector_type(
            service_connector.type
        )

        # Prefer to fetch the connector client from the server if the
        # implementation is available there, because some auth methods rely on
        # the server-side authentication environment
        if connector_type.remote:
            connector_client_model = (
                self.zen_store.get_service_connector_client(
                    service_connector_id=service_connector.id,
                    resource_type=resource_type,
                    resource_id=resource_id,
                )
            )

            connector_client = (
                service_connector_registry.instantiate_connector(
                    model=connector_client_model
                )
            )

            if verify:
                # Verify the connector client on the local machine, because the
                # server-side implementation may not be able to do so
                connector_client.verify()
        else:
            connector_instance = (
                service_connector_registry.instantiate_connector(
                    model=service_connector
                )
            )

            # Fetch the connector client
            connector_client = connector_instance.get_connector_client(
                resource_type=resource_type,
                resource_id=resource_id,
            )

        return connector_client
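
    # Usage sketch (illustrative): obtaining the client side of a connector,
    # scoped to a single resource instance; how the returned connector is then
    # used depends on the connector implementation. Names are placeholders.
    #
    #     connector_client = Client().get_service_connector_client(
    #         "aws-dev",
    #         resource_type="s3-bucket",
    #         resource_id="my-bucket",
    #     )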

    def list_service_connector_resources(
        self,
        connector_type: Optional[str] = None,
        resource_type: Optional[str] = None,
        resource_id: Optional[str] = None,
    ) -> List[ServiceConnectorResourcesModel]:
        """List resources that can be accessed by service connectors.

        Args:
            connector_type: The type of service connector to filter by.
            resource_type: The type of resource to filter by.
            resource_id: The ID of a particular resource instance to filter by.

        Returns:
            The list of resources that the available service connectors
            have access to, matching the supplied filters.
        """
        return self.zen_store.list_service_connector_resources(
            workspace_name_or_id=self.active_workspace.id,
            connector_type=connector_type,
            resource_type=resource_type,
            resource_id=resource_id,
        )

    def list_service_connector_types(
        self,
        connector_type: Optional[str] = None,
        resource_type: Optional[str] = None,
        auth_method: Optional[str] = None,
    ) -> List[ServiceConnectorTypeModel]:
        """Get a list of service connector types.

        Args:
            connector_type: Filter by connector type.
            resource_type: Filter by resource type.
            auth_method: Filter by authentication method.

        Returns:
            List of service connector types.
        """
        return self.zen_store.list_service_connector_types(
            connector_type=connector_type,
            resource_type=resource_type,
            auth_method=auth_method,
        )

    def get_service_connector_type(
        self,
        connector_type: str,
    ) -> ServiceConnectorTypeModel:
        """Returns the requested service connector type.

        Args:
            connector_type: the service connector type identifier.

        Returns:
            The requested service connector type.
        """
        return self.zen_store.get_service_connector_type(
            connector_type=connector_type,
        )

    #########
    # Model
    #########

    def create_model(
        self,
        name: str,
        license: Optional[str] = None,
        description: Optional[str] = None,
        audience: Optional[str] = None,
        use_cases: Optional[str] = None,
        limitations: Optional[str] = None,
        trade_offs: Optional[str] = None,
        ethics: Optional[str] = None,
        tags: Optional[List[str]] = None,
        save_models_to_registry: bool = True,
    ) -> ModelResponse:
        """Creates a new model in Model Control Plane.

        Args:
            name: The name of the model.
            license: The license under which the model is created.
            description: The description of the model.
            audience: The target audience of the model.
            use_cases: The use cases of the model.
            limitations: The known limitations of the model.
            trade_offs: The tradeoffs of the model.
            ethics: The ethical implications of the model.
            tags: Tags associated with the model.
            save_models_to_registry: Whether to save the model to the
                registry.

        Returns:
            The newly created model.
        """
        return self.zen_store.create_model(
            model=ModelRequest(
                name=name,
                license=license,
                description=description,
                audience=audience,
                use_cases=use_cases,
                limitations=limitations,
                trade_offs=trade_offs,
                ethics=ethics,
                tags=tags,
                user=self.active_user.id,
                workspace=self.active_workspace.id,
                save_models_to_registry=save_models_to_registry,
            )
        )
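
    # Usage sketch (illustrative): registering a new model in the Model
    # Control Plane. The model name and metadata values are placeholders.
    #
    #     model = Client().create_model(
    #         name="churn_prediction",
    #         license="Apache-2.0",
    #         description="Predicts customer churn from usage data.",
    #         tags=["classification", "tabular"],
    #     )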

    def delete_model(self, model_name_or_id: Union[str, UUID]) -> None:
        """Deletes a model from Model Control Plane.

        Args:
            model_name_or_id: name or id of the model to be deleted.
        """
        self.zen_store.delete_model(model_name_or_id=model_name_or_id)

    def update_model(
        self,
        model_name_or_id: Union[str, UUID],
        name: Optional[str] = None,
        license: Optional[str] = None,
        description: Optional[str] = None,
        audience: Optional[str] = None,
        use_cases: Optional[str] = None,
        limitations: Optional[str] = None,
        trade_offs: Optional[str] = None,
        ethics: Optional[str] = None,
        add_tags: Optional[List[str]] = None,
        remove_tags: Optional[List[str]] = None,
        save_models_to_registry: Optional[bool] = None,
    ) -> ModelResponse:
        """Updates an existing model in Model Control Plane.

        Args:
            model_name_or_id: name or id of the model to be updated.
            name: The name of the model.
            license: The license under which the model is created.
            description: The description of the model.
            audience: The target audience of the model.
            use_cases: The use cases of the model.
            limitations: The known limitations of the model.
            trade_offs: The tradeoffs of the model.
            ethics: The ethical implications of the model.
            add_tags: Tags to add to the model.
            remove_tags: Tags to remove from the model.
            save_models_to_registry: Whether to save the model to the
                registry.

        Returns:
            The updated model.
        """
        if not is_valid_uuid(model_name_or_id):
            model_name_or_id = self.zen_store.get_model(model_name_or_id).id
        return self.zen_store.update_model(
            model_id=model_name_or_id,  # type:ignore[arg-type]
            model_update=ModelUpdate(
                name=name,
                license=license,
                description=description,
                audience=audience,
                use_cases=use_cases,
                limitations=limitations,
                trade_offs=trade_offs,
                ethics=ethics,
                add_tags=add_tags,
                remove_tags=remove_tags,
                save_models_to_registry=save_models_to_registry,
            ),
        )
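
    # Usage sketch (illustrative): updating a model's metadata and tags. The
    # model name and tag values are placeholders.
    #
    #     model = Client().update_model(
    #         "churn_prediction",
    #         description="Retrained on 2024 data.",
    #         add_tags=["production"],
    #         remove_tags=["experimental"],
    #     )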

    def get_model(
        self,
        model_name_or_id: Union[str, UUID],
        hydrate: bool = True,
    ) -> ModelResponse:
        """Get an existing model from Model Control Plane.

        Args:
            model_name_or_id: name or id of the model to be retrieved.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The model of interest.
        """
        if cll := client_lazy_loader(
            "get_model", model_name_or_id=model_name_or_id, hydrate=hydrate
        ):
            return cll  # type: ignore[return-value]
        return self.zen_store.get_model(
            model_name_or_id=model_name_or_id,
            hydrate=hydrate,
        )

    def list_models(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        user: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
        tag: Optional[str] = None,
    ) -> Page[ModelResponse]:
        """Get models by filter from Model Control Plane.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            name: The name of the model to filter by.
            user: Filter by user name/ID.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.
            tag: The tag of the model to filter by.

        Returns:
            A page object with all models.
        """
        filter = ModelFilter(
            name=name,
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            created=created,
            updated=updated,
            tag=tag,
            user=user,
        )

        return self.zen_store.list_models(
            model_filter_model=filter, hydrate=hydrate
        )
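
    # Usage sketch (illustrative): listing models filtered by a tag. The tag
    # value is a placeholder.
    #
    #     page = Client().list_models(tag="classification")
    #     for model in page.items:
    #         print(model.name)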

    #################
    # Model Versions
    #################

    def create_model_version(
        self,
        model_name_or_id: Union[str, UUID],
        name: Optional[str] = None,
        description: Optional[str] = None,
        tags: Optional[List[str]] = None,
    ) -> ModelVersionResponse:
        """Creates a new model version in Model Control Plane.

        Args:
            model_name_or_id: the name or id of the model to create the model
                version in.
            name: the name of the Model Version to be created.
            description: the description of the Model Version to be created.
            tags: Tags associated with the model.

        Returns:
            The newly created model version.
        """
        if not is_valid_uuid(model_name_or_id):
            model_name_or_id = self.get_model(model_name_or_id).id
        return self.zen_store.create_model_version(
            model_version=ModelVersionRequest(
                name=name,
                description=description,
                user=self.active_user.id,
                workspace=self.active_workspace.id,
                model=model_name_or_id,
                tags=tags,
            )
        )
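
    # Usage sketch (illustrative): creating a named version for an existing
    # model. The model and version names are placeholders.
    #
    #     version = Client().create_model_version(
    #         model_name_or_id="churn_prediction",
    #         name="2024-06-baseline",
    #         tags=["baseline"],
    #     )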

    def delete_model_version(
        self,
        model_version_id: UUID,
    ) -> None:
        """Deletes a model version from Model Control Plane.

        Args:
            model_version_id: Id of the model version to be deleted.
        """
        self.zen_store.delete_model_version(
            model_version_id=model_version_id,
        )

    def get_model_version(
        self,
        model_name_or_id: Optional[Union[str, UUID]] = None,
        model_version_name_or_number_or_id: Optional[
            Union[str, int, ModelStages, UUID]
        ] = None,
        hydrate: bool = True,
    ) -> ModelVersionResponse:
        """Get an existing model version from Model Control Plane.

        Args:
            model_name_or_id: name or id of the model containing the model
                version.
            model_version_name_or_number_or_id: name, id, stage or number of
                the model version to be retrieved. If skipped, the latest
                version is retrieved.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The model version of interest.

        Raises:
            RuntimeError: In case method inputs don't adhere to restrictions.
            KeyError: In case no model version with the identifiers exists.
            ValueError: In case retrieval is attempted using a non-UUID model
                version identifier and no model identifier is provided.
        """
        if (
            not is_valid_uuid(model_version_name_or_number_or_id)
            and model_name_or_id is None
        ):
            raise ValueError(
                "No model identifier provided and model version identifier "
                f"`{model_version_name_or_number_or_id}` is not a valid UUID."
            )
        if cll := client_lazy_loader(
            "get_model_version",
            model_name_or_id=model_name_or_id,
            model_version_name_or_number_or_id=model_version_name_or_number_or_id,
            hydrate=hydrate,
        ):
            return cll  # type: ignore[return-value]

        if model_version_name_or_number_or_id is None:
            model_version_name_or_number_or_id = ModelStages.LATEST

        if isinstance(model_version_name_or_number_or_id, UUID):
            return self.zen_store.get_model_version(
                model_version_id=model_version_name_or_number_or_id,
                hydrate=hydrate,
            )
        elif isinstance(model_version_name_or_number_or_id, int):
            model_versions = self.zen_store.list_model_versions(
                model_name_or_id=model_name_or_id,
                model_version_filter_model=ModelVersionFilter(
                    number=model_version_name_or_number_or_id,
                ),
                hydrate=hydrate,
            ).items
        elif isinstance(model_version_name_or_number_or_id, str):
            if model_version_name_or_number_or_id == ModelStages.LATEST:
                model_versions = self.zen_store.list_model_versions(
                    model_name_or_id=model_name_or_id,
                    model_version_filter_model=ModelVersionFilter(
                        sort_by=f"{SorterOps.DESCENDING}:number"
                    ),
                    hydrate=hydrate,
                ).items

                if len(model_versions) > 0:
                    model_versions = [model_versions[0]]
                else:
                    model_versions = []
            elif model_version_name_or_number_or_id in ModelStages.values():
                model_versions = self.zen_store.list_model_versions(
                    model_name_or_id=model_name_or_id,
                    model_version_filter_model=ModelVersionFilter(
                        stage=model_version_name_or_number_or_id
                    ),
                    hydrate=hydrate,
                ).items
            else:
                model_versions = self.zen_store.list_model_versions(
                    model_name_or_id=model_name_or_id,
                    model_version_filter_model=ModelVersionFilter(
                        name=model_version_name_or_number_or_id
                    ),
                    hydrate=hydrate,
                ).items
        else:
            raise RuntimeError(
                f"The model version identifier "
                f"`{model_version_name_or_number_or_id}` is not"
                f"of the correct type."
            )

        if len(model_versions) == 1:
            return model_versions[0]
        elif len(model_versions) == 0:
            raise KeyError(
                f"No model version found for model "
                f"`{model_name_or_id}` with version identifier "
                f"`{model_version_name_or_number_or_id}`."
            )
        else:
            raise RuntimeError(
                f"The model version identifier "
                f"`{model_version_name_or_number_or_id}` is not"
                f"unique for model `{model_name_or_id}`."
            )
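
    # Usage sketch (illustrative): retrieving model versions by number, by
    # stage or, when the version identifier is omitted, the latest version.
    # The model name is a placeholder and "production" is assumed to be a
    # valid model stage.
    #
    #     latest = Client().get_model_version("churn_prediction")
    #     by_number = Client().get_model_version("churn_prediction", 3)
    #     in_prod = Client().get_model_version("churn_prediction", "production")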

    def list_model_versions(
        self,
        model_name_or_id: Optional[Union[str, UUID]] = None,
        sort_by: str = "number",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        number: Optional[int] = None,
        stage: Optional[Union[str, ModelStages]] = None,
        user: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
        tag: Optional[str] = None,
    ) -> Page[ModelVersionResponse]:
        """Get model versions by filter from Model Control Plane.

        Args:
            model_name_or_id: name or id of the model containing the model
                version.
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            name: name or id of the model version.
            number: number of the model version.
            stage: stage of the model version.
            user: Filter by user name/ID.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.
            tag: The tag to filter by.

        Returns:
            A page object with all model versions.
        """
        model_version_filter_model = ModelVersionFilter(
            page=page,
            size=size,
            sort_by=sort_by,
            logical_operator=logical_operator,
            created=created,
            updated=updated,
            name=name,
            number=number,
            stage=stage,
            tag=tag,
            user=user,
        )

        return self.zen_store.list_model_versions(
            model_name_or_id=model_name_or_id,
            model_version_filter_model=model_version_filter_model,
            hydrate=hydrate,
        )

    def update_model_version(
        self,
        model_name_or_id: Union[str, UUID],
        version_name_or_id: Union[str, UUID],
        stage: Optional[Union[str, ModelStages]] = None,
        force: bool = False,
        name: Optional[str] = None,
        description: Optional[str] = None,
        add_tags: Optional[List[str]] = None,
        remove_tags: Optional[List[str]] = None,
    ) -> ModelVersionResponse:
        """Get all model versions by filter.

        Args:
            model_name_or_id: The name or ID of the model containing model version.
            version_name_or_id: The name or ID of model version to be updated.
            stage: Target model version stage to be set.
            force: Whether an existing model version in the target stage
                should be silently archived or an error should be raised.
            name: Target model version name to be set.
            description: Target model version description to be set.
            add_tags: Tags to add to the model version.
            remove_tags: Tags to remove from the model version.

        Returns:
            An updated model version.
        """
        if not is_valid_uuid(model_name_or_id):
            model_name_or_id = self.get_model(model_name_or_id).id
        if not is_valid_uuid(version_name_or_id):
            version_name_or_id = self.get_model_version(
                model_name_or_id, version_name_or_id
            ).id

        return self.zen_store.update_model_version(
            model_version_id=version_name_or_id,  # type:ignore[arg-type]
            model_version_update_model=ModelVersionUpdate(
                model=model_name_or_id,
                stage=stage,
                force=force,
                name=name,
                description=description,
                add_tags=add_tags,
                remove_tags=remove_tags,
            ),
        )
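
    # Usage sketch (illustrative): promoting a model version to a stage,
    # archiving any version already in that stage. Names are placeholders and
    # "production" is assumed to be a valid model stage.
    #
    #     version = Client().update_model_version(
    #         model_name_or_id="churn_prediction",
    #         version_name_or_id="2024-06-baseline",
    #         stage="production",
    #         force=True,
    #     )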

    #################################################
    # Model Versions Artifacts
    #################################################

    def list_model_version_artifact_links(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        model_version_id: Optional[Union[UUID, str]] = None,
        artifact_version_id: Optional[Union[UUID, str]] = None,
        artifact_name: Optional[str] = None,
        only_data_artifacts: Optional[bool] = None,
        only_model_artifacts: Optional[bool] = None,
        only_deployment_artifacts: Optional[bool] = None,
        has_custom_name: Optional[bool] = None,
        user: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[ModelVersionArtifactResponse]:
        """Get model version to artifact links by filter in Model Control Plane.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            model_version_id: Use the model version id for filtering
            artifact_version_id: Use the artifact version id for filtering
            artifact_name: Use the artifact name for filtering
            only_data_artifacts: Use to filter by data artifacts
            only_model_artifacts: Use to filter by model artifacts
            only_deployment_artifacts: Use to filter by deployment artifacts
            has_custom_name: Filter artifacts with/without custom names.
            user: Filter by user name/ID.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of all model version to artifact links.
        """
        return self.zen_store.list_model_version_artifact_links(
            ModelVersionArtifactFilter(
                sort_by=sort_by,
                logical_operator=logical_operator,
                page=page,
                size=size,
                created=created,
                updated=updated,
                model_version_id=model_version_id,
                artifact_version_id=artifact_version_id,
                artifact_name=artifact_name,
                only_data_artifacts=only_data_artifacts,
                only_model_artifacts=only_model_artifacts,
                only_deployment_artifacts=only_deployment_artifacts,
                has_custom_name=has_custom_name,
                user=user,
            ),
            hydrate=hydrate,
        )
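
    # Usage sketch (illustrative): listing only the model artifacts linked to
    # a given model version. The model version is assumed to have been fetched
    # beforehand, e.g. via get_model_version().
    #
    #     mv = Client().get_model_version("churn_prediction")
    #     links = Client().list_model_version_artifact_links(
    #         model_version_id=mv.id,
    #         only_model_artifacts=True,
    #     )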

    def delete_model_version_artifact_link(
        self, model_version_id: UUID, artifact_version_id: UUID
    ) -> None:
        """Delete model version to artifact link in Model Control Plane.

        Args:
            model_version_id: The id of the model version holding the link.
            artifact_version_id: The id of the artifact version to be deleted.

        Raises:
            RuntimeError: If more than one artifact link is found for given filters.
        """
        artifact_links = self.list_model_version_artifact_links(
            model_version_id=model_version_id,
            artifact_version_id=artifact_version_id,
        )
        if artifact_links.items:
            if artifact_links.total > 1:
                raise RuntimeError(
                    "More than one artifact link found for give model version "
                    f"`{model_version_id}` and artifact version "
                    f"`{artifact_version_id}`. This should not be happening and "
                    "might indicate a corrupted state of your ZenML database. "
                    "Please seek support via Community Slack."
                )
            self.zen_store.delete_model_version_artifact_link(
                model_version_id=model_version_id,
                model_version_artifact_link_name_or_id=artifact_links.items[
                    0
                ].id,
            )

    def delete_all_model_version_artifact_links(
        self, model_version_id: UUID, only_links: bool
    ) -> None:
        """Delete all model version to artifact links in Model Control Plane.

        Args:
            model_version_id: The id of the model version holding the link.
            only_links: If true, only delete the link to the artifact.
        """
        self.zen_store.delete_all_model_version_artifact_links(
            model_version_id, only_links
        )

    #################################################
    # Model Versions Pipeline Runs
    #
    # Only view capabilities are exposed via the client.
    #################################################

    def list_model_version_pipeline_run_links(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        model_version_id: Optional[Union[UUID, str]] = None,
        pipeline_run_id: Optional[Union[UUID, str]] = None,
        pipeline_run_name: Optional[str] = None,
        user: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[ModelVersionPipelineRunResponse]:
        """Get all model version to pipeline run links by filter.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            model_version_id: Use the model version id for filtering
            pipeline_run_id: Use the pipeline run id for filtering
            pipeline_run_name: Use the pipeline run name for filtering
            user: Filter by user name or ID.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response

        Returns:
            A page of all model version to pipeline run links.
        """
        return self.zen_store.list_model_version_pipeline_run_links(
            ModelVersionPipelineRunFilter(
                sort_by=sort_by,
                logical_operator=logical_operator,
                page=page,
                size=size,
                created=created,
                updated=updated,
                model_version_id=model_version_id,
                pipeline_run_id=pipeline_run_id,
                pipeline_run_name=pipeline_run_name,
                user=user,
            ),
            hydrate=hydrate,
        )

    # --------------------------- Authorized Devices ---------------------------

    def list_authorized_devices(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        expires: Optional[Union[datetime, str]] = None,
        client_id: Union[UUID, str, None] = None,
        status: Union[OAuthDeviceStatus, str, None] = None,
        trusted_device: Union[bool, str, None] = None,
        user: Optional[Union[UUID, str]] = None,
        failed_auth_attempts: Union[int, str, None] = None,
        last_login: Optional[Union[datetime, str, None]] = None,
        hydrate: bool = False,
    ) -> Page[OAuthDeviceResponse]:
        """List all authorized devices.

        Args:
            sort_by: The column to sort by.
            page: The page of items.
            size: The maximum size of all pages.
            logical_operator: Which logical operator to use [and, or].
            id: Use the id of the authorized device to filter by.
            created: Use to filter by time of creation.
            updated: Use the last updated date for filtering.
            expires: Use the expiration date for filtering.
            client_id: Use the client id for filtering.
            status: Use the status for filtering.
            user: Filter by user name/ID.
            trusted_device: Use the trusted device flag for filtering.
            failed_auth_attempts: Use the failed auth attempts for filtering.
            last_login: Use the last login date for filtering.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of authorized devices matching the filter.
        """
        filter_model = OAuthDeviceFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            expires=expires,
            client_id=client_id,
            user=user,
            status=status,
            trusted_device=trusted_device,
            failed_auth_attempts=failed_auth_attempts,
            last_login=last_login,
        )
        return self.zen_store.list_authorized_devices(
            filter_model=filter_model,
            hydrate=hydrate,
        )

    def get_authorized_device(
        self,
        id_or_prefix: Union[UUID, str],
        allow_id_prefix_match: bool = True,
        hydrate: bool = True,
    ) -> OAuthDeviceResponse:
        """Get an authorized device by id or prefix.

        Args:
            id_or_prefix: The ID or ID prefix of the authorized device.
            allow_id_prefix_match: If True, allow matching by ID prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The requested authorized device.

        Raises:
            KeyError: If no authorized device is found with the given ID or
                prefix.
        """
        if isinstance(id_or_prefix, str):
            try:
                id_or_prefix = UUID(id_or_prefix)
            except ValueError:
                if not allow_id_prefix_match:
                    raise KeyError(
                        f"No authorized device found with id or prefix "
                        f"'{id_or_prefix}'."
                    )
        if isinstance(id_or_prefix, UUID):
            return self.zen_store.get_authorized_device(
                id_or_prefix, hydrate=hydrate
            )
        return self._get_entity_by_prefix(
            get_method=self.zen_store.get_authorized_device,
            list_method=self.list_authorized_devices,
            partial_id_or_name=id_or_prefix,
            allow_name_prefix_match=False,
            hydrate=hydrate,
        )

    def update_authorized_device(
        self,
        id_or_prefix: Union[UUID, str],
        locked: Optional[bool] = None,
    ) -> OAuthDeviceResponse:
        """Update an authorized device.

        Args:
            id_or_prefix: The ID or ID prefix of the authorized device.
            locked: Whether to lock or unlock the authorized device.

        Returns:
            The updated authorized device.
        """
        device = self.get_authorized_device(
            id_or_prefix=id_or_prefix, allow_id_prefix_match=False
        )
        return self.zen_store.update_authorized_device(
            device_id=device.id,
            update=OAuthDeviceUpdate(
                locked=locked,
            ),
        )

    def delete_authorized_device(
        self,
        id_or_prefix: Union[str, UUID],
    ) -> None:
        """Delete an authorized device.

        Args:
            id_or_prefix: The ID or ID prefix of the authorized device.
        """
        device = self.get_authorized_device(
            id_or_prefix=id_or_prefix,
            allow_id_prefix_match=False,
        )
        self.zen_store.delete_authorized_device(device.id)

    # --------------------------- Trigger Executions ---------------------------

    def get_trigger_execution(
        self,
        trigger_execution_id: UUID,
        hydrate: bool = True,
    ) -> TriggerExecutionResponse:
        """Get a trigger execution by ID.

        Args:
            trigger_execution_id: The ID of the trigger execution to get.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The trigger execution.
        """
        return self.zen_store.get_trigger_execution(
            trigger_execution_id=trigger_execution_id, hydrate=hydrate
        )

    def list_trigger_executions(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        trigger_id: Optional[UUID] = None,
        user: Optional[Union[UUID, str]] = None,
        hydrate: bool = False,
    ) -> Page[TriggerExecutionResponse]:
        """List all trigger executions matching the given filter criteria.

        Args:
            sort_by: The column to sort by.
            page: The page of items.
            size: The maximum size of all pages.
            logical_operator: Which logical operator to use [and, or].
            trigger_id: ID of the trigger to filter by.
            user: Filter by user name/ID.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A list of all trigger executions matching the filter criteria.
        """
        filter_model = TriggerExecutionFilter(
            trigger_id=trigger_id,
            sort_by=sort_by,
            page=page,
            size=size,
            user=user,
            logical_operator=logical_operator,
        )
        filter_model.set_scope_workspace(self.active_workspace.id)
        return self.zen_store.list_trigger_executions(
            trigger_execution_filter_model=filter_model, hydrate=hydrate
        )

    def delete_trigger_execution(self, trigger_execution_id: UUID) -> None:
        """Delete a trigger execution.

        Args:
            trigger_execution_id: The ID of the trigger execution to delete.
        """
        self.zen_store.delete_trigger_execution(
            trigger_execution_id=trigger_execution_id
        )

    # ---- utility prefix matching get functions -----

    def _get_entity_by_id_or_name_or_prefix(
        self,
        get_method: Callable[..., AnyResponse],
        list_method: Callable[..., Page[AnyResponse]],
        name_id_or_prefix: Union[str, UUID],
        allow_name_prefix_match: bool = True,
        hydrate: bool = True,
    ) -> AnyResponse:
        """Fetches an entity using the id, name, or partial id/name.

        Args:
            get_method: The method to use to fetch the entity by id.
            list_method: The method to use to fetch all entities.
            name_id_or_prefix: The id, name or partial id of the entity to
                fetch.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The entity with the given name, id or partial id.

        Raises:
            ZenKeyError: If there is more than one entity with that name
                or id prefix.
        """
        from zenml.utils.uuid_utils import is_valid_uuid

        entity_label = get_method.__name__.replace("get_", "") + "s"

        # First interpret as full UUID
        if is_valid_uuid(name_id_or_prefix):
            return get_method(name_id_or_prefix, hydrate=hydrate)

        # If not a UUID, try to find by name
        assert not isinstance(name_id_or_prefix, UUID)
        entity = list_method(
            name=f"equals:{name_id_or_prefix}",
            hydrate=hydrate,
        )

        # If only a single entity is found, return it
        if entity.total == 1:
            return entity.items[0]

        # If still no match, try with prefix now
        if entity.total == 0:
            return self._get_entity_by_prefix(
                get_method=get_method,
                list_method=list_method,
                partial_id_or_name=name_id_or_prefix,
                allow_name_prefix_match=allow_name_prefix_match,
                hydrate=hydrate,
            )

        # If more than one entity with the same name is found, raise an error.
        formatted_entity_items = [
            f"- {item.name}: (id: {item.id})\n"
            if hasattr(item, "name")
            else f"- {item.id}\n"
            for item in entity.items
        ]
        raise ZenKeyError(
            f"{entity.total} {entity_label} have been found that have "
            f"a name that matches the provided "
            f"string '{name_id_or_prefix}':\n"
            f"{formatted_entity_items}.\n"
            f"Please use the id to uniquely identify "
            f"only one of the {entity_label}s."
        )

    def _get_entity_version_by_id_or_name_or_prefix(
        self,
        get_method: Callable[..., AnyResponse],
        list_method: Callable[..., Page[AnyResponse]],
        name_id_or_prefix: Union[str, UUID],
        version: Optional[str],
        hydrate: bool = True,
    ) -> "AnyResponse":
        from zenml.utils.uuid_utils import is_valid_uuid

        entity_label = get_method.__name__.replace("get_", "") + "s"

        if is_valid_uuid(name_id_or_prefix):
            if version:
                logger.warning(
                    "You specified both an ID as well as a version of the "
                    f"{entity_label}. Ignoring the version and fetching the "
                    f"{entity_label} by ID."
                )
            if not isinstance(name_id_or_prefix, UUID):
                name_id_or_prefix = UUID(name_id_or_prefix, version=4)

            return get_method(name_id_or_prefix, hydrate=hydrate)

        assert not isinstance(name_id_or_prefix, UUID)
        exact_name_matches = list_method(
            size=1,
            sort_by="desc:created",
            name=name_id_or_prefix,
            version=version,
            hydrate=hydrate,
        )

        if len(exact_name_matches) == 1:
            # If the name matches exactly, use the explicitly specified version
            # or fallback to the latest if not given
            return exact_name_matches.items[0]

        partial_id_matches = list_method(
            id=f"startswith:{name_id_or_prefix}",
            hydrate=hydrate,
        )
        if partial_id_matches.total == 1:
            if version:
                logger.warning(
                    "You specified both a partial ID as well as a version of "
                    f"the {entity_label}. Ignoring the version and fetching "
                    f"the {entity_label} by partial ID."
                )
            return partial_id_matches[0]
        elif partial_id_matches.total == 0:
            raise KeyError(
                f"No {entity_label} found for name, ID or prefix "
                f"{name_id_or_prefix}."
            )
        else:
            raise ZenKeyError(
                f"{partial_id_matches.total} {entity_label} have been found "
                "that have an id prefix that matches the provided string "
                f"'{name_id_or_prefix}':\n"
                f"{partial_id_matches.items}.\n"
                f"Please provide more characters to uniquely identify "
                f"only one of the {entity_label}s."
            )

    def _get_entity_by_prefix(
        self,
        get_method: Callable[..., AnyResponse],
        list_method: Callable[..., Page[AnyResponse]],
        partial_id_or_name: str,
        allow_name_prefix_match: bool,
        hydrate: bool = True,
    ) -> AnyResponse:
        """Fetches an entity using a partial ID or name.

        Args:
            get_method: The method to use to fetch the entity by id.
            list_method: The method to use to fetch all entities.
            partial_id_or_name: The partial ID or name of the entity to fetch.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The entity with the given partial ID or name.

        Raises:
            KeyError: If no entity with the given partial ID or name is found.
            ZenKeyError: If there is more than one entity with that partial ID
                or name.
        """
        list_method_args: Dict[str, Any] = {
            "logical_operator": LogicalOperators.OR,
            "id": f"startswith:{partial_id_or_name}",
            "hydrate": hydrate,
        }
        if allow_name_prefix_match:
            list_method_args["name"] = f"startswith:{partial_id_or_name}"

        entity = list_method(**list_method_args)

        # If only a single entity is found, return it.
        if entity.total == 1:
            return entity.items[0]

        irregular_plurals = {"code_repository": "code_repositories"}
        entity_label = irregular_plurals.get(
            get_method.__name__.replace("get_", ""),
            get_method.__name__.replace("get_", "") + "s",
        )

        prefix_description = (
            "a name/ID prefix" if allow_name_prefix_match else "an ID prefix"
        )
        # If no entity is found, raise an error.
        if entity.total == 0:
            raise KeyError(
                f"No {entity_label} have been found that have "
                f"{prefix_description} that matches the provided string "
                f"'{partial_id_or_name}'."
            )

        # If more than one entity is found, raise an error.
        ambiguous_entities: List[str] = []
        for model in entity.items:
            model_name = getattr(model, "name", None)
            if model_name:
                ambiguous_entities.append(f"{model_name}: {model.id}")
            else:
                ambiguous_entities.append(str(model.id))
        raise ZenKeyError(
            f"{entity.total} {entity_label} have been found that have "
            f"{prefix_description} that matches the provided "
            f"string '{partial_id_or_name}':\n"
            f"{ambiguous_entities}.\n"
            f"Please provide more characters to uniquely identify "
            f"only one of the {entity_label}s."
        )

    # ---------------------------- Service Accounts ----------------------------

    def create_service_account(
        self,
        name: str,
        description: str = "",
    ) -> ServiceAccountResponse:
        """Create a new service account.

        Args:
            name: The name of the service account.
            description: The description of the service account.

        Returns:
            The created service account.
        """
        service_account = ServiceAccountRequest(
            name=name, description=description, active=True
        )
        created_service_account = self.zen_store.create_service_account(
            service_account=service_account
        )

        return created_service_account

    def get_service_account(
        self,
        name_id_or_prefix: Union[str, UUID],
        allow_name_prefix_match: bool = True,
        hydrate: bool = True,
    ) -> ServiceAccountResponse:
        """Gets a service account.

        Args:
            name_id_or_prefix: The name or ID of the service account.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The ServiceAccount
        """
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_service_account,
            list_method=self.list_service_accounts,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
        )

    def list_service_accounts(
        self,
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        description: Optional[str] = None,
        active: Optional[bool] = None,
        hydrate: bool = False,
    ) -> Page[ServiceAccountResponse]:
        """List all service accounts.

        Args:
            sort_by: The column to sort by
            page: The page of items
            size: The maximum size of all pages
            logical_operator: Which logical operator to use [and, or]
            id: Use the id of the service account to filter by.
            created: Use to filter by time of creation
            updated: Use the last updated date for filtering
            name: Use the service account name for filtering
            description: Use the service account description for filtering
            active: Use the service account active status for filtering
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The list of service accounts matching the filter description.
        """
        return self.zen_store.list_service_accounts(
            ServiceAccountFilter(
                sort_by=sort_by,
                page=page,
                size=size,
                logical_operator=logical_operator,
                id=id,
                created=created,
                updated=updated,
                name=name,
                description=description,
                active=active,
            ),
            hydrate=hydrate,
        )

    def update_service_account(
        self,
        name_id_or_prefix: Union[str, UUID],
        updated_name: Optional[str] = None,
        description: Optional[str] = None,
        active: Optional[bool] = None,
    ) -> ServiceAccountResponse:
        """Update a service account.

        Args:
            name_id_or_prefix: The name or ID of the service account to update.
            updated_name: The new name of the service account.
            description: The new description of the service account.
            active: The new active status of the service account.

        Returns:
            The updated service account.
        """
        service_account = self.get_service_account(
            name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
        )
        service_account_update = ServiceAccountUpdate(
            name=updated_name,
            description=description,
            active=active,
        )

        return self.zen_store.update_service_account(
            service_account_name_or_id=service_account.id,
            service_account_update=service_account_update,
        )

    def delete_service_account(
        self,
        name_id_or_prefix: Union[str, UUID],
    ) -> None:
        """Delete a service account.

        Args:
            name_id_or_prefix: The name or ID of the service account to delete.
        """
        service_account = self.get_service_account(
            name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
        )
        self.zen_store.delete_service_account(
            service_account_name_or_id=service_account.id
        )

    # -------------------------------- API Keys --------------------------------

    def create_api_key(
        self,
        service_account_name_id_or_prefix: Union[str, UUID],
        name: str,
        description: str = "",
        set_key: bool = False,
    ) -> APIKeyResponse:
        """Create a new API key and optionally set it as the active API key.

        Args:
            service_account_name_id_or_prefix: The name, ID or prefix of the
                service account to create the API key for.
            name: Name of the API key.
            description: The description of the API key.
            set_key: Whether to set the created API key as the active API key.

        Returns:
            The created API key.
        """
        service_account = self.get_service_account(
            name_id_or_prefix=service_account_name_id_or_prefix,
            allow_name_prefix_match=False,
        )
        request = APIKeyRequest(
            name=name,
            description=description,
        )
        api_key = self.zen_store.create_api_key(
            service_account_id=service_account.id, api_key=request
        )
        assert api_key.key is not None

        if set_key:
            self.set_api_key(key=api_key.key)

        return api_key

    def set_api_key(self, key: str) -> None:
        """Configure the client with an API key.

        Args:
            key: The API key to use.

        Raises:
            NotImplementedError: If the client is not connected to a ZenML
                server.
        """
        from zenml.login.credentials_store import get_credentials_store
        from zenml.zen_stores.rest_zen_store import RestZenStore

        zen_store = self.zen_store
        if zen_store.TYPE != StoreType.REST:
            raise NotImplementedError(
                "API key configuration is only supported if connected to a "
                "ZenML server."
            )

        credentials_store = get_credentials_store()
        assert isinstance(zen_store, RestZenStore)

        credentials_store.set_api_key(server_url=zen_store.url, api_key=key)

        # Force a re-authentication to start using the new API key
        # right away.
        zen_store.authenticate(force=True)

    def list_api_keys(
        self,
        service_account_name_id_or_prefix: Union[str, UUID],
        sort_by: str = "created",
        page: int = PAGINATION_STARTING_PAGE,
        size: int = PAGE_SIZE_DEFAULT,
        logical_operator: LogicalOperators = LogicalOperators.AND,
        id: Optional[Union[UUID, str]] = None,
        created: Optional[Union[datetime, str]] = None,
        updated: Optional[Union[datetime, str]] = None,
        name: Optional[str] = None,
        description: Optional[str] = None,
        active: Optional[bool] = None,
        last_login: Optional[Union[datetime, str]] = None,
        last_rotated: Optional[Union[datetime, str]] = None,
        hydrate: bool = False,
    ) -> Page[APIKeyResponse]:
        """List all API keys.

        Args:
            service_account_name_id_or_prefix: The name, ID or prefix of the
                service account to list the API keys for.
            sort_by: The column to sort by.
            page: The page of items.
            size: The maximum size of all pages.
            logical_operator: Which logical operator to use [and, or].
            id: Use the id of the API key to filter by.
            created: Use to filter by time of creation.
            updated: Use the last updated date for filtering.
            name: The name of the API key to filter by.
            description: The description of the API key to filter by.
            active: Whether the API key is active or not.
            last_login: The last time the API key was used.
            last_rotated: The last time the API key was rotated.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of API keys matching the filter description.
        """
        service_account = self.get_service_account(
            name_id_or_prefix=service_account_name_id_or_prefix,
            allow_name_prefix_match=False,
        )
        filter_model = APIKeyFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            name=name,
            description=description,
            active=active,
            last_login=last_login,
            last_rotated=last_rotated,
        )
        return self.zen_store.list_api_keys(
            service_account_id=service_account.id,
            filter_model=filter_model,
            hydrate=hydrate,
        )

    def get_api_key(
        self,
        service_account_name_id_or_prefix: Union[str, UUID],
        name_id_or_prefix: Union[str, UUID],
        allow_name_prefix_match: bool = True,
        hydrate: bool = True,
    ) -> APIKeyResponse:
        """Get an API key by name, id or prefix.

        Args:
            service_account_name_id_or_prefix: The name, ID or prefix of the
                service account to get the API key for.
            name_id_or_prefix: The name, ID or ID prefix of the API key.
            allow_name_prefix_match: If True, allow matching by name prefix.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The API key.
        """
        service_account = self.get_service_account(
            name_id_or_prefix=service_account_name_id_or_prefix,
            allow_name_prefix_match=False,
        )

        def get_api_key_method(
            api_key_name_or_id: str, hydrate: bool = True
        ) -> APIKeyResponse:
            return self.zen_store.get_api_key(
                service_account_id=service_account.id,
                api_key_name_or_id=api_key_name_or_id,
                hydrate=hydrate,
            )

        def list_api_keys_method(
            hydrate: bool = True,
            **filter_args: Any,
        ) -> Page[APIKeyResponse]:
            return self.list_api_keys(
                service_account_name_id_or_prefix=service_account.id,
                hydrate=hydrate,
                **filter_args,
            )

        return self._get_entity_by_id_or_name_or_prefix(
            get_method=get_api_key_method,
            list_method=list_api_keys_method,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
        )

    def update_api_key(
        self,
        service_account_name_id_or_prefix: Union[str, UUID],
        name_id_or_prefix: Union[UUID, str],
        name: Optional[str] = None,
        description: Optional[str] = None,
        active: Optional[bool] = None,
    ) -> APIKeyResponse:
        """Update an API key.

        Args:
            service_account_name_id_or_prefix: The name, ID or prefix of the
                service account to update the API key for.
            name_id_or_prefix: Name, ID or prefix of the API key to update.
            name: New name of the API key.
            description: New description of the API key.
            active: Whether the API key is active or not.

        Returns:
            The updated API key.
        """
        api_key = self.get_api_key(
            service_account_name_id_or_prefix=service_account_name_id_or_prefix,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
        )
        update = APIKeyUpdate(
            name=name, description=description, active=active
        )
        return self.zen_store.update_api_key(
            service_account_id=api_key.service_account.id,
            api_key_name_or_id=api_key.id,
            api_key_update=update,
        )

    def rotate_api_key(
        self,
        service_account_name_id_or_prefix: Union[str, UUID],
        name_id_or_prefix: Union[UUID, str],
        retain_period_minutes: int = 0,
        set_key: bool = False,
    ) -> APIKeyResponse:
        """Rotate an API key.

        Args:
            service_account_name_id_or_prefix: The name, ID or prefix of the
                service account to rotate the API key for.
            name_id_or_prefix: Name, ID or prefix of the API key to update.
            retain_period_minutes: The number of minutes to retain the old API
                key for. If set to 0, the old API key will be invalidated.
            set_key: Whether to set the rotated API key as the active API key.

        Returns:
            The updated API key.
        """
        api_key = self.get_api_key(
            service_account_name_id_or_prefix=service_account_name_id_or_prefix,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
        )
        rotate_request = APIKeyRotateRequest(
            retain_period_minutes=retain_period_minutes
        )
        new_key = self.zen_store.rotate_api_key(
            service_account_id=api_key.service_account.id,
            api_key_name_or_id=api_key.id,
            rotate_request=rotate_request,
        )
        assert new_key.key is not None
        if set_key:
            self.set_api_key(key=new_key.key)

        return new_key

    def delete_api_key(
        self,
        service_account_name_id_or_prefix: Union[str, UUID],
        name_id_or_prefix: Union[str, UUID],
    ) -> None:
        """Delete an API key.

        Args:
            service_account_name_id_or_prefix: The name, ID or prefix of the
                service account to delete the API key for.
            name_id_or_prefix: The name, ID or prefix of the API key.
        """
        api_key = self.get_api_key(
            service_account_name_id_or_prefix=service_account_name_id_or_prefix,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=False,
        )
        self.zen_store.delete_api_key(
            service_account_id=api_key.service_account.id,
            api_key_name_or_id=api_key.id,
        )

    #############################################
    # Tags
    #
    # Note: tag<>resource are not exposed and
    # can be accessed via relevant resources
    #############################################

    def create_tag(self, tag: TagRequest) -> TagResponse:
        """Creates a new tag.

        Args:
            tag: the Tag to be created.

        Returns:
            The newly created tag.
        """
        return self.zen_store.create_tag(tag=tag)

    def delete_tag(self, tag_name_or_id: Union[str, UUID]) -> None:
        """Deletes a tag.

        Args:
            tag_name_or_id: name or id of the tag to be deleted.
        """
        self.zen_store.delete_tag(tag_name_or_id=tag_name_or_id)

    def update_tag(
        self,
        tag_name_or_id: Union[str, UUID],
        tag_update_model: TagUpdate,
    ) -> TagResponse:
        """Updates an existing tag.

        Args:
            tag_name_or_id: name or UUID of the tag to be updated.
            tag_update_model: the update to be applied to the tag.

        Returns:
            The updated tag.
        """
        return self.zen_store.update_tag(
            tag_name_or_id=tag_name_or_id, tag_update_model=tag_update_model
        )

    def get_tag(
        self, tag_name_or_id: Union[str, UUID], hydrate: bool = True
    ) -> TagResponse:
        """Get an existing tag.

        Args:
            tag_name_or_id: name or id of the tag to be retrieved.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            The tag of interest.
        """
        return self.zen_store.get_tag(
            tag_name_or_id=tag_name_or_id, hydrate=hydrate
        )

    def list_tags(
        self,
        tag_filter_model: TagFilter,
        hydrate: bool = False,
    ) -> Page[TagResponse]:
        """Get tags by filter.

        Args:
            tag_filter_model: All filter parameters including pagination
                params.
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.

        Returns:
            A page of all tags.
        """
        return self.zen_store.list_tags(
            tag_filter_model=tag_filter_model, hydrate=hydrate
        )
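
As a quick illustration of the tag methods above, the following sketch creates, inspects and removes a tag through the client. It assumes that TagRequest is importable from zenml.models and that a name is enough to construct it; adjust the request fields to your ZenML version.

    from zenml.client import Client
    from zenml.models import TagRequest  # assumed import path for the request model

    client = Client()

    # Create a tag named "production" (TagRequest fields are assumed here).
    tag = client.create_tag(tag=TagRequest(name="production"))

    # Fetch it again by name and print its ID.
    fetched = client.get_tag("production")
    print(fetched.id)

    # Clean up by deleting the tag by name.
    client.delete_tag("production")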

active_stack property

The active stack for this client.

Returns:

• Stack: The active stack for this client.

active_stack_model property

The model of the active stack for this client.

If no active stack is configured locally for the client, the active stack in the global configuration is used instead.

Returns:

• StackResponse: The model of the active stack for this client.

Raises:

• RuntimeError: If the active stack is not set.

active_user property

Get the user that is currently in use.

Returns:

• UserResponse: The active user.

active_workspace property

Get the currently active workspace of the local client.

If no active workspace is configured locally for the client, the active workspace in the global configuration is used instead.

Returns:

• WorkspaceResponse: The active workspace.

Raises:

• RuntimeError: If the active workspace is not set.

config_directory property

The configuration directory of this client.

Returns:

• Optional[Path]: The configuration directory of this client, or None if the client doesn't have an active root.

root property

The root directory of this client.

Returns:

• Optional[Path]: The root directory of this client, or None if the client has not been initialized.

uses_local_configuration property

Check if the client is using a local configuration.

Returns:

• bool: True if the client is using a local configuration, False otherwise.

zen_store property

Shortcut to return the global zen store.

Returns:

• BaseZenStore: The global zen store.

__init__(root=None)

Initializes the global client instance.

Client is a singleton class: only one instance can exist. Calling this constructor multiple times will always yield the same instance (see the exception below).

The root argument is only meant for internal use and testing purposes. User code must never pass it to the constructor. When a custom root value is passed, an anonymous Client instance is created and returned independently of the Client singleton; it has no effect as far as the rest of the ZenML core code is concerned.

To change the active root in the global Client, call Client().activate_root(<new-root>) rather than creating a new Client instance to reflect a different repository root.

Parameters:

• root (Optional[Path], default None): (internal use) custom root directory for the client. If no path is given, the repository root is determined using the environment variable ZENML_REPOSITORY_PATH (if set) and by recursively searching in the parent directories of the current working directory. Only used to initialize new clients internally.
Source code in src/zenml/client.py
def __init__(
    self,
    root: Optional[Path] = None,
) -> None:
    """Initializes the global client instance.

    Client is a singleton class: only one instance can exist. Calling
    this constructor multiple times will always yield the same instance (see
    the exception below).

    The `root` argument is only meant for internal use and testing purposes.
    User code must never pass it to the constructor.
    When a custom `root` value is passed, an anonymous Client instance
    is created and returned independently of the Client singleton and
    that will have no effect as far as the rest of the ZenML core code is
    concerned.

    Instead of creating a new Client instance to reflect a different
    repository root, to change the active root in the global Client,
    call `Client().activate_root(<new-root>)`.

    Args:
        root: (internal use) custom root directory for the client. If
            no path is given, the repository root is determined using the
            environment variable `ZENML_REPOSITORY_PATH` (if set) and by
            recursively searching in the parent directories of the
            current working directory. Only used to initialize new
            clients internally.
    """
    self._root: Optional[Path] = None
    self._config: Optional[ClientConfiguration] = None

    self._set_active_root(root)
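
A minimal sketch of how the singleton behaves in practice; both calls below return the same underlying instance, and the root-related properties documented above can be inspected on it.

    from zenml.client import Client

    # Both variables point to the same singleton instance.
    client_a = Client()
    client_b = Client()
    assert client_a is client_b

    # Inspect the repository root and configuration directory, which may be
    # None if `zenml init` has not been run in or above the current directory.
    print(client_a.root)
    print(client_a.config_directory)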

activate_root(root=None)

Set the active repository root directory.

Parameters:

• root (Optional[Path], default None): The path to set as the active repository root. If not set, the repository root is determined using the environment variable ZENML_REPOSITORY_PATH (if set) and by recursively searching in the parent directories of the current working directory.
Source code in src/zenml/client.py
def activate_root(self, root: Optional[Path] = None) -> None:
    """Set the active repository root directory.

    Args:
        root: The path to set as the active repository root. If not set,
            the repository root is determined using the environment
            variable `ZENML_REPOSITORY_PATH` (if set) and by recursively
            searching in the parent directories of the current working
            directory.
    """
    self._set_active_root(root)

activate_stack(stack_name_id_or_prefix)

Sets the stack as active.

Parameters:

• stack_name_id_or_prefix (Union[str, UUID], required): The name, ID or prefix of the stack to activate.

Raises:

• KeyError: If the stack is not registered.

Source code in src/zenml/client.py
def activate_stack(
    self, stack_name_id_or_prefix: Union[str, UUID]
) -> None:
    """Sets the stack as active.

    Args:
        stack_name_id_or_prefix: The name, ID or prefix of the stack to activate.

    Raises:
        KeyError: If the stack is not registered.
    """
    # Make sure the stack is registered
    try:
        stack = self.get_stack(name_id_or_prefix=stack_name_id_or_prefix)
    except KeyError as e:
        raise KeyError(
            f"Stack '{stack_name_id_or_prefix}' cannot be activated since "
            f"it is not registered yet. Please register it first."
        ) from e

    if self._config:
        self._config.set_active_stack(stack=stack)

    else:
        # set the active stack globally only if the client doesn't use
        # a local configuration
        GlobalConfiguration().set_active_stack(stack=stack)
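
For example, activating a registered stack by name (here the built-in default stack) could look like this; passing an unregistered name raises the KeyError described above.

    from zenml.client import Client

    client = Client()

    # Activate the stack named "default", which ZenML registers out of the box.
    client.activate_stack("default")
    print(client.active_stack_model.name)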

backup_secrets(ignore_errors=True, delete_secrets=False)

Backs up all secrets to the configured backup secrets store.

Parameters:

• ignore_errors (bool, default True): Whether to ignore individual errors during the backup process and attempt to back up all secrets.
• delete_secrets (bool, default False): Whether to delete the secrets that have been successfully backed up from the primary secrets store. Setting this flag effectively moves all secrets from the primary secrets store to the backup secrets store.
Source code in src/zenml/client.py
def backup_secrets(
    self,
    ignore_errors: bool = True,
    delete_secrets: bool = False,
) -> None:
    """Backs up all secrets to the configured backup secrets store.

    Args:
        ignore_errors: Whether to ignore individual errors during the backup
            process and attempt to backup all secrets.
        delete_secrets: Whether to delete the secrets that have been
            successfully backed up from the primary secrets store. Setting
            this flag effectively moves all secrets from the primary secrets
            store to the backup secrets store.
    """
    self.zen_store.backup_secrets(
        ignore_errors=ignore_errors, delete_secrets=delete_secrets
    )
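
As a sketch, a one-off backup that keeps the originals in the primary store might look like this; it assumes a backup secrets store is configured on the server, otherwise the call will fail.

    from zenml.client import Client

    # Copy every secret to the configured backup secrets store, logging and
    # skipping individual failures instead of aborting the whole backup.
    Client().backup_secrets(ignore_errors=True, delete_secrets=False)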

create_action(name, flavor, action_type, configuration, service_account_id, auth_window=None, description='')

Create an action.

Parameters:

• name (str, required): The name of the action.
• flavor (str, required): The flavor of the action.
• action_type (PluginSubType, required): The action subtype.
• configuration (Dict[str, Any], required): The action configuration.
• service_account_id (UUID, required): The service account that is used to execute the action.
• auth_window (Optional[int], default None): The time window in minutes for which the service account is authorized to execute the action. Set this to 0 to authorize the service account indefinitely (not recommended).
• description (str, default ''): The description of the action.

Returns:

• ActionResponse: The created action.

Source code in src/zenml/client.py
@_fail_for_sql_zen_store
def create_action(
    self,
    name: str,
    flavor: str,
    action_type: PluginSubType,
    configuration: Dict[str, Any],
    service_account_id: UUID,
    auth_window: Optional[int] = None,
    description: str = "",
) -> ActionResponse:
    """Create an action.

    Args:
        name: The name of the action.
        flavor: The flavor of the action.
        action_type: The action subtype.
        configuration: The action configuration.
        service_account_id: The service account that is used to execute the
            action.
        auth_window: The time window in minutes for which the service
            account is authorized to execute the action. Set this to 0 to
            authorize the service account indefinitely (not recommended).
        description: The description of the action.

    Returns:
        The created action
    """
    action = ActionRequest(
        name=name,
        description=description,
        flavor=flavor,
        plugin_subtype=action_type,
        configuration=configuration,
        service_account_id=service_account_id,
        auth_window=auth_window,
        user=self.active_user.id,
        workspace=self.active_workspace.id,
    )

    return self.zen_store.create_action(action=action)
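
A hedged sketch of wiring an action to a service account; the flavor name, configuration keys, subtype and template ID below are placeholders rather than a documented schema, so substitute the values your action plugin actually expects.

    from zenml.client import Client
    from zenml.enums import PluginSubType

    client = Client()
    runner = client.get_service_account("pipeline-runner")  # assumed to exist already

    action = client.create_action(
        name="retrain-model",
        flavor="builtin",  # placeholder flavor name
        action_type=PluginSubType.PIPELINE_RUN,  # assumed subtype for run-template actions
        configuration={"template_id": "00000000-0000-0000-0000-000000000000"},  # placeholder config
        service_account_id=runner.id,
        auth_window=60,
        description="Kick off retraining when triggered.",
    )
    print(action.id)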

create_api_key(service_account_name_id_or_prefix, name, description='', set_key=False)

Create a new API key and optionally set it as the active API key.

Parameters:

• service_account_name_id_or_prefix (Union[str, UUID], required): The name, ID or prefix of the service account to create the API key for.
• name (str, required): Name of the API key.
• description (str, default ''): The description of the API key.
• set_key (bool, default False): Whether to set the created API key as the active API key.

Returns:

• APIKeyResponse: The created API key.

Source code in src/zenml/client.py
def create_api_key(
    self,
    service_account_name_id_or_prefix: Union[str, UUID],
    name: str,
    description: str = "",
    set_key: bool = False,
) -> APIKeyResponse:
    """Create a new API key and optionally set it as the active API key.

    Args:
        service_account_name_id_or_prefix: The name, ID or prefix of the
            service account to create the API key for.
        name: Name of the API key.
        description: The description of the API key.
        set_key: Whether to set the created API key as the active API key.

    Returns:
        The created API key.
    """
    service_account = self.get_service_account(
        name_id_or_prefix=service_account_name_id_or_prefix,
        allow_name_prefix_match=False,
    )
    request = APIKeyRequest(
        name=name,
        description=description,
    )
    api_key = self.zen_store.create_api_key(
        service_account_id=service_account.id, api_key=request
    )
    assert api_key.key is not None

    if set_key:
        self.set_api_key(key=api_key.key)

    return api_key
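
To tie the service account and API key sections together, a sketch for provisioning a service account and a key for it might look like the following; the names are arbitrary examples.

    from zenml.client import Client

    client = Client()

    # Create a service account for CI runs and attach a fresh API key to it.
    account = client.create_service_account(
        name="ci-runner",
        description="Used by the CI pipeline to talk to the ZenML server.",
    )
    api_key = client.create_api_key(
        service_account_name_id_or_prefix=account.id,
        name="ci-key",
        description="Key used by CI jobs.",
        set_key=False,  # set to True to start using the key in this client right away
    )

    # The plain-text key is only returned at creation time, so store it securely.
    print(api_key.key)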

create_code_repository(name, config, source, description=None, logo_url=None)

Create a new code repository.

Parameters:

• name (str, required): Name of the code repository.
• config (Dict[str, Any], required): The configuration for the code repository.
• source (Source, required): The code repository implementation source.
• description (Optional[str], default None): The code repository description.
• logo_url (Optional[str], default None): URL of a logo (png, jpg or svg) for the code repository.

Returns:

• CodeRepositoryResponse: The created code repository.

Source code in src/zenml/client.py
def create_code_repository(
    self,
    name: str,
    config: Dict[str, Any],
    source: Source,
    description: Optional[str] = None,
    logo_url: Optional[str] = None,
) -> CodeRepositoryResponse:
    """Create a new code repository.

    Args:
        name: Name of the code repository.
        config: The configuration for the code repository.
        source: The code repository implementation source.
        description: The code repository description.
        logo_url: URL of a logo (png, jpg or svg) for the code repository.

    Returns:
        The created code repository.
    """
    self._validate_code_repository_config(source=source, config=config)
    repo_request = CodeRepositoryRequest(
        user=self.active_user.id,
        workspace=self.active_workspace.id,
        name=name,
        config=config,
        source=source,
        description=description,
        logo_url=logo_url,
    )
    return self.zen_store.create_code_repository(
        code_repository=repo_request
    )

create_event_source(name, configuration, flavor, event_source_subtype, description='')

Registers an event source.

Parameters:

• name (str, required): The name of the event source to create.
• configuration (Dict[str, Any], required): Configuration for this event source.
• flavor (str, required): The flavor of event source.
• event_source_subtype (PluginSubType, required): The event source subtype.
• description (str, default ''): The description of the event source.

Returns:

• EventSourceResponse: The model of the registered event source.

Source code in src/zenml/client.py
@_fail_for_sql_zen_store
def create_event_source(
    self,
    name: str,
    configuration: Dict[str, Any],
    flavor: str,
    event_source_subtype: PluginSubType,
    description: str = "",
) -> EventSourceResponse:
    """Registers an event source.

    Args:
        name: The name of the event source to create.
        configuration: Configuration for this event source.
        flavor: The flavor of event source.
        event_source_subtype: The event source subtype.
        description: The description of the event source.

    Returns:
        The model of the registered event source.
    """
    event_source = EventSourceRequest(
        name=name,
        configuration=configuration,
        description=description,
        flavor=flavor,
        plugin_type=PluginType.EVENT_SOURCE,
        plugin_subtype=event_source_subtype,
        user=self.active_user.id,
        workspace=self.active_workspace.id,
    )

    return self.zen_store.create_event_source(event_source=event_source)

create_flavor(source, component_type)

Creates a new flavor.

Parameters:

• source (str, required): The flavor to create.
• component_type (StackComponentType, required): The type of the flavor.

Returns:

• FlavorResponse: The created flavor (in model form).

Raises:

• ValueError: In case the config_schema of the flavor is too large.

Source code in src/zenml/client.py
def create_flavor(
    self,
    source: str,
    component_type: StackComponentType,
) -> FlavorResponse:
    """Creates a new flavor.

    Args:
        source: The flavor to create.
        component_type: The type of the flavor.

    Returns:
        The created flavor (in model form).

    Raises:
        ValueError: in case the config_schema of the flavor is too large.
    """
    from zenml.stack.flavor import validate_flavor_source

    flavor = validate_flavor_source(
        source=source, component_type=component_type
    )()

    if len(flavor.config_schema) > TEXT_FIELD_MAX_LENGTH:
        raise ValueError(
            "Json representation of configuration schema"
            "exceeds max length. This could be caused by an"
            "overly long docstring on the flavors "
            "configuration class' docstring."
        )

    flavor_request = flavor.to_model(integration="custom", is_custom=True)
    return self.zen_store.create_flavor(flavor=flavor_request)
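
As an illustration, registering a custom flavor from its source path might look like this; the module path is a hypothetical example and must point at a flavor class importable in your environment.

    from zenml.client import Client
    from zenml.enums import StackComponentType

    # "my_package.flavors.MyOrchestratorFlavor" is a placeholder source path.
    flavor = Client().create_flavor(
        source="my_package.flavors.MyOrchestratorFlavor",
        component_type=StackComponentType.ORCHESTRATOR,
    )
    print(flavor.name)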

create_model(name, license=None, description=None, audience=None, use_cases=None, limitations=None, trade_offs=None, ethics=None, tags=None, save_models_to_registry=True)

Creates a new model in Model Control Plane.

Parameters:

• name (str, required): The name of the model.
• license (Optional[str], default None): The license under which the model is created.
• description (Optional[str], default None): The description of the model.
• audience (Optional[str], default None): The target audience of the model.
• use_cases (Optional[str], default None): The use cases of the model.
• limitations (Optional[str], default None): The known limitations of the model.
• trade_offs (Optional[str], default None): The tradeoffs of the model.
• ethics (Optional[str], default None): The ethical implications of the model.
• tags (Optional[List[str]], default None): Tags associated with the model.
• save_models_to_registry (bool, default True): Whether to save the model to the registry.

Returns:

• ModelResponse: The newly created model.

Source code in src/zenml/client.py
def create_model(
    self,
    name: str,
    license: Optional[str] = None,
    description: Optional[str] = None,
    audience: Optional[str] = None,
    use_cases: Optional[str] = None,
    limitations: Optional[str] = None,
    trade_offs: Optional[str] = None,
    ethics: Optional[str] = None,
    tags: Optional[List[str]] = None,
    save_models_to_registry: bool = True,
) -> ModelResponse:
    """Creates a new model in Model Control Plane.

    Args:
        name: The name of the model.
        license: The license under which the model is created.
        description: The description of the model.
        audience: The target audience of the model.
        use_cases: The use cases of the model.
        limitations: The known limitations of the model.
        trade_offs: The tradeoffs of the model.
        ethics: The ethical implications of the model.
        tags: Tags associated with the model.
        save_models_to_registry: Whether to save the model to the
            registry.

    Returns:
        The newly created model.
    """
    return self.zen_store.create_model(
        model=ModelRequest(
            name=name,
            license=license,
            description=description,
            audience=audience,
            use_cases=use_cases,
            limitations=limitations,
            trade_offs=trade_offs,
            ethics=ethics,
            tags=tags,
            user=self.active_user.id,
            workspace=self.active_workspace.id,
            save_models_to_registry=save_models_to_registry,
        )
    )
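
For instance, registering a model with a few descriptive fields and tags could look like this sketch.

    from zenml.client import Client

    model = Client().create_model(
        name="churn-classifier",
        license="Apache-2.0",
        description="Predicts customer churn from usage data.",
        tags=["classification", "tabular"],
    )
    print(model.id)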

create_model_version(model_name_or_id, name=None, description=None, tags=None)

Creates a new model version in Model Control Plane.

Parameters:

• model_name_or_id (Union[str, UUID], required): the name or id of the model to create the model version in.
• name (Optional[str], default None): the name of the Model Version to be created.
• description (Optional[str], default None): the description of the Model Version to be created.
• tags (Optional[List[str]], default None): Tags associated with the model version.

Returns:

• ModelVersionResponse: The newly created model version.

Source code in src/zenml/client.py
def create_model_version(
    self,
    model_name_or_id: Union[str, UUID],
    name: Optional[str] = None,
    description: Optional[str] = None,
    tags: Optional[List[str]] = None,
) -> ModelVersionResponse:
    """Creates a new model version in Model Control Plane.

    Args:
        model_name_or_id: the name or id of the model to create model
            version in.
        name: the name of the Model Version to be created.
        description: the description of the Model Version to be created.
        tags: Tags associated with the model version.

    Returns:
        The newly created model version.
    """
    if not is_valid_uuid(model_name_or_id):
        model_name_or_id = self.get_model(model_name_or_id).id
    return self.zen_store.create_model_version(
        model_version=ModelVersionRequest(
            name=name,
            description=description,
            user=self.active_user.id,
            workspace=self.active_workspace.id,
            model=model_name_or_id,
            tags=tags,
        )
    )
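
Building on the previous example, a new version of that model could then be created like this; the version name and tags are arbitrary examples.

    from zenml.client import Client

    version = Client().create_model_version(
        model_name_or_id="churn-classifier",
        name="2024-q3",
        description="Retrained on Q3 data.",
        tags=["candidate"],
    )
    print(version.name)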

create_run_metadata(metadata, resources, stack_component_id=None, publisher_step_id=None)

Create run metadata.

Parameters:

• metadata (Dict[str, MetadataType], required): The metadata to create as a dictionary of key-value pairs.
• resources (List[RunMetadataResource], required): The list of IDs and types of the resources for which the metadata was produced.
• stack_component_id (Optional[UUID], default None): The ID of the stack component that produced the metadata.
• publisher_step_id (Optional[UUID], default None): The ID of the step execution that publishes this metadata automatically.
Source code in src/zenml/client.py
def create_run_metadata(
    self,
    metadata: Dict[str, "MetadataType"],
    resources: List[RunMetadataResource],
    stack_component_id: Optional[UUID] = None,
    publisher_step_id: Optional[UUID] = None,
) -> None:
    """Create run metadata.

    Args:
        metadata: The metadata to create as a dictionary of key-value pairs.
        resources: The list of IDs and types of the resources for which the
            metadata was produced.
        stack_component_id: The ID of the stack component that produced
            the metadata.
        publisher_step_id: The ID of the step execution that publishes
            this metadata automatically.
    """
    from zenml.metadata.metadata_types import get_metadata_type

    values: Dict[str, "MetadataType"] = {}
    types: Dict[str, "MetadataTypeEnum"] = {}
    for key, value in metadata.items():
        # Skip metadata that is too large to be stored in the database.
        if len(json.dumps(value)) > TEXT_FIELD_MAX_LENGTH:
            logger.warning(
                f"Metadata value for key '{key}' is too large to be "
                "stored in the database. Skipping."
            )
            continue
        # Skip metadata that is not of a supported type.
        try:
            metadata_type = get_metadata_type(value)
        except ValueError as e:
            logger.warning(
                f"Metadata value for key '{key}' is not of a supported "
                f"type. Skipping. Full error: {e}"
            )
            continue
        values[key] = value
        types[key] = metadata_type

    run_metadata = RunMetadataRequest(
        workspace=self.active_workspace.id,
        user=self.active_user.id,
        resources=resources,
        stack_component_id=stack_component_id,
        publisher_step_id=publisher_step_id,
        values=values,
        types=types,
    )
    self.zen_store.create_run_metadata(run_metadata)
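
A rough sketch of attaching metadata to a pipeline run. The import locations of RunMetadataResource and MetadataResourceTypes, and the fields used on RunMetadataResource, are assumptions and may differ in your ZenML version:

    from uuid import UUID

    from zenml.client import Client
    from zenml.enums import MetadataResourceTypes  # assumed import location
    from zenml.models import RunMetadataResource   # assumed import location

    client = Client()
    client.create_run_metadata(
        metadata={"accuracy": 0.93, "data_snapshot": "2024-01-01"},
        resources=[
            # Assumed fields: the resource is identified by its UUID and type.
            RunMetadataResource(
                id=UUID("12345678-1234-5678-1234-567812345678"),  # hypothetical run ID
                type=MetadataResourceTypes.PIPELINE_RUN,
            )
        ],
    )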

create_run_template(name, deployment_id, description=None, tags=None)

Create a run template.

Parameters:

Name Type Description Default
name str

The name of the run template.

required
deployment_id UUID

ID of the deployment which this template should be based off of.

required
description Optional[str]

The description of the run template.

None
tags Optional[List[str]]

Tags associated with the run template.

None

Returns:

Type Description
RunTemplateResponse

The created run template.

Source code in src/zenml/client.py
def create_run_template(
    self,
    name: str,
    deployment_id: UUID,
    description: Optional[str] = None,
    tags: Optional[List[str]] = None,
) -> RunTemplateResponse:
    """Create a run template.

    Args:
        name: The name of the run template.
        deployment_id: ID of the deployment which this template should be
            based off of.
        description: The description of the run template.
        tags: Tags associated with the run template.

    Returns:
        The created run template.
    """
    return self.zen_store.create_run_template(
        template=RunTemplateRequest(
            name=name,
            description=description,
            source_deployment_id=deployment_id,
            tags=tags,
            user=self.active_user.id,
            workspace=self.active_workspace.id,
        )
    )
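
A short sketch, assuming you already have the UUID of an existing pipeline deployment at hand (the UUID below is a placeholder):

    from uuid import UUID

    from zenml.client import Client

    client = Client()
    template = client.create_run_template(
        name="nightly-training",  # illustrative template name
        deployment_id=UUID("12345678-1234-5678-1234-567812345678"),  # existing deployment
        description="Template for the nightly training run",
        tags=["scheduled"],
    )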

create_secret(name, values, scope=SecretScope.WORKSPACE)

Creates a new secret.

Parameters:

Name Type Description Default
name str

The name of the secret.

required
values Dict[str, str]

The values of the secret.

required
scope SecretScope

The scope of the secret.

WORKSPACE

Returns:

Type Description
SecretResponse

The created secret (in model form).

Raises:

Type Description
NotImplementedError

If centralized secrets management is not enabled.

Source code in src/zenml/client.py
def create_secret(
    self,
    name: str,
    values: Dict[str, str],
    scope: SecretScope = SecretScope.WORKSPACE,
) -> SecretResponse:
    """Creates a new secret.

    Args:
        name: The name of the secret.
        values: The values of the secret.
        scope: The scope of the secret.

    Returns:
        The created secret (in model form).

    Raises:
        NotImplementedError: If centralized secrets management is not
            enabled.
    """
    create_secret_request = SecretRequest(
        name=name,
        values=values,
        scope=scope,
        user=self.active_user.id,
        workspace=self.active_workspace.id,
    )
    try:
        return self.zen_store.create_secret(secret=create_secret_request)
    except NotImplementedError:
        raise NotImplementedError(
            "centralized secrets management is not supported or explicitly "
            "disabled in the target ZenML deployment."
        )
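
For example, a minimal sketch of creating a workspace-scoped secret; the key/value pairs are illustrative and the SecretScope import location is an assumption:

    from zenml.client import Client
    from zenml.enums import SecretScope  # assumed import location

    client = Client()
    secret = client.create_secret(
        name="postgres_credentials",  # illustrative secret name
        values={"username": "admin", "password": "supersecret"},
        scope=SecretScope.WORKSPACE,  # the default scope
    )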

create_service(config, service_type, model_version_id=None)

Registers a service.

Parameters:

Name Type Description Default
config ServiceConfig

The configuration of the service.

required
service_type ServiceType

The type of the service.

required
model_version_id Optional[UUID]

The ID of the model version to associate with the service.

None

Returns:

Type Description
ServiceResponse

The registered service.

Source code in src/zenml/client.py
def create_service(
    self,
    config: ServiceConfig,
    service_type: ServiceType,
    model_version_id: Optional[UUID] = None,
) -> ServiceResponse:
    """Registers a service.

    Args:
        config: The configuration of the service.
        service_type: The type of the service.
        model_version_id: The ID of the model version to associate with the
            service.

    Returns:
        The registered service.
    """
    service_request = ServiceRequest(
        name=config.service_name,
        service_type=service_type,
        config=config.model_dump(),
        workspace=self.active_workspace.id,
        user=self.active_user.id,
        model_version_id=model_version_id,
    )
    # Register the service
    return self.zen_store.create_service(service_request)

create_service_account(name, description='')

Create a new service account.

Parameters:

Name Type Description Default
name str

The name of the service account.

required
description str

The description of the service account.

''

Returns:

Type Description
ServiceAccountResponse

The created service account.

Source code in src/zenml/client.py
def create_service_account(
    self,
    name: str,
    description: str = "",
) -> ServiceAccountResponse:
    """Create a new service account.

    Args:
        name: The name of the service account.
        description: The description of the service account.

    Returns:
        The created service account.
    """
    service_account = ServiceAccountRequest(
        name=name, description=description, active=True
    )
    created_service_account = self.zen_store.create_service_account(
        service_account=service_account
    )

    return created_service_account
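
A minimal sketch of creating a service account, for example for a CI system:

    from zenml.client import Client

    client = Client()
    service_account = client.create_service_account(
        name="ci-runner",  # illustrative name
        description="Used by the CI system to trigger pipelines",
    )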

create_service_connector(name, connector_type, resource_type=None, auth_method=None, configuration=None, resource_id=None, description='', expiration_seconds=None, expires_at=None, expires_skew_tolerance=None, labels=None, auto_configure=False, verify=True, list_resources=True, register=True)

Create, validate and/or register a service connector.

Parameters:

Name Type Description Default
name str

The name of the service connector.

required
connector_type str

The service connector type.

required
auth_method Optional[str]

The authentication method of the service connector. May be omitted if auto-configuration is used.

None
resource_type Optional[str]

The resource type for the service connector.

None
configuration Optional[Dict[str, str]]

The configuration of the service connector.

None
resource_id Optional[str]

The resource id of the service connector.

None
description str

The description of the service connector.

''
expiration_seconds Optional[int]

The expiration time of the service connector.

None
expires_at Optional[datetime]

The expiration time of the service connector.

None
expires_skew_tolerance Optional[int]

The allowed expiration skew for the service connector credentials.

None
labels Optional[Dict[str, str]]

The labels of the service connector.

None
auto_configure bool

Whether to automatically configure the service connector from the local environment.

False
verify bool

Whether to verify that the service connector configuration and credentials can be used to gain access to the resource.

True
list_resources bool

Whether to also list the resources that the service connector can give access to (if verify is True).

True
register bool

Whether to register the service connector or not.

True

Returns:

Type Description
Optional[Union[ServiceConnectorResponse, ServiceConnectorRequest]]
Optional[ServiceConnectorResourcesModel]

The model of the registered service connector and the resources that the service connector can give access to (if verify is True).

Raises:

Type Description
ValueError

If the arguments are invalid.

KeyError

If the service connector type is not found.

NotImplementedError

If auto-configuration is not supported or not implemented for the service connector type.

AuthorizationException

If the connector verification failed due to authorization issues.

Source code in src/zenml/client.py
def create_service_connector(
    self,
    name: str,
    connector_type: str,
    resource_type: Optional[str] = None,
    auth_method: Optional[str] = None,
    configuration: Optional[Dict[str, str]] = None,
    resource_id: Optional[str] = None,
    description: str = "",
    expiration_seconds: Optional[int] = None,
    expires_at: Optional[datetime] = None,
    expires_skew_tolerance: Optional[int] = None,
    labels: Optional[Dict[str, str]] = None,
    auto_configure: bool = False,
    verify: bool = True,
    list_resources: bool = True,
    register: bool = True,
) -> Tuple[
    Optional[
        Union[
            ServiceConnectorResponse,
            ServiceConnectorRequest,
        ]
    ],
    Optional[ServiceConnectorResourcesModel],
]:
    """Create, validate and/or register a service connector.

    Args:
        name: The name of the service connector.
        connector_type: The service connector type.
        auth_method: The authentication method of the service connector.
            May be omitted if auto-configuration is used.
        resource_type: The resource type for the service connector.
        configuration: The configuration of the service connector.
        resource_id: The resource id of the service connector.
        description: The description of the service connector.
        expiration_seconds: The expiration time of the service connector.
        expires_at: The expiration time of the service connector.
        expires_skew_tolerance: The allowed expiration skew for the service
            connector credentials.
        labels: The labels of the service connector.
        auto_configure: Whether to automatically configure the service
            connector from the local environment.
        verify: Whether to verify that the service connector configuration
            and credentials can be used to gain access to the resource.
        list_resources: Whether to also list the resources that the service
            connector can give access to (if verify is True).
        register: Whether to register the service connector or not.

    Returns:
        The model of the registered service connector and the resources
        that the service connector can give access to (if verify is True).

    Raises:
        ValueError: If the arguments are invalid.
        KeyError: If the service connector type is not found.
        NotImplementedError: If auto-configuration is not supported or
            not implemented for the service connector type.
        AuthorizationException: If the connector verification failed due
            to authorization issues.
    """
    from zenml.service_connectors.service_connector_registry import (
        service_connector_registry,
    )

    connector_instance: Optional[ServiceConnector] = None
    connector_resources: Optional[ServiceConnectorResourcesModel] = None

    # Get the service connector type class
    try:
        connector = self.zen_store.get_service_connector_type(
            connector_type=connector_type,
        )
    except KeyError:
        raise KeyError(
            f"Service connector type {connector_type} not found."
            "Please check that you have installed all required "
            "Python packages and ZenML integrations and try again."
        )

    if not resource_type and len(connector.resource_types) == 1:
        resource_type = connector.resource_types[0].resource_type

    # If auto_configure is set, we will try to automatically configure the
    # service connector from the local environment
    if auto_configure:
        if not connector.supports_auto_configuration:
            raise NotImplementedError(
                f"The {connector.name} service connector type "
                "does not support auto-configuration."
            )
        if not connector.local:
            raise NotImplementedError(
                f"The {connector.name} service connector type "
                "implementation is not available locally. Please "
                "check that you have installed all required Python "
                "packages and ZenML integrations and try again, or "
                "skip auto-configuration."
            )

        assert connector.connector_class is not None

        connector_instance = connector.connector_class.auto_configure(
            resource_type=resource_type,
            auth_method=auth_method,
            resource_id=resource_id,
        )
        assert connector_instance is not None
        connector_request = connector_instance.to_model(
            name=name,
            user=self.active_user.id,
            workspace=self.active_workspace.id,
            description=description or "",
            labels=labels,
        )

        if verify:
            # Prefer to verify the connector config server-side if the
            # implementation is available there, because it ensures
            # that the connector can be shared with other users or used
            # from other machines and because some auth methods rely on the
            # server-side authentication environment
            if connector.remote:
                connector_resources = (
                    self.zen_store.verify_service_connector_config(
                        connector_request,
                        list_resources=list_resources,
                    )
                )
            else:
                connector_resources = connector_instance.verify(
                    list_resources=list_resources,
                )

            if connector_resources.error:
                # Raise an exception if the connector verification failed
                raise AuthorizationException(connector_resources.error)

    else:
        if not auth_method:
            if len(connector.auth_methods) == 1:
                auth_method = connector.auth_methods[0].auth_method
            else:
                raise ValueError(
                    f"Multiple authentication methods are available for "
                    f"the {connector.name} service connector type. Please "
                    f"specify one of the following: "
                    f"{list(connector.auth_method_dict.keys())}."
                )

        connector_request = ServiceConnectorRequest(
            name=name,
            connector_type=connector_type,
            description=description,
            auth_method=auth_method,
            expiration_seconds=expiration_seconds,
            expires_at=expires_at,
            expires_skew_tolerance=expires_skew_tolerance,
            user=self.active_user.id,
            workspace=self.active_workspace.id,
            labels=labels or {},
        )
        # Validate and configure the resources
        connector_request.validate_and_configure_resources(
            connector_type=connector,
            resource_types=resource_type,
            resource_id=resource_id,
            configuration=configuration,
        )
        if verify:
            # Prefer to verify the connector config server-side if the
            # implementation is available there, because it ensures
            # that the connector can be shared with other users or used
            # from other machines and because some auth methods rely on the
            # server-side authentication environment
            if connector.remote:
                connector_resources = (
                    self.zen_store.verify_service_connector_config(
                        connector_request,
                        list_resources=list_resources,
                    )
                )
            else:
                connector_instance = (
                    service_connector_registry.instantiate_connector(
                        model=connector_request
                    )
                )
                connector_resources = connector_instance.verify(
                    list_resources=list_resources,
                )

            if connector_resources.error:
                # Raise an exception if the connector verification failed
                raise AuthorizationException(connector_resources.error)

            # For resource types that don't support multi-instances, it's
            # better to save the default resource ID in the connector, if
            # available. Otherwise, we'll need to instantiate the connector
            # again to get the default resource ID.
            connector_request.resource_id = (
                connector_request.resource_id
                or connector_resources.get_default_resource_id()
            )

    if not register:
        return connector_request, connector_resources

    # Register the new model
    connector_response = self.zen_store.create_service_connector(
        service_connector=connector_request
    )

    if connector_resources:
        connector_resources.id = connector_response.id
        connector_resources.name = connector_response.name
        connector_resources.connector_type = (
            connector_response.connector_type
        )

    return connector_response, connector_resources
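
A hedged sketch of registering a connector with an explicit configuration. The connector type, auth method, resource type, and configuration keys below are illustrative assumptions and depend on the connector implementations installed in your environment:

    from zenml.client import Client

    client = Client()
    connector, resources = client.create_service_connector(
        name="my-cloud-connector",  # illustrative name
        connector_type="aws",       # assumed connector type
        auth_method="secret-key",   # assumed auth method for that type
        resource_type="s3-bucket",  # assumed resource type
        configuration={             # keys depend on the chosen auth method
            "aws_access_key_id": "...",
            "aws_secret_access_key": "...",
            "region": "eu-west-1",
        },
        verify=True,
    )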

create_stack(name, components, stack_spec_file=None, labels=None)

Registers a stack and its components.

Parameters:

Name Type Description Default
name str

The name of the stack to register.

required
components Mapping[StackComponentType, Union[str, UUID]]

dictionary which maps component types to component names

required
stack_spec_file Optional[str]

path to the stack spec file

None
labels Optional[Dict[str, Any]]

The labels of the stack.

None

Returns:

Type Description
StackResponse

The model of the registered stack.

Source code in src/zenml/client.py
def create_stack(
    self,
    name: str,
    components: Mapping[StackComponentType, Union[str, UUID]],
    stack_spec_file: Optional[str] = None,
    labels: Optional[Dict[str, Any]] = None,
) -> StackResponse:
    """Registers a stack and its components.

    Args:
        name: The name of the stack to register.
        components: dictionary which maps component types to component names
        stack_spec_file: path to the stack spec file
        labels: The labels of the stack.

    Returns:
        The model of the registered stack.
    """
    stack_components = {}

    for c_type, c_identifier in components.items():
        # Skip non-existent components.
        if not c_identifier:
            continue

        # Get the component.
        component = self.get_stack_component(
            name_id_or_prefix=c_identifier,
            component_type=c_type,
        )
        stack_components[c_type] = [component.id]

    stack = StackRequest(
        name=name,
        components=stack_components,
        stack_spec_path=stack_spec_file,
        workspace=self.active_workspace.id,
        user=self.active_user.id,
        labels=labels,
    )

    self._validate_stack_configuration(stack=stack)

    return self.zen_store.create_stack(stack=stack)
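
For example, a sketch of assembling a stack from components that are assumed to already be registered under the given names:

    from zenml.client import Client
    from zenml.enums import StackComponentType

    client = Client()
    stack = client.create_stack(
        name="local-dev",
        components={
            StackComponentType.ORCHESTRATOR: "default",    # assumed existing component
            StackComponentType.ARTIFACT_STORE: "default",  # assumed existing component
        },
    )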

create_stack_component(name, flavor, component_type, configuration, labels=None)

Registers a stack component.

Parameters:

Name Type Description Default
name str

The name of the stack component.

required
flavor str

The flavor of the stack component.

required
component_type StackComponentType

The type of the stack component.

required
configuration Dict[str, str]

The configuration of the stack component.

required
labels Optional[Dict[str, Any]]

The labels of the stack component.

None

Returns:

Type Description
ComponentResponse

The model of the registered component.

Source code in src/zenml/client.py
def create_stack_component(
    self,
    name: str,
    flavor: str,
    component_type: StackComponentType,
    configuration: Dict[str, str],
    labels: Optional[Dict[str, Any]] = None,
) -> "ComponentResponse":
    """Registers a stack component.

    Args:
        name: The name of the stack component.
        flavor: The flavor of the stack component.
        component_type: The type of the stack component.
        configuration: The configuration of the stack component.
        labels: The labels of the stack component.

    Returns:
        The model of the registered component.
    """
    from zenml.stack.utils import (
        validate_stack_component_config,
        warn_if_config_server_mismatch,
    )

    validated_config = validate_stack_component_config(
        configuration_dict=configuration,
        flavor=flavor,
        component_type=component_type,
        # Always enforce validation of custom flavors
        validate_custom_flavors=True,
    )
    # Guaranteed to not be None by setting
    # `validate_custom_flavors=True` above
    assert validated_config is not None
    warn_if_config_server_mismatch(validated_config)

    create_component_model = ComponentRequest(
        name=name,
        type=component_type,
        flavor=flavor,
        configuration=configuration,
        user=self.active_user.id,
        workspace=self.active_workspace.id,
        labels=labels,
    )

    # Register the new model
    return self.zen_store.create_stack_component(
        component=create_component_model
    )
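
A sketch of registering an artifact store component; the "local" flavor and its "path" configuration key are assumptions and vary by flavor:

    from zenml.client import Client
    from zenml.enums import StackComponentType

    client = Client()
    component = client.create_stack_component(
        name="my_artifact_store",
        flavor="local",                                  # assumed flavor
        component_type=StackComponentType.ARTIFACT_STORE,
        configuration={"path": "/tmp/zenml-artifacts"},  # assumed config key for this flavor
    )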

create_tag(tag)

Creates a new tag.

Parameters:

Name Type Description Default
tag TagRequest

the Tag to be created.

required

Returns:

Type Description
TagResponse

The newly created tag.

Source code in src/zenml/client.py
def create_tag(self, tag: TagRequest) -> TagResponse:
    """Creates a new tag.

    Args:
        tag: the Tag to be created.

    Returns:
        The newly created tag.
    """
    return self.zen_store.create_tag(tag=tag)
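
A brief sketch; the TagRequest import location and its fields are assumptions, with at least a name field expected:

    from zenml.client import Client
    from zenml.models import TagRequest  # assumed import location

    client = Client()
    tag = client.create_tag(tag=TagRequest(name="production"))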

create_trigger(name, event_source_id, event_filter, action_id, description='')

Registers a trigger.

Parameters:

Name Type Description Default
name str

The name of the trigger to create.

required
event_source_id UUID

The ID of the event source.

required
event_filter Dict[str, Any]

The event filter

required
action_id UUID

The ID of the action that should be triggered.

required
description str

The description of the trigger

''

Returns:

Type Description
TriggerResponse

The created trigger.

Source code in src/zenml/client.py
@_fail_for_sql_zen_store
def create_trigger(
    self,
    name: str,
    event_source_id: UUID,
    event_filter: Dict[str, Any],
    action_id: UUID,
    description: str = "",
) -> TriggerResponse:
    """Registers a trigger.

    Args:
        name: The name of the trigger to create.
        event_source_id: The ID of the event source.
        event_filter: The event filter
        action_id: The ID of the action that should be triggered.
        description: The description of the trigger

    Returns:
        The created trigger.
    """
    trigger = TriggerRequest(
        name=name,
        description=description,
        event_source_id=event_source_id,
        event_filter=event_filter,
        action_id=action_id,
        user=self.active_user.id,
        workspace=self.active_workspace.id,
    )

    return self.zen_store.create_trigger(trigger=trigger)
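
A hedged sketch, assuming you already have the UUIDs of a registered event source and action; the event filter contents depend on the event source flavor and are illustrative here:

    from uuid import UUID

    from zenml.client import Client

    client = Client()
    trigger = client.create_trigger(
        name="on-new-data",
        event_source_id=UUID("11111111-1111-1111-1111-111111111111"),  # hypothetical
        event_filter={"event_type": "push"},                           # illustrative filter
        action_id=UUID("22222222-2222-2222-2222-222222222222"),        # hypothetical
        description="Run the training pipeline when new data arrives",
    )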

create_user(name, password=None, is_admin=False)

Create a new user.

Parameters:

Name Type Description Default
name str

The name of the user.

required
password Optional[str]

The password of the user. If not provided, the user will be created with an empty password.

None
is_admin bool

Whether the user should be an admin.

False

Returns:

Type Description
UserResponse

The model of the created user.

Source code in src/zenml/client.py
def create_user(
    self,
    name: str,
    password: Optional[str] = None,
    is_admin: bool = False,
) -> UserResponse:
    """Create a new user.

    Args:
        name: The name of the user.
        password: The password of the user. If not provided, the user will
            be created with an empty password.
        is_admin: Whether the user should be an admin.

    Returns:
        The model of the created user.
    """
    user = UserRequest(
        name=name, password=password or None, is_admin=is_admin
    )
    user.active = (
        password != "" if self.zen_store.type != StoreType.REST else True
    )
    created_user = self.zen_store.create_user(user=user)

    return created_user
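
For example, a minimal sketch of creating a non-admin user:

    from zenml.client import Client

    client = Client()
    user = client.create_user(
        name="alice",
        password="change-me",  # omit to create the user with an empty password
        is_admin=False,
    )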

create_workspace(name, description)

Create a new workspace.

Parameters:

Name Type Description Default
name str

Name of the workspace.

required
description str

Description of the workspace.

required

Returns:

Type Description
WorkspaceResponse

The created workspace.

Source code in src/zenml/client.py
def create_workspace(
    self, name: str, description: str
) -> WorkspaceResponse:
    """Create a new workspace.

    Args:
        name: Name of the workspace.
        description: Description of the workspace.

    Returns:
        The created workspace.
    """
    return self.zen_store.create_workspace(
        WorkspaceRequest(name=name, description=description)
    )
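
A minimal sketch with illustrative values:

    from zenml.client import Client

    client = Client()
    workspace = client.create_workspace(
        name="experiments",
        description="Workspace for experimental pipelines",
    )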

deactivate_user(name_id_or_prefix)

Deactivate a user and generate an activation token.

Parameters:

Name Type Description Default
name_id_or_prefix str

The name or ID of the user to reset.

required

Returns:

Type Description
UserResponse

The deactivated user.

Source code in src/zenml/client.py
@_fail_for_sql_zen_store
def deactivate_user(self, name_id_or_prefix: str) -> "UserResponse":
    """Deactivate a user and generate an activation token.

    Args:
        name_id_or_prefix: The name or ID of the user to reset.

    Returns:
        The deactivated user.
    """
    from zenml.zen_stores.rest_zen_store import RestZenStore

    user = self.get_user(name_id_or_prefix, allow_name_prefix_match=False)
    assert isinstance(self.zen_store, RestZenStore)
    return self.zen_store.deactivate_user(user_name_or_id=user.name)

delete_action(name_id_or_prefix)

Delete an action.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, id or prefix id of the action to delete.

required
Source code in src/zenml/client.py
@_fail_for_sql_zen_store
def delete_action(self, name_id_or_prefix: Union[str, UUID]) -> None:
    """Delete an action.

    Args:
        name_id_or_prefix: The name, id or prefix id of the action
            to delete.
    """
    action = self.get_action(
        name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
    )

    self.zen_store.delete_action(action_id=action.id)
    logger.info("Deleted action with name '%s'.", action.name)

delete_all_model_version_artifact_links(model_version_id, only_links)

Delete all model version to artifact links in Model Control Plane.

Parameters:

Name Type Description Default
model_version_id UUID

The id of the model version holding the link.

required
only_links bool

If true, only delete the link to the artifact.

required
Source code in src/zenml/client.py
def delete_all_model_version_artifact_links(
    self, model_version_id: UUID, only_links: bool
) -> None:
    """Delete all model version to artifact links in Model Control Plane.

    Args:
        model_version_id: The id of the model version holding the link.
        only_links: If true, only delete the link to the artifact.
    """
    self.zen_store.delete_all_model_version_artifact_links(
        model_version_id, only_links
    )

delete_api_key(service_account_name_id_or_prefix, name_id_or_prefix)

Delete an API key.

Parameters:

Name Type Description Default
service_account_name_id_or_prefix Union[str, UUID]

The name, ID or prefix of the service account to delete the API key for.

required
name_id_or_prefix Union[str, UUID]

The name, ID or prefix of the API key.

required
Source code in src/zenml/client.py
def delete_api_key(
    self,
    service_account_name_id_or_prefix: Union[str, UUID],
    name_id_or_prefix: Union[str, UUID],
) -> None:
    """Delete an API key.

    Args:
        service_account_name_id_or_prefix: The name, ID or prefix of the
            service account to delete the API key for.
        name_id_or_prefix: The name, ID or prefix of the API key.
    """
    api_key = self.get_api_key(
        service_account_name_id_or_prefix=service_account_name_id_or_prefix,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
    )
    self.zen_store.delete_api_key(
        service_account_id=api_key.service_account.id,
        api_key_name_or_id=api_key.id,
    )

delete_artifact(name_id_or_prefix)

Delete an artifact.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, ID or prefix of the artifact to delete.

required
Source code in src/zenml/client.py
def delete_artifact(
    self,
    name_id_or_prefix: Union[str, UUID],
) -> None:
    """Delete an artifact.

    Args:
        name_id_or_prefix: The name, ID or prefix of the artifact to delete.
    """
    artifact = self.get_artifact(name_id_or_prefix=name_id_or_prefix)
    self.zen_store.delete_artifact(artifact_id=artifact.id)
    logger.info(f"Deleted artifact '{artifact.name}'.")

delete_artifact_version(name_id_or_prefix, version=None, delete_metadata=True, delete_from_artifact_store=False)

Delete an artifact version.

By default, this will delete only the metadata of the artifact from the database, not the actual object stored in the artifact store.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The ID of the artifact version, or the name or prefix of the artifact to delete.

required
version Optional[str]

The version of the artifact to delete.

None
delete_metadata bool

If True, delete the metadata of the artifact version from the database.

True
delete_from_artifact_store bool

If True, delete the artifact object itself from the artifact store.

False
Source code in src/zenml/client.py
def delete_artifact_version(
    self,
    name_id_or_prefix: Union[str, UUID],
    version: Optional[str] = None,
    delete_metadata: bool = True,
    delete_from_artifact_store: bool = False,
) -> None:
    """Delete an artifact version.

    By default, this will delete only the metadata of the artifact from the
    database, not the actual object stored in the artifact store.

    Args:
        name_id_or_prefix: The ID of the artifact version, or the name or
            prefix of the artifact to delete.
        version: The version of the artifact to delete.
        delete_metadata: If True, delete the metadata of the artifact
            version from the database.
        delete_from_artifact_store: If True, delete the artifact object
            itself from the artifact store.
    """
    artifact_version = self.get_artifact_version(
        name_id_or_prefix=name_id_or_prefix, version=version
    )
    if delete_from_artifact_store:
        self._delete_artifact_from_artifact_store(
            artifact_version=artifact_version
        )
    if delete_metadata:
        self._delete_artifact_version(artifact_version=artifact_version)
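
For example, a sketch of deleting both the metadata and the underlying object of a specific artifact version; the artifact name and version are illustrative:

    from zenml.client import Client

    client = Client()
    client.delete_artifact_version(
        name_id_or_prefix="my_dataset",   # illustrative artifact name
        version="3",
        delete_metadata=True,
        delete_from_artifact_store=True,  # also remove the stored object
    )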

delete_authorized_device(id_or_prefix)

Delete an authorized device.

Parameters:

Name Type Description Default
id_or_prefix Union[str, UUID]

The ID or ID prefix of the authorized device.

required
Source code in src/zenml/client.py
def delete_authorized_device(
    self,
    id_or_prefix: Union[str, UUID],
) -> None:
    """Delete an authorized device.

    Args:
        id_or_prefix: The ID or ID prefix of the authorized device.
    """
    device = self.get_authorized_device(
        id_or_prefix=id_or_prefix,
        allow_id_prefix_match=False,
    )
    self.zen_store.delete_authorized_device(device.id)

delete_build(id_or_prefix)

Delete a build.

Parameters:

Name Type Description Default
id_or_prefix str

The id or id prefix of the build.

required
Source code in src/zenml/client.py
def delete_build(self, id_or_prefix: str) -> None:
    """Delete a build.

    Args:
        id_or_prefix: The id or id prefix of the build.
    """
    build = self.get_build(id_or_prefix=id_or_prefix)
    self.zen_store.delete_build(build_id=build.id)

delete_code_repository(name_id_or_prefix)

Delete a code repository.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, ID or prefix of the code repository.

required
Source code in src/zenml/client.py
def delete_code_repository(
    self,
    name_id_or_prefix: Union[str, UUID],
) -> None:
    """Delete a code repository.

    Args:
        name_id_or_prefix: The name, ID or prefix of the code repository.
    """
    repo = self.get_code_repository(
        name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
    )
    self.zen_store.delete_code_repository(code_repository_id=repo.id)

delete_deployment(id_or_prefix)

Delete a deployment.

Parameters:

Name Type Description Default
id_or_prefix str

The id or id prefix of the deployment.

required
Source code in src/zenml/client.py
def delete_deployment(self, id_or_prefix: str) -> None:
    """Delete a deployment.

    Args:
        id_or_prefix: The id or id prefix of the deployment.
    """
    deployment = self.get_deployment(
        id_or_prefix=id_or_prefix, hydrate=False
    )
    self.zen_store.delete_deployment(deployment_id=deployment.id)

delete_event_source(name_id_or_prefix)

Deletes an event_source.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, id or prefix id of the event_source to deregister.

required
Source code in src/zenml/client.py
@_fail_for_sql_zen_store
def delete_event_source(self, name_id_or_prefix: Union[str, UUID]) -> None:
    """Deletes an event_source.

    Args:
        name_id_or_prefix: The name, id or prefix id of the event_source
            to deregister.
    """
    event_source = self.get_event_source(
        name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
    )

    self.zen_store.delete_event_source(event_source_id=event_source.id)
    logger.info("Deleted event_source with name '%s'.", event_source.name)

delete_flavor(name_id_or_prefix)

Deletes a flavor.

Parameters:

Name Type Description Default
name_id_or_prefix str

The name, id or prefix of the id for the flavor to delete.

required
Source code in src/zenml/client.py
def delete_flavor(self, name_id_or_prefix: str) -> None:
    """Deletes a flavor.

    Args:
        name_id_or_prefix: The name, id or prefix of the id for the
            flavor to delete.
    """
    flavor = self.get_flavor(
        name_id_or_prefix, allow_name_prefix_match=False
    )
    self.zen_store.delete_flavor(flavor_id=flavor.id)

    logger.info(f"Deleted flavor '{flavor.name}' of type '{flavor.type}'.")

delete_model(model_name_or_id)

Deletes a model from Model Control Plane.

Parameters:

Name Type Description Default
model_name_or_id Union[str, UUID]

name or id of the model to be deleted.

required
Source code in src/zenml/client.py
def delete_model(self, model_name_or_id: Union[str, UUID]) -> None:
    """Deletes a model from Model Control Plane.

    Args:
        model_name_or_id: name or id of the model to be deleted.
    """
    self.zen_store.delete_model(model_name_or_id=model_name_or_id)

delete_model_version(model_version_id)

Deletes a model version from Model Control Plane.

Parameters:

Name Type Description Default
model_version_id UUID

Id of the model version to be deleted.

required
Source code in src/zenml/client.py
def delete_model_version(
    self,
    model_version_id: UUID,
) -> None:
    """Deletes a model version from Model Control Plane.

    Args:
        model_version_id: Id of the model version to be deleted.
    """
    self.zen_store.delete_model_version(
        model_version_id=model_version_id,
    )

delete_model_version_artifact_link(model_version_id, artifact_version_id)

Delete model version to artifact link in Model Control Plane.

Parameters:

Name Type Description Default
model_version_id UUID

The id of the model version holding the link.

required
artifact_version_id UUID

The id of the artifact version to be deleted.

required

Raises:

Type Description
RuntimeError

If more than one artifact link is found for given filters.

Source code in src/zenml/client.py
def delete_model_version_artifact_link(
    self, model_version_id: UUID, artifact_version_id: UUID
) -> None:
    """Delete model version to artifact link in Model Control Plane.

    Args:
        model_version_id: The id of the model version holding the link.
        artifact_version_id: The id of the artifact version to be deleted.

    Raises:
        RuntimeError: If more than one artifact link is found for given filters.
    """
    artifact_links = self.list_model_version_artifact_links(
        model_version_id=model_version_id,
        artifact_version_id=artifact_version_id,
    )
    if artifact_links.items:
        if artifact_links.total > 1:
            raise RuntimeError(
                "More than one artifact link found for give model version "
                f"`{model_version_id}` and artifact version "
                f"`{artifact_version_id}`. This should not be happening and "
                "might indicate a corrupted state of your ZenML database. "
                "Please seek support via Community Slack."
            )
        self.zen_store.delete_model_version_artifact_link(
            model_version_id=model_version_id,
            model_version_artifact_link_name_or_id=artifact_links.items[
                0
            ].id,
        )

delete_pipeline(name_id_or_prefix)

Delete a pipeline.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, ID or ID prefix of the pipeline.

required
Source code in src/zenml/client.py
def delete_pipeline(
    self,
    name_id_or_prefix: Union[str, UUID],
) -> None:
    """Delete a pipeline.

    Args:
        name_id_or_prefix: The name, ID or ID prefix of the pipeline.
    """
    pipeline = self.get_pipeline(name_id_or_prefix=name_id_or_prefix)
    self.zen_store.delete_pipeline(pipeline_id=pipeline.id)

delete_pipeline_run(name_id_or_prefix)

Deletes a pipeline run.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

Name, ID, or prefix of the pipeline run.

required
Source code in src/zenml/client.py
def delete_pipeline_run(
    self,
    name_id_or_prefix: Union[str, UUID],
) -> None:
    """Deletes a pipeline run.

    Args:
        name_id_or_prefix: Name, ID, or prefix of the pipeline run.
    """
    run = self.get_pipeline_run(
        name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
    )
    self.zen_store.delete_run(run_id=run.id)

delete_run_template(name_id_or_prefix)

Delete a run template.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

Name/ID/ID prefix of the template to delete.

required
Source code in src/zenml/client.py
def delete_run_template(self, name_id_or_prefix: Union[str, UUID]) -> None:
    """Delete a run template.

    Args:
        name_id_or_prefix: Name/ID/ID prefix of the template to delete.
    """
    if is_valid_uuid(name_id_or_prefix):
        template_id = (
            UUID(name_id_or_prefix)
            if isinstance(name_id_or_prefix, str)
            else name_id_or_prefix
        )
    else:
        template_id = self.get_run_template(
            name_id_or_prefix, hydrate=False
        ).id

    self.zen_store.delete_run_template(template_id=template_id)

delete_schedule(name_id_or_prefix)

Delete a schedule.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, id or prefix id of the schedule to delete.

required
Source code in src/zenml/client.py
def delete_schedule(self, name_id_or_prefix: Union[str, UUID]) -> None:
    """Delete a schedule.

    Args:
        name_id_or_prefix: The name, id or prefix id of the schedule
            to delete.
    """
    schedule = self.get_schedule(
        name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
    )
    logger.warning(
        f"Deleting schedule '{name_id_or_prefix}'... This will only delete "
        "the reference of the schedule from ZenML. Please make sure to "
        "manually stop/delete this schedule in your orchestrator as well!"
    )
    self.zen_store.delete_schedule(schedule_id=schedule.id)

delete_secret(name_id_or_prefix, scope=None)

Deletes a secret.

Parameters:

Name Type Description Default
name_id_or_prefix str

The name or ID of the secret.

required
scope Optional[SecretScope]

The scope of the secret to delete.

None
Source code in src/zenml/client.py
def delete_secret(
    self, name_id_or_prefix: str, scope: Optional[SecretScope] = None
) -> None:
    """Deletes a secret.

    Args:
        name_id_or_prefix: The name or ID of the secret.
        scope: The scope of the secret to delete.
    """
    secret = self.get_secret(
        name_id_or_prefix=name_id_or_prefix,
        scope=scope,
        # Don't allow partial name matches, but allow partial ID matches
        allow_partial_name_match=False,
        allow_partial_id_match=True,
    )

    self.zen_store.delete_secret(secret_id=secret.id)

delete_service(name_id_or_prefix)

Delete a service.

Parameters:

Name Type Description Default
name_id_or_prefix UUID

The name or ID of the service to delete.

required
Source code in src/zenml/client.py
def delete_service(self, name_id_or_prefix: UUID) -> None:
    """Delete a service.

    Args:
        name_id_or_prefix: The name or ID of the service to delete.
    """
    service = self.get_service(
        name_id_or_prefix,
        allow_name_prefix_match=False,
    )
    self.zen_store.delete_service(service_id=service.id)

delete_service_account(name_id_or_prefix)

Delete a service account.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name or ID of the service account to delete.

required
Source code in src/zenml/client.py
def delete_service_account(
    self,
    name_id_or_prefix: Union[str, UUID],
) -> None:
    """Delete a service account.

    Args:
        name_id_or_prefix: The name or ID of the service account to delete.
    """
    service_account = self.get_service_account(
        name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
    )
    self.zen_store.delete_service_account(
        service_account_name_or_id=service_account.id
    )

delete_service_connector(name_id_or_prefix)

Deletes a registered service connector.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The ID or name of the service connector to delete.

required
Source code in src/zenml/client.py
def delete_service_connector(
    self,
    name_id_or_prefix: Union[str, UUID],
) -> None:
    """Deletes a registered service connector.

    Args:
        name_id_or_prefix: The ID or name of the service connector to delete.
    """
    service_connector = self.get_service_connector(
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
    )

    self.zen_store.delete_service_connector(
        service_connector_id=service_connector.id
    )
    logger.info(
        "Removed service connector (type: %s) with name '%s'.",
        service_connector.type,
        service_connector.name,
    )

delete_stack(name_id_or_prefix, recursive=False)

Deregisters a stack.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, id or prefix id of the stack to deregister.

required
recursive bool

If True, all components of the stack which are not associated with any other stack will also be deleted.

False

Raises:

Type Description
ValueError

If the stack is the currently active stack for this client.

Source code in src/zenml/client.py
def delete_stack(
    self, name_id_or_prefix: Union[str, UUID], recursive: bool = False
) -> None:
    """Deregisters a stack.

    Args:
        name_id_or_prefix: The name, id or prefix id of the stack
            to deregister.
        recursive: If `True`, all components of the stack which are not
            associated with any other stack will also be deleted.

    Raises:
        ValueError: If the stack is the currently active stack for this
            client.
    """
    stack = self.get_stack(
        name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
    )

    if stack.id == self.active_stack_model.id:
        raise ValueError(
            f"Unable to deregister active stack '{stack.name}'. Make "
            f"sure to designate a new active stack before deleting this "
            f"one."
        )

    cfg = GlobalConfiguration()
    if stack.id == cfg.active_stack_id:
        raise ValueError(
            f"Unable to deregister '{stack.name}' as it is the active "
            f"stack within your global configuration. Make "
            f"sure to designate a new active stack before deleting this "
            f"one."
        )

    if recursive:
        stack_components_free_for_deletion = []

        # Get all stack components associated with this stack
        for component_type, component_model in stack.components.items():
            # Get stack associated with the stack component

            stacks = self.list_stacks(
                component_id=component_model[0].id, size=2, page=1
            )

            # Check if the stack component is part of another stack
            if len(stacks) == 1 and stack.id == stacks[0].id:
                stack_components_free_for_deletion.append(
                    (component_type, component_model)
                )

        self.delete_stack(stack.id)

        for (
            stack_component_type,
            stack_component_model,
        ) in stack_components_free_for_deletion:
            self.delete_stack_component(
                stack_component_model[0].name, stack_component_type
            )

        logger.info("Deregistered stack with name '%s'.", stack.name)
        return

    self.zen_store.delete_stack(stack_id=stack.id)
    logger.info("Deregistered stack with name '%s'.", stack.name)

delete_stack_component(name_id_or_prefix, component_type)

Deletes a registered stack component.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, ID or prefix of the component to delete.

required
component_type StackComponentType

The type of the component to delete.

required
Source code in src/zenml/client.py
def delete_stack_component(
    self,
    name_id_or_prefix: Union[str, UUID],
    component_type: StackComponentType,
) -> None:
    """Deletes a registered stack component.

    Args:
        name_id_or_prefix: The name, ID or prefix of the component to delete.
        component_type: The type of the component to delete.
    """
    component = self.get_stack_component(
        name_id_or_prefix=name_id_or_prefix,
        component_type=component_type,
        allow_name_prefix_match=False,
    )

    self.zen_store.delete_stack_component(component_id=component.id)
    logger.info(
        "Deregistered stack component (type: %s) with name '%s'.",
        component.type,
        component.name,
    )

delete_tag(tag_name_or_id)

Deletes a tag.

Parameters:

Name Type Description Default
tag_name_or_id Union[str, UUID]

name or id of the tag to be deleted.

required
Source code in src/zenml/client.py
def delete_tag(self, tag_name_or_id: Union[str, UUID]) -> None:
    """Deletes a tag.

    Args:
        tag_name_or_id: name or id of the tag to be deleted.
    """
    self.zen_store.delete_tag(tag_name_or_id=tag_name_or_id)

delete_trigger(name_id_or_prefix)

Deletes a trigger.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, id or prefix id of the trigger to deregister.

required
Source code in src/zenml/client.py
@_fail_for_sql_zen_store
def delete_trigger(self, name_id_or_prefix: Union[str, UUID]) -> None:
    """Deletes an trigger.

    Args:
        name_id_or_prefix: The name, id or prefix id of the trigger
            to deregister.
    """
    trigger = self.get_trigger(
        name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
    )

    self.zen_store.delete_trigger(trigger_id=trigger.id)
    logger.info("Deleted trigger with name '%s'.", trigger.name)

delete_trigger_execution(trigger_execution_id)

Delete a trigger execution.

Parameters:

Name Type Description Default
trigger_execution_id UUID

The ID of the trigger execution to delete.

required
Source code in src/zenml/client.py
def delete_trigger_execution(self, trigger_execution_id: UUID) -> None:
    """Delete a trigger execution.

    Args:
        trigger_execution_id: The ID of the trigger execution to delete.
    """
    self.zen_store.delete_trigger_execution(
        trigger_execution_id=trigger_execution_id
    )

delete_user(name_id_or_prefix)

Delete a user.

Parameters:

Name Type Description Default
name_id_or_prefix str

The name or ID of the user to delete.

required
Source code in src/zenml/client.py
def delete_user(self, name_id_or_prefix: str) -> None:
    """Delete a user.

    Args:
        name_id_or_prefix: The name or ID of the user to delete.
    """
    user = self.get_user(name_id_or_prefix, allow_name_prefix_match=False)
    self.zen_store.delete_user(user_name_or_id=user.name)

delete_workspace(name_id_or_prefix)

Delete a workspace.

Parameters:

Name Type Description Default
name_id_or_prefix str

The name or ID of the workspace to delete.

required

Raises:

Type Description
IllegalOperationError

If the workspace to delete is the active workspace.

Source code in src/zenml/client.py
def delete_workspace(self, name_id_or_prefix: str) -> None:
    """Delete a workspace.

    Args:
        name_id_or_prefix: The name or ID of the workspace to delete.

    Raises:
        IllegalOperationError: If the workspace to delete is the active
            workspace.
    """
    workspace = self.get_workspace(
        name_id_or_prefix, allow_name_prefix_match=False
    )
    if self.active_workspace.id == workspace.id:
        raise IllegalOperationError(
            f"Workspace '{name_id_or_prefix}' cannot be deleted since "
            "it is currently active. Please set another workspace as "
            "active first."
        )
    self.zen_store.delete_workspace(workspace_name_or_id=workspace.id)

find_repository(path=None, enable_warnings=False) staticmethod

Search for a ZenML repository directory.

Parameters:

Name Type Description Default
path Optional[Path]

Optional path to look for the repository. If no path is given, this function tries to find the repository using the environment variable ZENML_REPOSITORY_PATH (if set) and recursively searching in the parent directories of the current working directory.

None
enable_warnings bool

If True, warnings are printed if the repository root cannot be found.

False

Returns:

Type Description
Optional[Path]

Absolute path to a ZenML repository directory or None if no repository directory was found.

Source code in src/zenml/client.py (lines 547-620)
@staticmethod
def find_repository(
    path: Optional[Path] = None, enable_warnings: bool = False
) -> Optional[Path]:
    """Search for a ZenML repository directory.

    Args:
        path: Optional path to look for the repository. If no path is
            given, this function tries to find the repository using the
            environment variable `ZENML_REPOSITORY_PATH` (if set) and
            recursively searching in the parent directories of the current
            working directory.
        enable_warnings: If `True`, warnings are printed if the repository
            root cannot be found.

    Returns:
        Absolute path to a ZenML repository directory or None if no
        repository directory was found.
    """
    if not path:
        # try to get path from the environment variable
        env_var_path = os.getenv(ENV_ZENML_REPOSITORY_PATH)
        if env_var_path:
            path = Path(env_var_path)

    if path:
        # explicit path via parameter or environment variable, don't search
        # parent directories
        search_parent_directories = False
        warning_message = (
            f"Unable to find ZenML repository at path '{path}'. Make sure "
            f"to create a ZenML repository by calling `zenml init` when "
            f"specifying an explicit repository path in code or via the "
            f"environment variable '{ENV_ZENML_REPOSITORY_PATH}'."
        )
    else:
        # try to find the repository in the parent directories of the
        # current working directory
        path = Path.cwd()
        search_parent_directories = True
        warning_message = (
            f"Unable to find ZenML repository in your current working "
            f"directory ({path}) or any parent directories. If you "
            f"want to use an existing repository which is in a different "
            f"location, set the environment variable "
            f"'{ENV_ZENML_REPOSITORY_PATH}'. If you want to create a new "
            f"repository, run `zenml init`."
        )

    def _find_repository_helper(path_: Path) -> Optional[Path]:
        """Recursively search parent directories for a ZenML repository.

        Args:
            path_: The path to search.

        Returns:
            Absolute path to a ZenML repository directory or None if no
            repository directory was found.
        """
        if Client.is_repository_directory(path_):
            return path_

        if not search_parent_directories or io_utils.is_root(str(path_)):
            return None

        return _find_repository_helper(path_.parent)

    repository_path = _find_repository_helper(path)

    if repository_path:
        return repository_path.resolve()
    if enable_warnings:
        logger.warning(warning_message)
    return None
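
A minimal usage sketch; the explicit path is hypothetical, and the `ZENML_REPOSITORY_PATH` environment variable is only consulted when no path argument is given:

    from pathlib import Path
    from zenml.client import Client

    # Search the current working directory and its parent directories.
    repo_root = Client.find_repository(enable_warnings=True)

    # Or check one explicit location (no parent-directory search).
    repo_root = Client.find_repository(path=Path("/path/to/project"))
    print(repo_root)  # None if no repository was found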

get_action(name_id_or_prefix, allow_name_prefix_match=True, hydrate=True)

Get an action by name, ID or prefix.

Parameters:

Name Type Description Default
name_id_or_prefix Union[UUID, str]

The name, ID or prefix of the action.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
ActionResponse

The action.

Source code in src/zenml/client.py (lines 2979-3003)
@_fail_for_sql_zen_store
def get_action(
    self,
    name_id_or_prefix: Union[UUID, str],
    allow_name_prefix_match: bool = True,
    hydrate: bool = True,
) -> ActionResponse:
    """Get an action by name, ID or prefix.

    Args:
        name_id_or_prefix: The name, ID or prefix of the action.
        allow_name_prefix_match: If True, allow matching by name prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The action.
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_action,
        list_method=self.list_actions,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        hydrate=hydrate,
    )

get_api_key(service_account_name_id_or_prefix, name_id_or_prefix, allow_name_prefix_match=True, hydrate=True)

Get an API key by name, id or prefix.

Parameters:

Name Type Description Default
service_account_name_id_or_prefix Union[str, UUID]

The name, ID or prefix of the service account to get the API key for.

required
name_id_or_prefix Union[str, UUID]

The name, ID or ID prefix of the API key.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
APIKeyResponse

The API key.

Source code in src/zenml/client.py (lines 7360-7410)
def get_api_key(
    self,
    service_account_name_id_or_prefix: Union[str, UUID],
    name_id_or_prefix: Union[str, UUID],
    allow_name_prefix_match: bool = True,
    hydrate: bool = True,
) -> APIKeyResponse:
    """Get an API key by name, id or prefix.

    Args:
        service_account_name_id_or_prefix: The name, ID or prefix of the
            service account to get the API key for.
        name_id_or_prefix: The name, ID or ID prefix of the API key.
        allow_name_prefix_match: If True, allow matching by name prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The API key.
    """
    service_account = self.get_service_account(
        name_id_or_prefix=service_account_name_id_or_prefix,
        allow_name_prefix_match=False,
    )

    def get_api_key_method(
        api_key_name_or_id: str, hydrate: bool = True
    ) -> APIKeyResponse:
        return self.zen_store.get_api_key(
            service_account_id=service_account.id,
            api_key_name_or_id=api_key_name_or_id,
            hydrate=hydrate,
        )

    def list_api_keys_method(
        hydrate: bool = True,
        **filter_args: Any,
    ) -> Page[APIKeyResponse]:
        return self.list_api_keys(
            service_account_name_id_or_prefix=service_account.id,
            hydrate=hydrate,
            **filter_args,
        )

    return self._get_entity_by_id_or_name_or_prefix(
        get_method=get_api_key_method,
        list_method=list_api_keys_method,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        hydrate=hydrate,
    )
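
A minimal usage sketch; both identifiers are hypothetical. The key is resolved within the given service account:

    from zenml.client import Client

    api_key = Client().get_api_key(
        service_account_name_id_or_prefix="ci-bot",
        name_id_or_prefix="deploy-key",
    )
    print(api_key.id)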

get_artifact(name_id_or_prefix, hydrate=False)

Get an artifact by name, id or prefix.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, ID or prefix of the artifact to get.

required
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
ArtifactResponse

The artifact.

Source code in src/zenml/client.py (lines 4043-4063)
def get_artifact(
    self,
    name_id_or_prefix: Union[str, UUID],
    hydrate: bool = False,
) -> ArtifactResponse:
    """Get an artifact by name, id or prefix.

    Args:
        name_id_or_prefix: The name, ID or prefix of the artifact to get.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The artifact.
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_artifact,
        list_method=self.list_artifacts,
        name_id_or_prefix=name_id_or_prefix,
        hydrate=hydrate,
    )

get_artifact_version(name_id_or_prefix, version=None, hydrate=True)

Get an artifact version by ID or artifact name.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

Either the ID of the artifact version or the name of the artifact.

required
version Optional[str]

The version of the artifact to get. Only used if name_id_or_prefix is the name of the artifact. If not specified, the latest version is returned.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
ArtifactVersionResponse

The artifact version.

Source code in src/zenml/client.py (lines 4184-4232)
def get_artifact_version(
    self,
    name_id_or_prefix: Union[str, UUID],
    version: Optional[str] = None,
    hydrate: bool = True,
) -> ArtifactVersionResponse:
    """Get an artifact version by ID or artifact name.

    Args:
        name_id_or_prefix: Either the ID of the artifact version or the
            name of the artifact.
        version: The version of the artifact to get. Only used if
            `name_id_or_prefix` is the name of the artifact. If not
            specified, the latest version is returned.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The artifact version.
    """
    from zenml import get_step_context

    if cll := client_lazy_loader(
        method_name="get_artifact_version",
        name_id_or_prefix=name_id_or_prefix,
        version=version,
        hydrate=hydrate,
    ):
        return cll  # type: ignore[return-value]

    artifact = self._get_entity_version_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_artifact_version,
        list_method=self.list_artifact_versions,
        name_id_or_prefix=name_id_or_prefix,
        version=version,
        hydrate=hydrate,
    )
    try:
        step_run = get_step_context().step_run
        client = Client()
        client.zen_store.update_run_step(
            step_run_id=step_run.id,
            step_run_update=StepRunUpdate(
                loaded_artifact_versions={artifact.name: artifact.id}
            ),
        )
    except RuntimeError:
        pass  # Cannot link to step run if called outside a step
    return artifact
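
A minimal usage sketch; the artifact name is hypothetical. When called from within a running step, the loaded version is additionally linked to that step run, as the source above shows:

    from zenml.client import Client

    client = Client()
    latest = client.get_artifact_version("my_dataset")                  # latest version
    specific = client.get_artifact_version("my_dataset", version="3")   # explicit version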

get_authorized_device(id_or_prefix, allow_id_prefix_match=True, hydrate=True)

Get an authorized device by id or prefix.

Parameters:

Name Type Description Default
id_or_prefix Union[UUID, str]

The ID or ID prefix of the authorized device.

required
allow_id_prefix_match bool

If True, allow matching by ID prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
OAuthDeviceResponse

The requested authorized device.

Raises:

Type Description
KeyError

If no authorized device is found with the given ID or prefix.

Source code in src/zenml/client.py (lines 6719-6759)
def get_authorized_device(
    self,
    id_or_prefix: Union[UUID, str],
    allow_id_prefix_match: bool = True,
    hydrate: bool = True,
) -> OAuthDeviceResponse:
    """Get an authorized device by id or prefix.

    Args:
        id_or_prefix: The ID or ID prefix of the authorized device.
        allow_id_prefix_match: If True, allow matching by ID prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The requested authorized device.

    Raises:
        KeyError: If no authorized device is found with the given ID or
            prefix.
    """
    if isinstance(id_or_prefix, str):
        try:
            id_or_prefix = UUID(id_or_prefix)
        except ValueError:
            if not allow_id_prefix_match:
                raise KeyError(
                    f"No authorized device found with id or prefix "
                    f"'{id_or_prefix}'."
                )
    if isinstance(id_or_prefix, UUID):
        return self.zen_store.get_authorized_device(
            id_or_prefix, hydrate=hydrate
        )
    return self._get_entity_by_prefix(
        get_method=self.zen_store.get_authorized_device,
        list_method=self.list_authorized_devices,
        partial_id_or_name=id_or_prefix,
        allow_name_prefix_match=False,
        hydrate=hydrate,
    )

get_build(id_or_prefix, hydrate=True)

Get a build by id or prefix.

Parameters:

Name Type Description Default
id_or_prefix Union[str, UUID]

The id or id prefix of the build.

required
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
PipelineBuildResponse

The build.

Raises:

Type Description
KeyError

If no build was found for the given id or prefix.

ZenKeyError

If multiple builds were found that match the given id or prefix.

Source code in src/zenml/client.py (lines 2600-2654)
def get_build(
    self,
    id_or_prefix: Union[str, UUID],
    hydrate: bool = True,
) -> PipelineBuildResponse:
    """Get a build by id or prefix.

    Args:
        id_or_prefix: The id or id prefix of the build.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The build.

    Raises:
        KeyError: If no build was found for the given id or prefix.
        ZenKeyError: If multiple builds were found that match the given
            id or prefix.
    """
    from zenml.utils.uuid_utils import is_valid_uuid

    # First interpret as full UUID
    if is_valid_uuid(id_or_prefix):
        if not isinstance(id_or_prefix, UUID):
            id_or_prefix = UUID(id_or_prefix, version=4)

        return self.zen_store.get_build(
            id_or_prefix,
            hydrate=hydrate,
        )

    entity = self.list_builds(
        id=f"startswith:{id_or_prefix}", hydrate=hydrate
    )

    # If only a single entity is found, return it.
    if entity.total == 1:
        return entity.items[0]

    # If no entity is found, raise an error.
    if entity.total == 0:
        raise KeyError(
            f"No builds have been found that have either an id or prefix "
            f"that matches the provided string '{id_or_prefix}'."
        )

    raise ZenKeyError(
        f"{entity.total} builds have been found that have "
        f"an ID that matches the provided "
        f"string '{id_or_prefix}':\n"
        f"{[entity.items]}.\n"
        f"Please use the id to uniquely identify "
        f"only one of the builds."
    )
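
A minimal usage sketch; the ID prefix is hypothetical. A full UUID is fetched directly, while a prefix must match exactly one build or a `ZenKeyError` is raised:

    from zenml.client import Client

    build = Client().get_build("3fa85f64")  # unique ID prefix or full UUID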

get_code_repository(name_id_or_prefix, allow_name_prefix_match=True, hydrate=True)

Get a code repository by name, id or prefix.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, ID or ID prefix of the code repository.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
CodeRepositoryResponse

The code repository.

Source code in src/zenml/client.py (lines 5027-5050)
def get_code_repository(
    self,
    name_id_or_prefix: Union[str, UUID],
    allow_name_prefix_match: bool = True,
    hydrate: bool = True,
) -> CodeRepositoryResponse:
    """Get a code repository by name, id or prefix.

    Args:
        name_id_or_prefix: The name, ID or ID prefix of the code repository.
        allow_name_prefix_match: If True, allow matching by name prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The code repository.
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_code_repository,
        list_method=self.list_code_repositories,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        hydrate=hydrate,
    )

get_deployment(id_or_prefix, hydrate=True)

Get a deployment by id or prefix.

Parameters:

Name Type Description Default
id_or_prefix Union[str, UUID]

The id or id prefix of the deployment.

required
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
PipelineDeploymentResponse

The deployment.

Raises:

Type Description
KeyError

If no deployment was found for the given id or prefix.

ZenKeyError

If multiple deployments were found that match the given id or prefix.

Source code in src/zenml/client.py (lines 3323-3377)
def get_deployment(
    self,
    id_or_prefix: Union[str, UUID],
    hydrate: bool = True,
) -> PipelineDeploymentResponse:
    """Get a deployment by id or prefix.

    Args:
        id_or_prefix: The id or id prefix of the deployment.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The deployment.

    Raises:
        KeyError: If no deployment was found for the given id or prefix.
        ZenKeyError: If multiple deployments were found that match the given
            id or prefix.
    """
    from zenml.utils.uuid_utils import is_valid_uuid

    # First interpret as full UUID
    if is_valid_uuid(id_or_prefix):
        id_ = (
            UUID(id_or_prefix)
            if isinstance(id_or_prefix, str)
            else id_or_prefix
        )
        return self.zen_store.get_deployment(id_, hydrate=hydrate)

    entity = self.list_deployments(
        id=f"startswith:{id_or_prefix}",
        hydrate=hydrate,
    )

    # If only a single entity is found, return it.
    if entity.total == 1:
        return entity.items[0]

    # If no entity is found, raise an error.
    if entity.total == 0:
        raise KeyError(
            f"No deployment have been found that have either an id or "
            f"prefix that matches the provided string '{id_or_prefix}'."
        )

    raise ZenKeyError(
        f"{entity.total} deployments have been found that have "
        f"an ID that matches the provided "
        f"string '{id_or_prefix}':\n"
        f"{[entity.items]}.\n"
        f"Please use the id to uniquely identify "
        f"only one of the deployments."
    )

get_event_source(name_id_or_prefix, allow_name_prefix_match=True, hydrate=True)

Get an event source by name, ID or prefix.

Parameters:

Name Type Description Default
name_id_or_prefix Union[UUID, str]

The name, ID or prefix of the event source.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
EventSourceResponse

The event_source.

Source code in src/zenml/client.py (lines 2780-2804)
@_fail_for_sql_zen_store
def get_event_source(
    self,
    name_id_or_prefix: Union[UUID, str],
    allow_name_prefix_match: bool = True,
    hydrate: bool = True,
) -> EventSourceResponse:
    """Get an event source by name, ID or prefix.

    Args:
        name_id_or_prefix: The name, ID or prefix of the stack.
        allow_name_prefix_match: If True, allow matching by name prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The event_source.
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_event_source,
        list_method=self.list_event_sources,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        hydrate=hydrate,
    )

get_flavor(name_id_or_prefix, allow_name_prefix_match=True, hydrate=True)

Get a stack component flavor.

Parameters:

Name Type Description Default
name_id_or_prefix str

The name, ID or ID prefix of the flavor to get.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
FlavorResponse

The stack component flavor.

Source code in src/zenml/client.py (lines 2206-2230)
def get_flavor(
    self,
    name_id_or_prefix: str,
    allow_name_prefix_match: bool = True,
    hydrate: bool = True,
) -> FlavorResponse:
    """Get a stack component flavor.

    Args:
        name_id_or_prefix: The name, ID or prefix to the id of the flavor
            to get.
        allow_name_prefix_match: If True, allow matching by name prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The stack component flavor.
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_flavor,
        list_method=self.list_flavors,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        hydrate=hydrate,
    )

get_flavor_by_name_and_type(name, component_type)

Fetches a registered flavor.

Parameters:

Name Type Description Default
component_type StackComponentType

The type of the component to fetch.

required
name str

The name of the flavor to fetch.

required

Returns:

Type Description
FlavorResponse

The registered flavor.

Raises:

Type Description
KeyError

If no flavor exists for the given type and name.

Source code in src/zenml/client.py (lines 2319-2353)
def get_flavor_by_name_and_type(
    self, name: str, component_type: "StackComponentType"
) -> FlavorResponse:
    """Fetches a registered flavor.

    Args:
        component_type: The type of the component to fetch.
        name: The name of the flavor to fetch.

    Returns:
        The registered flavor.

    Raises:
        KeyError: If no flavor exists for the given type and name.
    """
    logger.debug(
        f"Fetching the flavor of type {component_type} with name {name}."
    )

    if not (
        flavors := self.list_flavors(
            type=component_type, name=name, hydrate=True
        ).items
    ):
        raise KeyError(
            f"No flavor with name '{name}' and type '{component_type}' "
            "exists."
        )
    if len(flavors) > 1:
        raise KeyError(
            f"More than one flavor with name {name} and type "
            f"{component_type} exists."
        )

    return flavors[0]
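
A minimal usage sketch, assuming `StackComponentType` is importable from `zenml.enums` and that the built-in `local` artifact store flavor is registered:

    from zenml.client import Client
    from zenml.enums import StackComponentType

    flavor = Client().get_flavor_by_name_and_type(
        name="local",
        component_type=StackComponentType.ARTIFACT_STORE,
    )  # raises KeyError if no flavor with that name and type exists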

get_flavors_by_type(component_type)

Fetches the list of flavors for a stack component type.

Parameters:

Name Type Description Default
component_type StackComponentType

The type of the component to fetch.

required

Returns:

Type Description
Page[FlavorResponse]

The list of flavors.

Source code in src/zenml/client.py (lines 2302-2317)
def get_flavors_by_type(
    self, component_type: "StackComponentType"
) -> Page[FlavorResponse]:
    """Fetches the list of flavor for a stack component type.

    Args:
        component_type: The type of the component to fetch.

    Returns:
        The list of flavors.
    """
    logger.debug(f"Fetching the flavors of type {component_type}.")

    return self.list_flavors(
        type=component_type,
    )

get_instance() classmethod

Return the Client singleton instance.

Returns:

Type Description
Optional[Client]

The Client singleton instance or None, if the Client hasn't been initialized yet.

Source code in src/zenml/client.py (lines 384-392)
@classmethod
def get_instance(cls) -> Optional["Client"]:
    """Return the Client singleton instance.

    Returns:
        The Client singleton instance or None, if the Client hasn't
        been initialized yet.
    """
    return cls._global_client
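
A minimal usage sketch; the singleton is only populated once a `Client` has been constructed somewhere in the process:

    from zenml.client import Client

    client = Client.get_instance()
    if client is None:
        client = Client()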

get_model(model_name_or_id, hydrate=True)

Get an existing model from Model Control Plane.

Parameters:

Name Type Description Default
model_name_or_id Union[str, UUID]

name or id of the model to be retrieved.

required
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
ModelResponse

The model of interest.

Source code in src/zenml/client.py (lines 6153-6175)
def get_model(
    self,
    model_name_or_id: Union[str, UUID],
    hydrate: bool = True,
) -> ModelResponse:
    """Get an existing model from Model Control Plane.

    Args:
        model_name_or_id: name or id of the model to be retrieved.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The model of interest.
    """
    if cll := client_lazy_loader(
        "get_model", model_name_or_id=model_name_or_id, hydrate=hydrate
    ):
        return cll  # type: ignore[return-value]
    return self.zen_store.get_model(
        model_name_or_id=model_name_or_id,
        hydrate=hydrate,
    )
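
A minimal usage sketch; the model name is hypothetical:

    from zenml.client import Client

    model = Client().get_model("churn_predictor")
    print(model.id, model.name)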

get_model_version(model_name_or_id=None, model_version_name_or_number_or_id=None, hydrate=True)

Get an existing model version from Model Control Plane.

Parameters:

Name Type Description Default
model_name_or_id Optional[Union[str, UUID]]

name or id of the model containing the model version.

None
model_version_name_or_number_or_id Optional[Union[str, int, ModelStages, UUID]]

name, id, stage or number of the model version to be retrieved. If skipped, the latest version is retrieved.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
ModelVersionResponse

The model version of interest.

Raises:

Type Description
RuntimeError

In case method inputs don't adhere to restrictions.

KeyError

In case no model version with the identifiers exists.

ValueError

In case retrieval is attempted using a non-UUID model version identifier and no model identifier is provided.

Source code in src/zenml/client.py (lines 6273-6383)
def get_model_version(
    self,
    model_name_or_id: Optional[Union[str, UUID]] = None,
    model_version_name_or_number_or_id: Optional[
        Union[str, int, ModelStages, UUID]
    ] = None,
    hydrate: bool = True,
) -> ModelVersionResponse:
    """Get an existing model version from Model Control Plane.

    Args:
        model_name_or_id: name or id of the model containing the model
            version.
        model_version_name_or_number_or_id: name, id, stage or number of
            the model version to be retrieved. If skipped - latest version
            is retrieved.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The model version of interest.

    Raises:
        RuntimeError: In case method inputs don't adhere to restrictions.
        KeyError: In case no model version with the identifiers exists.
        ValueError: In case retrieval is attempted using non UUID model version
            identifier and no model identifier provided.
    """
    if (
        not is_valid_uuid(model_version_name_or_number_or_id)
        and model_name_or_id is None
    ):
        raise ValueError(
            "No model identifier provided and model version identifier "
            f"`{model_version_name_or_number_or_id}` is not a valid UUID."
        )
    if cll := client_lazy_loader(
        "get_model_version",
        model_name_or_id=model_name_or_id,
        model_version_name_or_number_or_id=model_version_name_or_number_or_id,
        hydrate=hydrate,
    ):
        return cll  # type: ignore[return-value]

    if model_version_name_or_number_or_id is None:
        model_version_name_or_number_or_id = ModelStages.LATEST

    if isinstance(model_version_name_or_number_or_id, UUID):
        return self.zen_store.get_model_version(
            model_version_id=model_version_name_or_number_or_id,
            hydrate=hydrate,
        )
    elif isinstance(model_version_name_or_number_or_id, int):
        model_versions = self.zen_store.list_model_versions(
            model_name_or_id=model_name_or_id,
            model_version_filter_model=ModelVersionFilter(
                number=model_version_name_or_number_or_id,
            ),
            hydrate=hydrate,
        ).items
    elif isinstance(model_version_name_or_number_or_id, str):
        if model_version_name_or_number_or_id == ModelStages.LATEST:
            model_versions = self.zen_store.list_model_versions(
                model_name_or_id=model_name_or_id,
                model_version_filter_model=ModelVersionFilter(
                    sort_by=f"{SorterOps.DESCENDING}:number"
                ),
                hydrate=hydrate,
            ).items

            if len(model_versions) > 0:
                model_versions = [model_versions[0]]
            else:
                model_versions = []
        elif model_version_name_or_number_or_id in ModelStages.values():
            model_versions = self.zen_store.list_model_versions(
                model_name_or_id=model_name_or_id,
                model_version_filter_model=ModelVersionFilter(
                    stage=model_version_name_or_number_or_id
                ),
                hydrate=hydrate,
            ).items
        else:
            model_versions = self.zen_store.list_model_versions(
                model_name_or_id=model_name_or_id,
                model_version_filter_model=ModelVersionFilter(
                    name=model_version_name_or_number_or_id
                ),
                hydrate=hydrate,
            ).items
    else:
        raise RuntimeError(
            f"The model version identifier "
            f"`{model_version_name_or_number_or_id}` is not"
            f"of the correct type."
        )

    if len(model_versions) == 1:
        return model_versions[0]
    elif len(model_versions) == 0:
        raise KeyError(
            f"No model version found for model "
            f"`{model_name_or_id}` with version identifier "
            f"`{model_version_name_or_number_or_id}`."
        )
    else:
        raise RuntimeError(
            f"The model version identifier "
            f"`{model_version_name_or_number_or_id}` is not"
            f"unique for model `{model_name_or_id}`."
        )
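
A minimal usage sketch, assuming `ModelStages` is importable from `zenml.enums`; the model name is hypothetical. Without a version identifier, the latest version is returned:

    from zenml.client import Client
    from zenml.enums import ModelStages

    client = Client()
    latest = client.get_model_version("churn_predictor")
    by_number = client.get_model_version("churn_predictor", 3)
    in_prod = client.get_model_version("churn_predictor", ModelStages.PRODUCTION)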

get_pipeline(name_id_or_prefix, hydrate=True)

Get a pipeline by name, id or prefix.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, ID or ID prefix of the pipeline.

required
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
PipelineResponse

The pipeline.

Source code in src/zenml/client.py (lines 2418-2438)
def get_pipeline(
    self,
    name_id_or_prefix: Union[str, UUID],
    hydrate: bool = True,
) -> PipelineResponse:
    """Get a pipeline by name, id or prefix.

    Args:
        name_id_or_prefix: The name, ID or ID prefix of the pipeline.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The pipeline.
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_pipeline,
        list_method=self.list_pipelines,
        name_id_or_prefix=name_id_or_prefix,
        hydrate=hydrate,
    )

get_pipeline_run(name_id_or_prefix, allow_name_prefix_match=True, hydrate=True)

Gets a pipeline run by name, ID, or prefix.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

Name, ID, or prefix of the pipeline run.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
PipelineRunResponse

The pipeline run.

Source code in src/zenml/client.py (lines 3769-3792)
def get_pipeline_run(
    self,
    name_id_or_prefix: Union[str, UUID],
    allow_name_prefix_match: bool = True,
    hydrate: bool = True,
) -> PipelineRunResponse:
    """Gets a pipeline run by name, ID, or prefix.

    Args:
        name_id_or_prefix: Name, ID, or prefix of the pipeline run.
        allow_name_prefix_match: If True, allow matching by name prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The pipeline run.
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_run,
        list_method=self.list_pipeline_runs,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        hydrate=hydrate,
    )
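
A minimal usage sketch; the run name is hypothetical, and a full ID or unique prefix works as well:

    from zenml.client import Client

    run = Client().get_pipeline_run("training_pipeline-2024_06_01-12_00_00_000000")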

get_run_step(step_run_id, hydrate=True)

Get a step run by ID.

Parameters:

Name Type Description Default
step_run_id UUID

The ID of the step run to get.

required
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
StepRunResponse

The step run.

Source code in src/zenml/client.py (lines 3934-3952)
def get_run_step(
    self,
    step_run_id: UUID,
    hydrate: bool = True,
) -> StepRunResponse:
    """Get a step run by ID.

    Args:
        step_run_id: The ID of the step run to get.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The step run.
    """
    return self.zen_store.get_run_step(
        step_run_id,
        hydrate=hydrate,
    )

get_run_template(name_id_or_prefix, hydrate=True)

Get a run template.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

Name/ID/ID prefix of the template to get.

required
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
RunTemplateResponse

The run template.

Source code in src/zenml/client.py (lines 3485-3506)
def get_run_template(
    self,
    name_id_or_prefix: Union[str, UUID],
    hydrate: bool = True,
) -> RunTemplateResponse:
    """Get a run template.

    Args:
        name_id_or_prefix: Name/ID/ID prefix of the template to get.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The run template.
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_run_template,
        list_method=self.list_run_templates,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
        hydrate=hydrate,
    )

get_schedule(name_id_or_prefix, allow_name_prefix_match=True, hydrate=True)

Get a schedule by name, id or prefix.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, id or prefix of the schedule.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
ScheduleResponse

The schedule.

Source code in src/zenml/client.py (lines 3644-3667)
def get_schedule(
    self,
    name_id_or_prefix: Union[str, UUID],
    allow_name_prefix_match: bool = True,
    hydrate: bool = True,
) -> ScheduleResponse:
    """Get a schedule by name, id or prefix.

    Args:
        name_id_or_prefix: The name, id or prefix of the schedule.
        allow_name_prefix_match: If True, allow matching by name prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The schedule.
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_schedule,
        list_method=self.list_schedules,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        hydrate=hydrate,
    )

get_secret(name_id_or_prefix, scope=None, allow_partial_name_match=True, allow_partial_id_match=True, hydrate=True)

Get a secret.

Get a secret identified by a name, ID or prefix of the name or ID and optionally a scope.

If a scope is not provided, the secret will be searched for in all scopes starting with the innermost scope (user) to the outermost scope (workspace). When a name or prefix is used instead of a UUID value, each scope is first searched for an exact match, then for an ID prefix or name substring match before moving on to the next scope.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, ID or ID prefix of the secret to get.

required
scope Optional[SecretScope]

The scope of the secret. If not set, all scopes will be searched starting with the innermost scope (user) to the outermost scope (global) until a secret is found.

None
allow_partial_name_match bool

If True, allow partial name matches.

True
allow_partial_id_match bool

If True, allow partial ID matches.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
SecretResponse

The secret.

Raises:

Type Description
KeyError

If no secret is found.

ZenKeyError

If multiple secrets are found.

NotImplementedError

If centralized secrets management is not enabled.

Source code in src/zenml/client.py (lines 4560-4687)
def get_secret(
    self,
    name_id_or_prefix: Union[str, UUID],
    scope: Optional[SecretScope] = None,
    allow_partial_name_match: bool = True,
    allow_partial_id_match: bool = True,
    hydrate: bool = True,
) -> SecretResponse:
    """Get a secret.

    Get a secret identified by a name, ID or prefix of the name or ID and
    optionally a scope.

    If a scope is not provided, the secret will be searched for in all
    scopes starting with the innermost scope (user) to the outermost scope
    (workspace). When a name or prefix is used instead of a UUID value, each
    scope is first searched for an exact match, then for a ID prefix or
    name substring match before moving on to the next scope.

    Args:
        name_id_or_prefix: The name, ID or prefix to the id of the secret
            to get.
        scope: The scope of the secret. If not set, all scopes will be
            searched starting with the innermost scope (user) to the
            outermost scope (global) until a secret is found.
        allow_partial_name_match: If True, allow partial name matches.
        allow_partial_id_match: If True, allow partial ID matches.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The secret.

    Raises:
        KeyError: If no secret is found.
        ZenKeyError: If multiple secrets are found.
        NotImplementedError: If centralized secrets management is not
            enabled.
    """
    from zenml.utils.uuid_utils import is_valid_uuid

    try:
        # First interpret as full UUID
        if is_valid_uuid(name_id_or_prefix):
            # Fetch by ID; filter by scope if provided
            secret = self.zen_store.get_secret(
                secret_id=UUID(name_id_or_prefix)
                if isinstance(name_id_or_prefix, str)
                else name_id_or_prefix,
                hydrate=hydrate,
            )
            if scope is not None and secret.scope != scope:
                raise KeyError(
                    f"No secret found with ID {str(name_id_or_prefix)}"
                )

            return secret
    except NotImplementedError:
        raise NotImplementedError(
            "centralized secrets management is not supported or explicitly "
            "disabled in the target ZenML deployment."
        )

    # If not a UUID, try to find by name and then by prefix
    assert not isinstance(name_id_or_prefix, UUID)

    # Scopes to search in order of priority
    search_scopes = (
        [SecretScope.USER, SecretScope.WORKSPACE]
        if scope is None
        else [scope]
    )

    secrets = self.list_secrets(
        logical_operator=LogicalOperators.OR,
        name=f"contains:{name_id_or_prefix}"
        if allow_partial_name_match
        else f"equals:{name_id_or_prefix}",
        id=f"startswith:{name_id_or_prefix}"
        if allow_partial_id_match
        else None,
        hydrate=hydrate,
    )

    for search_scope in search_scopes:
        partial_matches: List[SecretResponse] = []
        for secret in secrets.items:
            if secret.scope != search_scope:
                continue
            # Exact match
            if secret.name == name_id_or_prefix:
                # Need to fetch the secret again to get the secret values
                return self.zen_store.get_secret(
                    secret_id=secret.id,
                    hydrate=hydrate,
                )
            # Partial match
            partial_matches.append(secret)

        if len(partial_matches) > 1:
            match_summary = "\n".join(
                [
                    f"[{secret.id}]: name = {secret.name}"
                    for secret in partial_matches
                ]
            )
            raise ZenKeyError(
                f"{len(partial_matches)} secrets have been found that have "
                f"a name or ID that matches the provided "
                f"string '{name_id_or_prefix}':\n"
                f"{match_summary}.\n"
                f"Please use the id to uniquely identify "
                f"only one of the secrets."
            )

        # If only a single secret is found, return it
        if len(partial_matches) == 1:
            # Need to fetch the secret again to get the secret values
            return self.zen_store.get_secret(
                secret_id=partial_matches[0].id,
                hydrate=hydrate,
            )

    msg = f"No secret found with name, ID or prefix '{name_id_or_prefix}'"
    if scope is not None:
        msg += f" in scope '{scope}'"

    raise KeyError(msg)
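
A minimal usage sketch, assuming `SecretScope` is importable from `zenml.enums`; the secret name is hypothetical. Without a scope, the user scope is searched before the workspace scope:

    from zenml.client import Client
    from zenml.enums import SecretScope

    client = Client()
    secret = client.get_secret("aws_credentials")

    # Restrict the lookup to one scope and disable fuzzy matching.
    secret = client.get_secret(
        "aws_credentials",
        scope=SecretScope.WORKSPACE,
        allow_partial_name_match=False,
        allow_partial_id_match=False,
    )
    print(list(secret.values))  # keys of the secret values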

get_secret_by_name_and_scope(name, scope=None, hydrate=True)

Fetches a registered secret with a given name and optional scope.

This is a version of get_secret that restricts the search to a given name and an optional scope, without doing any prefix or UUID matching.

If no scope is provided, the search will be done first in the user scope, then in the workspace scope.

Parameters:

Name Type Description Default
name str

The name of the secret to get.

required
scope Optional[SecretScope]

The scope of the secret to get.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
SecretResponse

The registered secret.

Raises:

Type Description
KeyError

If no secret exists for the given name in the given scope.

Source code in src/zenml/client.py (lines 4847-4902)
def get_secret_by_name_and_scope(
    self,
    name: str,
    scope: Optional[SecretScope] = None,
    hydrate: bool = True,
) -> SecretResponse:
    """Fetches a registered secret with a given name and optional scope.

    This is a version of get_secret that restricts the search to a given
    name and an optional scope, without doing any prefix or UUID matching.

    If no scope is provided, the search will be done first in the user
    scope, then in the workspace scope.

    Args:
        name: The name of the secret to get.
        scope: The scope of the secret to get.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The registered secret.

    Raises:
        KeyError: If no secret exists for the given name in the given scope.
    """
    logger.debug(
        f"Fetching the secret with name '{name}' and scope '{scope}'."
    )

    # Scopes to search in order of priority
    search_scopes = (
        [SecretScope.USER, SecretScope.WORKSPACE]
        if scope is None
        else [scope]
    )

    for search_scope in search_scopes:
        secrets = self.list_secrets(
            logical_operator=LogicalOperators.AND,
            name=f"equals:{name}",
            scope=search_scope,
            hydrate=hydrate,
        )

        if len(secrets.items) >= 1:
            # Need to fetch the secret again to get the secret values
            return self.zen_store.get_secret(
                secret_id=secrets.items[0].id, hydrate=hydrate
            )

    msg = f"No secret with name '{name}' was found"
    if scope is not None:
        msg += f" in scope '{scope.value}'"

    raise KeyError(msg)
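
A minimal usage sketch; the secret name is hypothetical. Only exact name matches are considered, and a `KeyError` is raised if nothing is found:

    from zenml.client import Client

    secret = Client().get_secret_by_name_and_scope(name="aws_credentials")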

get_service(name_id_or_prefix, allow_name_prefix_match=True, hydrate=True, type=None)

Gets a service.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name or ID of the service.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True
type Optional[str]

The type of the service.

None

Returns:

Type Description
ServiceResponse

The Service

Source code in src/zenml/client.py (lines 1642-1691)
def get_service(
    self,
    name_id_or_prefix: Union[str, UUID],
    allow_name_prefix_match: bool = True,
    hydrate: bool = True,
    type: Optional[str] = None,
) -> ServiceResponse:
    """Gets a service.

    Args:
        name_id_or_prefix: The name or ID of the service.
        allow_name_prefix_match: If True, allow matching by name prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.
        type: The type of the service.

    Returns:
        The Service
    """

    def type_scoped_list_method(
        hydrate: bool = True,
        **kwargs: Any,
    ) -> Page[ServiceResponse]:
        """Call `zen_store.list_services` with type scoping.

        Args:
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.
            **kwargs: Keyword arguments to pass to `ServiceFilterModel`.

        Returns:
            The type-scoped list of services.
        """
        service_filter_model = ServiceFilter(**kwargs)
        if type:
            service_filter_model.set_type(type=type)
        service_filter_model.set_scope_workspace(self.active_workspace.id)
        return self.zen_store.list_services(
            filter_model=service_filter_model,
            hydrate=hydrate,
        )

    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_service,
        list_method=type_scoped_list_method,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        hydrate=hydrate,
    )
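
A minimal usage sketch; the service name and `type` string are hypothetical. The optional `type` narrows the lookup via the type-scoped list method shown above:

    from zenml.client import Client

    service = Client().get_service("my-deployment", type="model-serving")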

get_service_account(name_id_or_prefix, allow_name_prefix_match=True, hydrate=True)

Gets a service account.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name or ID of the service account.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
ServiceAccountResponse

The ServiceAccount

Source code in src/zenml/client.py (lines 7106-7129)
def get_service_account(
    self,
    name_id_or_prefix: Union[str, UUID],
    allow_name_prefix_match: bool = True,
    hydrate: bool = True,
) -> ServiceAccountResponse:
    """Gets a service account.

    Args:
        name_id_or_prefix: The name or ID of the service account.
        allow_name_prefix_match: If True, allow matching by name prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The ServiceAccount
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_service_account,
        list_method=self.list_service_accounts,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        hydrate=hydrate,
    )

get_service_connector(name_id_or_prefix, allow_name_prefix_match=True, load_secrets=False, hydrate=True)

Fetches a registered service connector.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name, ID or ID prefix of the service connector to fetch.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
load_secrets bool

If True, load the secrets for the service connector.

False
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
ServiceConnectorResponse

The registered service connector.

Source code in src/zenml/client.py (lines 5391-5458)
def get_service_connector(
    self,
    name_id_or_prefix: Union[str, UUID],
    allow_name_prefix_match: bool = True,
    load_secrets: bool = False,
    hydrate: bool = True,
) -> ServiceConnectorResponse:
    """Fetches a registered service connector.

    Args:
        name_id_or_prefix: The id of the service connector to fetch.
        allow_name_prefix_match: If True, allow matching by name prefix.
        load_secrets: If True, load the secrets for the service connector.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The registered service connector.
    """

    def scoped_list_method(
        hydrate: bool = False,
        **kwargs: Any,
    ) -> Page[ServiceConnectorResponse]:
        """Call `zen_store.list_service_connectors` with workspace scoping.

        Args:
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.
            **kwargs: Keyword arguments to pass to
                `ServiceConnectorFilterModel`.

        Returns:
            The list of service connectors.
        """
        filter_model = ServiceConnectorFilter(**kwargs)
        filter_model.set_scope_workspace(self.active_workspace.id)
        return self.zen_store.list_service_connectors(
            filter_model=filter_model,
            hydrate=hydrate,
        )

    connector = self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_service_connector,
        list_method=scoped_list_method,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        hydrate=hydrate,
    )

    if load_secrets and connector.secret_id:
        client = Client()
        try:
            secret = client.get_secret(
                name_id_or_prefix=connector.secret_id,
                allow_partial_id_match=False,
                allow_partial_name_match=False,
            )
        except KeyError as err:
            logger.error(
                "Unable to retrieve secret values associated with "
                f"service connector '{connector.name}': {err}"
            )
        else:
            # Add secret values to connector configuration
            connector.secrets.update(secret.values)

    return connector
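
A minimal usage sketch; the connector name is hypothetical. With `load_secrets=True`, any secret values referenced by the connector are merged into its configuration:

    from zenml.client import Client

    connector = Client().get_service_connector(
        "aws-connector", load_secrets=True
    )
    print(connector.name, connector.type)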

get_service_connector_client(name_id_or_prefix, resource_type=None, resource_id=None, verify=False)

Get the client side of a service connector instance to use with a local client.

Parameters:

Name Type Description Default
name_id_or_prefix Union[UUID, str]

The name, id or prefix of the service connector to use.

required
resource_type Optional[str]

The type of the resource to connect to. If not provided, the resource type from the service connector configuration will be used.

None
resource_id Optional[str]

The ID of a particular resource instance to configure the local client to connect to. If the connector instance is already configured with a resource ID that is not the same or equivalent to the one requested, a ValueError exception is raised. May be omitted for connectors and resource types that do not support multiple resource instances.

None
verify bool

Whether to verify that the service connector configuration and credentials can be used to gain access to the resource.

False

Returns:

Type Description
ServiceConnector

The client side of the indicated service connector instance that can be used to connect to the resource locally.

Source code in src/zenml/client.py (lines 5898-5975)
def get_service_connector_client(
    self,
    name_id_or_prefix: Union[UUID, str],
    resource_type: Optional[str] = None,
    resource_id: Optional[str] = None,
    verify: bool = False,
) -> "ServiceConnector":
    """Get the client side of a service connector instance to use with a local client.

    Args:
        name_id_or_prefix: The name, id or prefix of the service connector
            to use.
        resource_type: The type of the resource to connect to. If not
            provided, the resource type from the service connector
            configuration will be used.
        resource_id: The ID of a particular resource instance to configure
            the local client to connect to. If the connector instance is
            already configured with a resource ID that is not the same or
            equivalent to the one requested, a `ValueError` exception is
            raised. May be omitted for connectors and resource types that do
            not support multiple resource instances.
        verify: Whether to verify that the service connector configuration
            and credentials can be used to gain access to the resource.

    Returns:
        The client side of the indicated service connector instance that can
        be used to connect to the resource locally.
    """
    from zenml.service_connectors.service_connector_registry import (
        service_connector_registry,
    )

    # Get the service connector model
    service_connector = self.get_service_connector(
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
    )

    connector_type = self.get_service_connector_type(
        service_connector.type
    )

    # Prefer to fetch the connector client from the server if the
    # implementation is available there, because some auth methods rely on
    # the server-side authentication environment
    if connector_type.remote:
        connector_client_model = (
            self.zen_store.get_service_connector_client(
                service_connector_id=service_connector.id,
                resource_type=resource_type,
                resource_id=resource_id,
            )
        )

        connector_client = (
            service_connector_registry.instantiate_connector(
                model=connector_client_model
            )
        )

        if verify:
            # Verify the connector client on the local machine, because the
            # server-side implementation may not be able to do so
            connector_client.verify()
    else:
        connector_instance = (
            service_connector_registry.instantiate_connector(
                model=service_connector
            )
        )

        # Fetch the connector client
        connector_client = connector_instance.get_connector_client(
            resource_type=resource_type,
            resource_id=resource_id,
        )

    return connector_client
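
A sketch of fetching a locally usable connector client; the connector name, resource type and resource ID are placeholders for values from your own deployment:

from zenml.client import Client

client = Client()

# All identifiers below are placeholders for values in your deployment.
connector_client = client.get_service_connector_client(
    name_id_or_prefix="s3-zenml-bucket",
    resource_type="s3-bucket",
    resource_id="s3://my-bucket",
    verify=True,  # fail early if the credentials cannot access the resource
)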

get_service_connector_type(connector_type)

Returns the requested service connector type.

Parameters:

Name Type Description Default
connector_type str

the service connector type identifier.

required

Returns:

Type Description
ServiceConnectorTypeModel

The requested service connector type.

Source code in src/zenml/client.py (lines 6023-6037)
def get_service_connector_type(
    self,
    connector_type: str,
) -> ServiceConnectorTypeModel:
    """Returns the requested service connector type.

    Args:
        connector_type: the service connector type identifier.

    Returns:
        The requested service connector type.
    """
    return self.zen_store.get_service_connector_type(
        connector_type=connector_type,
    )
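
A quick sketch, assuming an "aws" connector type is available on the server:

from zenml.client import Client

client = Client()

# "aws" is assumed to be an available connector type identifier.
connector_type = client.get_service_connector_type("aws")
print(connector_type.name)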

get_settings(hydrate=True)

Get the server settings.

Parameters:

Name Type Description Default
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
ServerSettingsResponse

The server settings.

Source code in src/zenml/client.py (lines 705-715)
def get_settings(self, hydrate: bool = True) -> ServerSettingsResponse:
    """Get the server settings.

    Args:
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The server settings.
    """
    return self.zen_store.get_server_settings(hydrate=hydrate)

get_stack(name_id_or_prefix=None, allow_name_prefix_match=True, hydrate=True)

Get a stack by name, ID or prefix.

If no name, ID or prefix is provided, the active stack is returned.

Parameters:

Name Type Description Default
name_id_or_prefix Optional[Union[UUID, str]]

The name, ID or prefix of the stack.

None
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
StackResponse

The stack.

Source code in src/zenml/client.py (lines 1199-1227)
def get_stack(
    self,
    name_id_or_prefix: Optional[Union[UUID, str]] = None,
    allow_name_prefix_match: bool = True,
    hydrate: bool = True,
) -> StackResponse:
    """Get a stack by name, ID or prefix.

    If no name, ID or prefix is provided, the active stack is returned.

    Args:
        name_id_or_prefix: The name, ID or prefix of the stack.
        allow_name_prefix_match: If True, allow matching by name prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The stack.
    """
    if name_id_or_prefix is not None:
        return self._get_entity_by_id_or_name_or_prefix(
            get_method=self.zen_store.get_stack,
            list_method=self.list_stacks,
            name_id_or_prefix=name_id_or_prefix,
            allow_name_prefix_match=allow_name_prefix_match,
            hydrate=hydrate,
        )
    else:
        return self.active_stack_model
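
For example, calling the method without an identifier falls back to the active stack, while a unique name prefix resolves a registered stack (the prefix below is a placeholder):

from zenml.client import Client

client = Client()

active_stack = client.get_stack()        # no identifier: the active stack
named_stack = client.get_stack("local")  # placeholder name prefix
print(active_stack.name, named_stack.name)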

get_stack_component(component_type, name_id_or_prefix=None, allow_name_prefix_match=True, hydrate=True)

Fetches a registered stack component.

If the name_id_or_prefix is provided, it will try to fetch the component with the corresponding identifier. If not, it will try to fetch the active component of the given type.

Parameters:

Name Type Description Default
component_type StackComponentType

The type of the component to fetch

required
name_id_or_prefix Optional[Union[str, UUID]]

The name, ID or prefix of the component to fetch.

None
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
ComponentResponse

The registered stack component.

Raises:

Type Description
KeyError

If no name_id_or_prefix is provided and no such component is part of the active stack.

Source code in src/zenml/client.py (lines 1838-1910)
def get_stack_component(
    self,
    component_type: StackComponentType,
    name_id_or_prefix: Optional[Union[str, UUID]] = None,
    allow_name_prefix_match: bool = True,
    hydrate: bool = True,
) -> ComponentResponse:
    """Fetches a registered stack component.

    If the name_id_or_prefix is provided, it will try to fetch the component
    with the corresponding identifier. If not, it will try to fetch the
    active component of the given type.

    Args:
        component_type: The type of the component to fetch
        name_id_or_prefix: The name, ID or prefix of the component
            to fetch.
        allow_name_prefix_match: If True, allow matching by name prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The registered stack component.

    Raises:
        KeyError: If no name_id_or_prefix is provided and no such component
            is part of the active stack.
    """
    # If no `name_id_or_prefix` provided, try to get the active component.
    if not name_id_or_prefix:
        components = self.active_stack_model.components.get(
            component_type, None
        )
        if components:
            return components[0]
        raise KeyError(
            "No name_id_or_prefix provided and there is no active "
            f"{component_type} in the current active stack."
        )

    # Else, try to fetch the component with an explicit type filter
    def type_scoped_list_method(
        hydrate: bool = False,
        **kwargs: Any,
    ) -> Page[ComponentResponse]:
        """Call `zen_store.list_stack_components` with type scoping.

        Args:
            hydrate: Flag deciding whether to hydrate the output model(s)
                by including metadata fields in the response.
            **kwargs: Keyword arguments to pass to `ComponentFilterModel`.

        Returns:
            The type-scoped list of components.
        """
        component_filter_model = ComponentFilter(**kwargs)
        component_filter_model.set_scope_type(
            component_type=component_type
        )
        component_filter_model.set_scope_workspace(
            self.active_workspace.id
        )
        return self.zen_store.list_stack_components(
            component_filter_model=component_filter_model,
            hydrate=hydrate,
        )

    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_stack_component,
        list_method=type_scoped_list_method,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        hydrate=hydrate,
    )
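
A short sketch showing both modes; the artifact store name is a placeholder:

from zenml.client import Client
from zenml.enums import StackComponentType

client = Client()

# Without a name, the active component of the given type is returned.
orchestrator = client.get_stack_component(StackComponentType.ORCHESTRATOR)

# With a name prefix ("s3_store" is a placeholder), the matching component
# of that type is fetched instead.
artifact_store = client.get_stack_component(
    StackComponentType.ARTIFACT_STORE, "s3_store"
)
print(orchestrator.name, artifact_store.name)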

get_tag(tag_name_or_id, hydrate=True)

Get an existing tag.

Parameters:

Name Type Description Default
tag_name_or_id Union[str, UUID]

The name or ID of the tag to be retrieved.

required
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
TagResponse

The tag of interest.

Source code in src/zenml/client.py (lines 7552-7567)
def get_tag(
    self, tag_name_or_id: Union[str, UUID], hydrate: bool = True
) -> TagResponse:
    """Get an existing tag.

    Args:
        tag_name_or_id: The name or ID of the tag to be retrieved.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The tag of interest.
    """
    return self.zen_store.get_tag(
        tag_name_or_id=tag_name_or_id, hydrate=hydrate
    )
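
For example (the tag name is a placeholder):

from zenml.client import Client

client = Client()

tag = client.get_tag("production")  # placeholder tag name
print(tag.id, tag.name)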

get_trigger(name_id_or_prefix, allow_name_prefix_match=True, hydrate=True)

Get a trigger by name, ID or prefix.

Parameters:

Name Type Description Default
name_id_or_prefix Union[UUID, str]

The name, ID or prefix of the trigger.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
TriggerResponse

The trigger.

Source code in src/zenml/client.py (lines 3156-3180)
@_fail_for_sql_zen_store
def get_trigger(
    self,
    name_id_or_prefix: Union[UUID, str],
    allow_name_prefix_match: bool = True,
    hydrate: bool = True,
) -> TriggerResponse:
    """Get a trigger by name, ID or prefix.

    Args:
        name_id_or_prefix: The name, ID or prefix of the trigger.
        allow_name_prefix_match: If True, allow matching by name prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The trigger.
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_trigger,
        list_method=self.list_triggers,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        hydrate=hydrate,
    )
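
A sketch, keeping in mind that triggers need a ZenML server (the call fails against a local SQL store, as the decorator above indicates); the trigger name is a placeholder:

from zenml.client import Client

client = Client()

# Requires a ZenML server; "nightly-retrain" is a placeholder trigger name.
trigger = client.get_trigger("nightly-retrain")
print(trigger.id, trigger.name)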

get_trigger_execution(trigger_execution_id, hydrate=True)

Get a trigger execution by ID.

Parameters:

Name Type Description Default
trigger_execution_id UUID

The ID of the trigger execution to get.

required
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
TriggerExecutionResponse

The trigger execution.

Source code in src/zenml/client.py (lines 6802-6819)
def get_trigger_execution(
    self,
    trigger_execution_id: UUID,
    hydrate: bool = True,
) -> TriggerExecutionResponse:
    """Get a trigger execution by ID.

    Args:
        trigger_execution_id: The ID of the trigger execution to get.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The trigger execution.
    """
    return self.zen_store.get_trigger_execution(
        trigger_execution_id=trigger_execution_id, hydrate=hydrate
    )

get_user(name_id_or_prefix, allow_name_prefix_match=True, hydrate=True)

Gets a user.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name or ID of the user.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
UserResponse

The user.

Source code in src/zenml/client.py (lines 781-804)
def get_user(
    self,
    name_id_or_prefix: Union[str, UUID],
    allow_name_prefix_match: bool = True,
    hydrate: bool = True,
) -> UserResponse:
    """Gets a user.

    Args:
        name_id_or_prefix: The name or ID of the user.
        allow_name_prefix_match: If True, allow matching by name prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The user.
    """
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_user,
        list_method=self.list_users,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        hydrate=hydrate,
    )

get_workspace(name_id_or_prefix, allow_name_prefix_match=True, hydrate=True)

Gets a workspace.

Parameters:

Name Type Description Default
name_id_or_prefix Optional[Union[UUID, str]]

The name or ID of the workspace. If None, the active workspace is returned.

required
allow_name_prefix_match bool

If True, allow matching by name prefix.

True
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

True

Returns:

Type Description
WorkspaceResponse

The workspace

Source code in src/zenml/client.py (lines 985-1010)
def get_workspace(
    self,
    name_id_or_prefix: Optional[Union[UUID, str]],
    allow_name_prefix_match: bool = True,
    hydrate: bool = True,
) -> WorkspaceResponse:
    """Gets a workspace.

    Args:
        name_id_or_prefix: The name or ID of the workspace. If None, the
            active workspace is returned.
        allow_name_prefix_match: If True, allow matching by name prefix.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The workspace
    """
    if not name_id_or_prefix:
        return self.active_workspace
    return self._get_entity_by_id_or_name_or_prefix(
        get_method=self.zen_store.get_workspace,
        list_method=self.list_workspaces,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=allow_name_prefix_match,
        hydrate=hydrate,
    )
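
For example, passing None returns the active workspace, while a name resolves a specific one ("default" is the workspace ZenML creates out of the box):

from zenml.client import Client

client = Client()

active_ws = client.get_workspace(None)        # the active workspace
default_ws = client.get_workspace("default")  # the built-in default workspace
print(active_ws.name, default_ws.name)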

initialize(root=None) staticmethod

Initializes a new ZenML repository at the given path.

Parameters:

Name Type Description Default
root Optional[Path]

The root directory where the repository should be created. If None, the current working directory is used.

None

Raises:

Type Description
InitializationException

If the root directory already contains a ZenML repository.

Source code in src/zenml/client.py (lines 497-521)
@staticmethod
def initialize(
    root: Optional[Path] = None,
) -> None:
    """Initializes a new ZenML repository at the given path.

    Args:
        root: The root directory where the repository should be created.
            If None, the current working directory is used.

    Raises:
        InitializationException: If the root directory already contains a
            ZenML repository.
    """
    root = root or Path.cwd()
    logger.debug("Initializing new repository at path %s.", root)
    if Client.is_repository_directory(root):
        raise InitializationException(
            f"Found existing ZenML repository at path '{root}'."
        )

    config_directory = str(root / REPOSITORY_DIRECTORY_NAME)
    io_utils.create_dir_recursive_if_not_exists(config_directory)
    # Initialize the repository configuration at the custom path
    Client(root=root)
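
A sketch of the programmatic equivalent of zenml init; the target path is a placeholder, and importing InitializationException from zenml.exceptions is an assumption about where that exception is defined:

from pathlib import Path

from zenml.client import Client
from zenml.exceptions import InitializationException  # assumed import location

try:
    Client.initialize(Path("/tmp/my-project"))  # placeholder path
except InitializationException:
    print("A ZenML repository already exists at that path.")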

is_inside_repository(file_path) staticmethod

Returns whether a file is inside the active ZenML repository.

Parameters:

Name Type Description Default
file_path str

A file path.

required

Returns:

Type Description
bool

True if the file is inside the active ZenML repository, False otherwise.

Source code in src/zenml/client.py (lines 622-635)
@staticmethod
def is_inside_repository(file_path: str) -> bool:
    """Returns whether a file is inside the active ZenML repository.

    Args:
        file_path: A file path.

    Returns:
        True if the file is inside the active ZenML repository, False
        otherwise.
    """
    if repo_path := Client.find_repository():
        return repo_path in Path(file_path).resolve().parents
    return False

is_repository_directory(path) staticmethod

Checks whether a ZenML client exists at the given path.

Parameters:

Name Type Description Default
path Path

The path to check.

required

Returns:

Type Description
bool

True if a ZenML client exists at the given path, False otherwise.

Source code in src/zenml/client.py (lines 533-545)
@staticmethod
def is_repository_directory(path: Path) -> bool:
    """Checks whether a ZenML client exists at the given path.

    Args:
        path: The path to check.

    Returns:
        True if a ZenML client exists at the given path,
        False otherwise.
    """
    config_dir = path / REPOSITORY_DIRECTORY_NAME
    return fileio.isdir(str(config_dir))
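
A small sketch combining the two static helpers above:

from pathlib import Path

from zenml.client import Client

print(Client.is_repository_directory(Path.cwd()))  # is there a .zen folder here?
print(Client.is_inside_repository(__file__))       # is this file inside an initialized repo?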

list_actions(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, flavor=None, action_type=None, workspace_id=None, user_id=None, user=None, hydrate=False)

List actions.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of the action to filter by.

None
created Optional[datetime]

Use to filter by time of creation

None
updated Optional[datetime]

Use the last updated date for filtering

None
workspace_id Optional[Union[str, UUID]]

The id of the workspace to filter by.

None
user_id Optional[Union[str, UUID]]

The id of the user to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
name Optional[str]

The name of the action to filter by.

None
flavor Optional[str]

The flavor of the action to filter by.

None
action_type Optional[str]

The type of the action to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[ActionResponse]

A page of actions.

Source code in src/zenml/client.py (lines 3005-3061)
@_fail_for_sql_zen_store
def list_actions(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[datetime] = None,
    updated: Optional[datetime] = None,
    name: Optional[str] = None,
    flavor: Optional[str] = None,
    action_type: Optional[str] = None,
    workspace_id: Optional[Union[str, UUID]] = None,
    user_id: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[ActionResponse]:
    """List actions.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of the action to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        workspace_id: The id of the workspace to filter by.
        user_id: The id of the user to filter by.
        user: Filter by user name/ID.
        name: The name of the action to filter by.
        flavor: The flavor of the action to filter by.
        action_type: The type of the action to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of actions.
    """
    filter_model = ActionFilter(
        page=page,
        size=size,
        sort_by=sort_by,
        logical_operator=logical_operator,
        workspace_id=workspace_id,
        user_id=user_id,
        user=user,
        name=name,
        id=id,
        flavor=flavor,
        plugin_subtype=action_type,
        created=created,
        updated=updated,
    )
    filter_model.set_scope_workspace(self.active_workspace.id)
    return self.zen_store.list_actions(filter_model, hydrate=hydrate)
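
A sketch (actions, like triggers, require a ZenML server; the flavor value is a placeholder):

from zenml.client import Client

client = Client()

# Requires a ZenML server; "builtin" is a placeholder flavor value.
actions = client.list_actions(flavor="builtin", size=20)
for action in actions.items:
    print(action.name)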

list_api_keys(service_account_name_id_or_prefix, sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, description=None, active=None, last_login=None, last_rotated=None, hydrate=False)

List all API keys.

Parameters:

Name Type Description Default
service_account_name_id_or_prefix Union[str, UUID]

The name, ID or prefix of the service account to list the API keys for.

required
sort_by str

The column to sort by.

'created'
page int

The page of items.

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages.

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or].

AND
id Optional[Union[UUID, str]]

Use the id of the API key to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation.

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering.

None
name Optional[str]

The name of the API key to filter by.

None
description Optional[str]

The description of the API key to filter by.

None
active Optional[bool]

Whether the API key is active or not.

None
last_login Optional[Union[datetime, str]]

The last time the API key was used.

None
last_rotated Optional[Union[datetime, str]]

The last time the API key was rotated.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[APIKeyResponse]

A page of API keys matching the filter description.

Source code in src/zenml/client.py (lines 7296-7358)
def list_api_keys(
    self,
    service_account_name_id_or_prefix: Union[str, UUID],
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    description: Optional[str] = None,
    active: Optional[bool] = None,
    last_login: Optional[Union[datetime, str]] = None,
    last_rotated: Optional[Union[datetime, str]] = None,
    hydrate: bool = False,
) -> Page[APIKeyResponse]:
    """List all API keys.

    Args:
        service_account_name_id_or_prefix: The name, ID or prefix of the
            service account to list the API keys for.
        sort_by: The column to sort by.
        page: The page of items.
        size: The maximum size of all pages.
        logical_operator: Which logical operator to use [and, or].
        id: Use the id of the API key to filter by.
        created: Use to filter by time of creation.
        updated: Use the last updated date for filtering.
        name: The name of the API key to filter by.
        description: The description of the API key to filter by.
        active: Whether the API key is active or not.
        last_login: The last time the API key was used.
        last_rotated: The last time the API key was rotated.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of API keys matching the filter description.
    """
    service_account = self.get_service_account(
        name_id_or_prefix=service_account_name_id_or_prefix,
        allow_name_prefix_match=False,
    )
    filter_model = APIKeyFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        created=created,
        updated=updated,
        name=name,
        description=description,
        active=active,
        last_login=last_login,
        last_rotated=last_rotated,
    )
    return self.zen_store.list_api_keys(
        service_account_id=service_account.id,
        filter_model=filter_model,
        hydrate=hydrate,
    )
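
For example, listing the active keys of one service account (the account name is a placeholder):

from zenml.client import Client

client = Client()

api_keys = client.list_api_keys(
    service_account_name_id_or_prefix="ci-bot",  # placeholder service account
    active=True,
    sort_by="created",
)
for key in api_keys.items:
    print(key.name)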

list_artifact_versions(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, artifact_id=None, name=None, version=None, version_number=None, artifact_store_id=None, type=None, data_type=None, uri=None, materializer=None, workspace_id=None, user_id=None, model_version_id=None, only_unused=False, has_custom_name=None, user=None, model=None, pipeline_run=None, run_metadata=None, tag=None, hydrate=False)

Get a list of artifact versions.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of artifact version to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
artifact_id Optional[Union[str, UUID]]

The id of the artifact to filter by.

None
name Optional[str]

The name of the artifact to filter by.

None
version Optional[Union[str, int]]

The version of the artifact to filter by.

None
version_number Optional[int]

The version number of the artifact to filter by.

None
artifact_store_id Optional[Union[str, UUID]]

The id of the artifact store to filter by.

None
type Optional[ArtifactType]

The type of the artifact to filter by.

None
data_type Optional[str]

The data type of the artifact to filter by.

None
uri Optional[str]

The uri of the artifact to filter by.

None
materializer Optional[str]

The materializer of the artifact to filter by.

None
workspace_id Optional[Union[str, UUID]]

The id of the workspace to filter by.

None
user_id Optional[Union[str, UUID]]

The id of the user to filter by.

None
model_version_id Optional[Union[str, UUID]]

Filter by model version ID.

None
only_unused Optional[bool]

Only return artifact versions that are not used in any pipeline runs.

False
has_custom_name Optional[bool]

Filter artifacts with/without custom names.

None
tag Optional[str]

A tag to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name or ID.

None
model Optional[Union[UUID, str]]

Filter by model name or ID.

None
pipeline_run Optional[Union[UUID, str]]

Filter by pipeline run name or ID.

None
run_metadata Optional[Dict[str, Any]]

Filter by run metadata.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[ArtifactVersionResponse]

A list of artifact versions.

Source code in src/zenml/client.py (lines 4234-4334)
def list_artifact_versions(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    artifact_id: Optional[Union[str, UUID]] = None,
    name: Optional[str] = None,
    version: Optional[Union[str, int]] = None,
    version_number: Optional[int] = None,
    artifact_store_id: Optional[Union[str, UUID]] = None,
    type: Optional[ArtifactType] = None,
    data_type: Optional[str] = None,
    uri: Optional[str] = None,
    materializer: Optional[str] = None,
    workspace_id: Optional[Union[str, UUID]] = None,
    user_id: Optional[Union[str, UUID]] = None,
    model_version_id: Optional[Union[str, UUID]] = None,
    only_unused: Optional[bool] = False,
    has_custom_name: Optional[bool] = None,
    user: Optional[Union[UUID, str]] = None,
    model: Optional[Union[UUID, str]] = None,
    pipeline_run: Optional[Union[UUID, str]] = None,
    run_metadata: Optional[Dict[str, Any]] = None,
    tag: Optional[str] = None,
    hydrate: bool = False,
) -> Page[ArtifactVersionResponse]:
    """Get a list of artifact versions.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of artifact version to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        artifact_id: The id of the artifact to filter by.
        name: The name of the artifact to filter by.
        version: The version of the artifact to filter by.
        version_number: The version number of the artifact to filter by.
        artifact_store_id: The id of the artifact store to filter by.
        type: The type of the artifact to filter by.
        data_type: The data type of the artifact to filter by.
        uri: The uri of the artifact to filter by.
        materializer: The materializer of the artifact to filter by.
        workspace_id: The id of the workspace to filter by.
        user_id: The id of the user to filter by.
        model_version_id: Filter by model version ID.
        only_unused: Only return artifact versions that are not used in
            any pipeline runs.
        has_custom_name: Filter artifacts with/without custom names.
        tag: A tag to filter by.
        user: Filter by user name or ID.
        model: Filter by model name or ID.
        pipeline_run: Filter by pipeline run name or ID.
        run_metadata: Filter by run metadata.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A list of artifact versions.
    """
    artifact_version_filter_model = ArtifactVersionFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        created=created,
        updated=updated,
        artifact_id=artifact_id,
        name=name,
        version=str(version) if version else None,
        version_number=version_number,
        artifact_store_id=artifact_store_id,
        type=type,
        data_type=data_type,
        uri=uri,
        materializer=materializer,
        workspace_id=workspace_id,
        user_id=user_id,
        model_version_id=model_version_id,
        only_unused=only_unused,
        has_custom_name=has_custom_name,
        tag=tag,
        user=user,
        model=model,
        pipeline_run=pipeline_run,
        run_metadata=run_metadata,
    )
    artifact_version_filter_model.set_scope_workspace(
        self.active_workspace.id
    )
    return self.zen_store.list_artifact_versions(
        artifact_version_filter_model,
        hydrate=hydrate,
    )
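
A filtered query sketch; the artifact name is a placeholder:

from zenml.client import Client

client = Client()

versions = client.list_artifact_versions(
    name="my_dataset",  # placeholder artifact name
    only_unused=True,   # exclude versions referenced by pipeline runs
    size=50,
)
print(f"{versions.total} unused versions")
for version in versions.items:
    print(version.id, version.version)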

list_artifacts(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, has_custom_name=None, hydrate=False, tag=None)

Get a list of artifacts.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of artifact to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
name Optional[str]

The name of the artifact to filter by.

None
has_custom_name Optional[bool]

Filter artifacts with/without custom names.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False
tag Optional[str]

Filter artifacts by tag.

None

Returns:

Type Description
Page[ArtifactResponse]

A list of artifacts.

Source code in src/zenml/client.py (lines 4065-4113)
def list_artifacts(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    has_custom_name: Optional[bool] = None,
    hydrate: bool = False,
    tag: Optional[str] = None,
) -> Page[ArtifactResponse]:
    """Get a list of artifacts.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of artifact to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        name: The name of the artifact to filter by.
        has_custom_name: Filter artifacts with/without custom names.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.
        tag: Filter artifacts by tag.

    Returns:
        A list of artifacts.
    """
    artifact_filter_model = ArtifactFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        created=created,
        updated=updated,
        name=name,
        has_custom_name=has_custom_name,
        tag=tag,
    )
    return self.zen_store.list_artifacts(
        artifact_filter_model,
        hydrate=hydrate,
    )
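
Every list_* method returns a Page, so paging through large result sets follows the same pattern; a sketch using artifacts filtered by a placeholder tag:

from zenml.client import Client

client = Client()

page = 1
while True:
    artifacts = client.list_artifacts(tag="training-data", page=page, size=100)
    for artifact in artifacts.items:
        print(artifact.name)
    if page >= artifacts.total_pages:
        break
    page += 1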

list_authorized_devices(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, expires=None, client_id=None, status=None, trusted_device=None, user=None, failed_auth_attempts=None, last_login=None, hydrate=False)

List all authorized devices.

Parameters:

Name Type Description Default
sort_by str

The column to sort by.

'created'
page int

The page of items.

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages.

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or].

AND
id Optional[Union[UUID, str]]

Use the id of the code repository to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation.

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering.

None
expires Optional[Union[datetime, str]]

Use the expiration date for filtering.

None
client_id Union[UUID, str, None]

Use the client id for filtering.

None
status Union[OAuthDeviceStatus, str, None]

Use the status for filtering.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
trusted_device Union[bool, str, None]

Use the trusted device flag for filtering.

None
failed_auth_attempts Union[int, str, None]

Use the failed auth attempts for filtering.

None
last_login Optional[Union[datetime, str, None]]

Use the last login date for filtering.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[OAuthDeviceResponse]

A page of authorized devices matching the filter.

Source code in src/zenml/client.py (lines 6657-6717)
def list_authorized_devices(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    expires: Optional[Union[datetime, str]] = None,
    client_id: Union[UUID, str, None] = None,
    status: Union[OAuthDeviceStatus, str, None] = None,
    trusted_device: Union[bool, str, None] = None,
    user: Optional[Union[UUID, str]] = None,
    failed_auth_attempts: Union[int, str, None] = None,
    last_login: Optional[Union[datetime, str, None]] = None,
    hydrate: bool = False,
) -> Page[OAuthDeviceResponse]:
    """List all authorized devices.

    Args:
        sort_by: The column to sort by.
        page: The page of items.
        size: The maximum size of all pages.
        logical_operator: Which logical operator to use [and, or].
        id: Use the id of the code repository to filter by.
        created: Use to filter by time of creation.
        updated: Use the last updated date for filtering.
        expires: Use the expiration date for filtering.
        client_id: Use the client id for filtering.
        status: Use the status for filtering.
        user: Filter by user name/ID.
        trusted_device: Use the trusted device flag for filtering.
        failed_auth_attempts: Use the failed auth attempts for filtering.
        last_login: Use the last login date for filtering.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of authorized devices matching the filter.
    """
    filter_model = OAuthDeviceFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        created=created,
        updated=updated,
        expires=expires,
        client_id=client_id,
        user=user,
        status=status,
        trusted_device=trusted_device,
        failed_auth_attempts=failed_auth_attempts,
        last_login=last_login,
    )
    return self.zen_store.list_authorized_devices(
        filter_model=filter_model,
        hydrate=hydrate,
    )

list_builds(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, workspace_id=None, user_id=None, user=None, pipeline_id=None, stack_id=None, container_registry_id=None, is_local=None, contains_code=None, zenml_version=None, python_version=None, checksum=None, stack_checksum=None, hydrate=False)

List all builds.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of build to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
workspace_id Optional[Union[str, UUID]]

The id of the workspace to filter by.

None
user_id Optional[Union[str, UUID]]

The id of the user to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
pipeline_id Optional[Union[str, UUID]]

The id of the pipeline to filter by.

None
stack_id Optional[Union[str, UUID]]

The id of the stack to filter by.

None
container_registry_id Optional[Union[UUID, str]]

The id of the container registry to filter by.

None
is_local Optional[bool]

Use to filter local builds.

None
contains_code Optional[bool]

Use to filter builds that contain code.

None
zenml_version Optional[str]

The version of ZenML to filter by.

None
python_version Optional[str]

The Python version to filter by.

None
checksum Optional[str]

The build checksum to filter by.

None
stack_checksum Optional[str]

The stack checksum to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[PipelineBuildResponse]

A page with builds fitting the filter description

Source code in src/zenml/client.py (lines 2656-2733)
def list_builds(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    workspace_id: Optional[Union[str, UUID]] = None,
    user_id: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    pipeline_id: Optional[Union[str, UUID]] = None,
    stack_id: Optional[Union[str, UUID]] = None,
    container_registry_id: Optional[Union[UUID, str]] = None,
    is_local: Optional[bool] = None,
    contains_code: Optional[bool] = None,
    zenml_version: Optional[str] = None,
    python_version: Optional[str] = None,
    checksum: Optional[str] = None,
    stack_checksum: Optional[str] = None,
    hydrate: bool = False,
) -> Page[PipelineBuildResponse]:
    """List all builds.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of build to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        workspace_id: The id of the workspace to filter by.
        user_id: The  id of the user to filter by.
        user: Filter by user name/ID.
        pipeline_id: The id of the pipeline to filter by.
        stack_id: The id of the stack to filter by.
        container_registry_id: The id of the container registry to
            filter by.
        is_local: Use to filter local builds.
        contains_code: Use to filter builds that contain code.
        zenml_version: The version of ZenML to filter by.
        python_version: The Python version to filter by.
        checksum: The build checksum to filter by.
        stack_checksum: The stack checksum to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page with builds fitting the filter description
    """
    build_filter_model = PipelineBuildFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        created=created,
        updated=updated,
        workspace_id=workspace_id,
        user_id=user_id,
        user=user,
        pipeline_id=pipeline_id,
        stack_id=stack_id,
        container_registry_id=container_registry_id,
        is_local=is_local,
        contains_code=contains_code,
        zenml_version=zenml_version,
        python_version=python_version,
        checksum=checksum,
        stack_checksum=stack_checksum,
    )
    build_filter_model.set_scope_workspace(self.active_workspace.id)
    return self.zen_store.list_builds(
        build_filter_model=build_filter_model,
        hydrate=hydrate,
    )
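
A sketch filtering for remote builds that embed code; the ZenML version string is a placeholder:

from zenml.client import Client

client = Client()

builds = client.list_builds(
    is_local=False,          # only builds pushed to a remote container registry
    contains_code=True,
    zenml_version="0.60.0",  # placeholder version string
)
print(builds.total)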

list_code_repositories(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, workspace_id=None, user_id=None, user=None, hydrate=False)

List all code repositories.

Parameters:

Name Type Description Default
sort_by str

The column to sort by.

'created'
page int

The page of items.

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages.

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or].

AND
id Optional[Union[UUID, str]]

Use the id of the code repository to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation.

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering.

None
name Optional[str]

The name of the code repository to filter by.

None
workspace_id Optional[Union[str, UUID]]

The id of the workspace to filter by.

None
user_id Optional[Union[str, UUID]]

The id of the user to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[CodeRepositoryResponse]

A page of code repositories matching the filter description.

Source code in src/zenml/client.py (lines 5052-5104)
def list_code_repositories(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    workspace_id: Optional[Union[str, UUID]] = None,
    user_id: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[CodeRepositoryResponse]:
    """List all code repositories.

    Args:
        sort_by: The column to sort by.
        page: The page of items.
        size: The maximum size of all pages.
        logical_operator: Which logical operator to use [and, or].
        id: Use the id of the code repository to filter by.
        created: Use to filter by time of creation.
        updated: Use the last updated date for filtering.
        name: The name of the code repository to filter by.
        workspace_id: The id of the workspace to filter by.
        user_id: The id of the user to filter by.
        user: Filter by user name/ID.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of code repositories matching the filter description.
    """
    filter_model = CodeRepositoryFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        created=created,
        updated=updated,
        name=name,
        workspace_id=workspace_id,
        user_id=user_id,
        user=user,
    )
    filter_model.set_scope_workspace(self.active_workspace.id)
    return self.zen_store.list_code_repositories(
        filter_model=filter_model,
        hydrate=hydrate,
    )

list_deployments(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, workspace_id=None, user_id=None, user=None, pipeline_id=None, stack_id=None, build_id=None, template_id=None, hydrate=False)

List all deployments.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of build to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
workspace_id Optional[Union[str, UUID]]

The id of the workspace to filter by.

None
user_id Optional[Union[str, UUID]]

The id of the user to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
pipeline_id Optional[Union[str, UUID]]

The id of the pipeline to filter by.

None
stack_id Optional[Union[str, UUID]]

The id of the stack to filter by.

None
build_id Optional[Union[str, UUID]]

The id of the build to filter by.

None
template_id Optional[Union[str, UUID]]

The ID of the template to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[PipelineDeploymentResponse]

A page with deployments fitting the filter description

Source code in src/zenml/client.py (lines 3379-3440)
def list_deployments(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    workspace_id: Optional[Union[str, UUID]] = None,
    user_id: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    pipeline_id: Optional[Union[str, UUID]] = None,
    stack_id: Optional[Union[str, UUID]] = None,
    build_id: Optional[Union[str, UUID]] = None,
    template_id: Optional[Union[str, UUID]] = None,
    hydrate: bool = False,
) -> Page[PipelineDeploymentResponse]:
    """List all deployments.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of build to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        workspace_id: The id of the workspace to filter by.
        user_id: The  id of the user to filter by.
        user: Filter by user name/ID.
        pipeline_id: The id of the pipeline to filter by.
        stack_id: The id of the stack to filter by.
        build_id: The id of the build to filter by.
        template_id: The ID of the template to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page with deployments fitting the filter description
    """
    deployment_filter_model = PipelineDeploymentFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        created=created,
        updated=updated,
        workspace_id=workspace_id,
        user_id=user_id,
        user=user,
        pipeline_id=pipeline_id,
        stack_id=stack_id,
        build_id=build_id,
        template_id=template_id,
    )
    deployment_filter_model.set_scope_workspace(self.active_workspace.id)
    return self.zen_store.list_deployments(
        deployment_filter_model=deployment_filter_model,
        hydrate=hydrate,
    )
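
For example, a minimal usage sketch (assuming an initialized ZenML client; the pipeline name "training_pipeline" is a placeholder, not a value from this reference):

from zenml.client import Client

client = Client()

# Resolve a pipeline by name first, then list its most recent deployments.
pipeline = client.get_pipeline("training_pipeline")
deployments = client.list_deployments(
    sort_by="desc:created",  # newest first
    pipeline_id=pipeline.id,
    size=20,
)
for deployment in deployments.items:
    print(deployment.id)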

list_event_sources(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, flavor=None, event_source_type=None, workspace_id=None, user_id=None, user=None, hydrate=False)

Lists all event_sources.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of event_sources to filter by.

None
created Optional[datetime]

Use to filter by time of creation

None
updated Optional[datetime]

Use the last updated date for filtering

None
workspace_id Optional[Union[str, UUID]]

The id of the workspace to filter by.

None
user_id Optional[Union[str, UUID]]

The id of the user to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
name Optional[str]

The name of the event_source to filter by.

None
flavor Optional[str]

The flavor of the event_source to filter by.

None
event_source_type Optional[str]

The subtype of the event_source to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[EventSourceResponse]

A page of event_sources.

Source code in src/zenml/client.py
def list_event_sources(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[datetime] = None,
    updated: Optional[datetime] = None,
    name: Optional[str] = None,
    flavor: Optional[str] = None,
    event_source_type: Optional[str] = None,
    workspace_id: Optional[Union[str, UUID]] = None,
    user_id: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[EventSourceResponse]:
    """Lists all event_sources.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of event_sources to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        workspace_id: The id of the workspace to filter by.
        user_id: The id of the user to filter by.
        user: Filter by user name/ID.
        name: The name of the event_source to filter by.
        flavor: The flavor of the event_source to filter by.
        event_source_type: The subtype of the event_source to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of event_sources.
    """
    event_source_filter_model = EventSourceFilter(
        page=page,
        size=size,
        sort_by=sort_by,
        logical_operator=logical_operator,
        workspace_id=workspace_id,
        user_id=user_id,
        user=user,
        name=name,
        flavor=flavor,
        plugin_subtype=event_source_type,
        id=id,
        created=created,
        updated=updated,
    )
    event_source_filter_model.set_scope_workspace(self.active_workspace.id)
    return self.zen_store.list_event_sources(
        event_source_filter_model, hydrate=hydrate
    )
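
As an illustrative sketch (the "github" flavor value is only an example; use whichever event source flavors are registered in your deployment):

from zenml.client import Client

client = Client()

# List event sources of one flavor, with metadata included via hydrate=True.
event_sources = client.list_event_sources(flavor="github", hydrate=True)
for event_source in event_sources.items:
    print(event_source.name, event_source.id)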

list_flavors(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, type=None, integration=None, user_id=None, user=None, hydrate=False)

Fetches all the flavor models.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of flavors to filter by.

None
created Optional[datetime]

Use to filter flavors by time of creation

None
updated Optional[datetime]

Use the last updated date for filtering

None
user_id Optional[Union[str, UUID]]

The id of the user to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
name Optional[str]

The name of the flavor to filter by.

None
type Optional[str]

The type of the flavor to filter by.

None
integration Optional[str]

The integration of the flavor to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[FlavorResponse]

A list of all the flavor models.

Source code in src/zenml/client.py
def list_flavors(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[datetime] = None,
    updated: Optional[datetime] = None,
    name: Optional[str] = None,
    type: Optional[str] = None,
    integration: Optional[str] = None,
    user_id: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[FlavorResponse]:
    """Fetches all the flavor models.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of flavors to filter by.
        created: Use to filter flavors by time of creation
        updated: Use the last updated date for filtering
        user_id: The id of the user to filter by.
        user: Filter by user name/ID.
        name: The name of the flavor to filter by.
        type: The type of the flavor to filter by.
        integration: The integration of the flavor to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A list of all the flavor models.
    """
    flavor_filter_model = FlavorFilter(
        page=page,
        size=size,
        sort_by=sort_by,
        logical_operator=logical_operator,
        user_id=user_id,
        user=user,
        name=name,
        type=type,
        integration=integration,
        id=id,
        created=created,
        updated=updated,
    )
    flavor_filter_model.set_scope_workspace(self.active_workspace.id)
    return self.zen_store.list_flavors(
        flavor_filter_model=flavor_filter_model, hydrate=hydrate
    )
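
A short sketch of filtering flavors by stack component type (the "orchestrator" value is one of the standard component types; adjust as needed):

from zenml.client import Client

client = Client()

# List all registered orchestrator flavors.
flavors = client.list_flavors(type="orchestrator", size=50)
for flavor in flavors.items:
    print(flavor.name, flavor.integration)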

list_model_version_artifact_links(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, created=None, updated=None, model_version_id=None, artifact_version_id=None, artifact_name=None, only_data_artifacts=None, only_model_artifacts=None, only_deployment_artifacts=None, has_custom_name=None, user=None, hydrate=False)

Get model version to artifact links by filter in Model Control Plane.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
model_version_id Optional[Union[UUID, str]]

Use the model version id for filtering

None
artifact_version_id Optional[Union[UUID, str]]

Use the artifact id for filtering

None
artifact_name Optional[str]

Use the artifact name for filtering

None
only_data_artifacts Optional[bool]

Use to filter by data artifacts

None
only_model_artifacts Optional[bool]

Use to filter by model artifacts

None
only_deployment_artifacts Optional[bool]

Use to filter by deployment artifacts

None
has_custom_name Optional[bool]

Filter artifacts with/without custom names.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[ModelVersionArtifactResponse]

A page of all model version to artifact links.

Source code in src/zenml/client.py
def list_model_version_artifact_links(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    model_version_id: Optional[Union[UUID, str]] = None,
    artifact_version_id: Optional[Union[UUID, str]] = None,
    artifact_name: Optional[str] = None,
    only_data_artifacts: Optional[bool] = None,
    only_model_artifacts: Optional[bool] = None,
    only_deployment_artifacts: Optional[bool] = None,
    has_custom_name: Optional[bool] = None,
    user: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[ModelVersionArtifactResponse]:
    """Get model version to artifact links by filter in Model Control Plane.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        model_version_id: Use the model version id for filtering
        artifact_version_id: Use the artifact id for filtering
        artifact_name: Use the artifact name for filtering
        only_data_artifacts: Use to filter by data artifacts
        only_model_artifacts: Use to filter by model artifacts
        only_deployment_artifacts: Use to filter by deployment artifacts
        has_custom_name: Filter artifacts with/without custom names.
        user: Filter by user name/ID.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of all model version to artifact links.
    """
    return self.zen_store.list_model_version_artifact_links(
        ModelVersionArtifactFilter(
            sort_by=sort_by,
            logical_operator=logical_operator,
            page=page,
            size=size,
            created=created,
            updated=updated,
            model_version_id=model_version_id,
            artifact_version_id=artifact_version_id,
            artifact_name=artifact_name,
            only_data_artifacts=only_data_artifacts,
            only_model_artifacts=only_model_artifacts,
            only_deployment_artifacts=only_deployment_artifacts,
            has_custom_name=has_custom_name,
            user=user,
        ),
        hydrate=hydrate,
    )
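
As a hedged example (the model name "my_model" and the "production" stage are placeholders, and this assumes the Model Control Plane is in use):

from zenml.client import Client

client = Client()

# Resolve a model version, then list only its data artifact links.
model_version = client.get_model_version("my_model", "production")
links = client.list_model_version_artifact_links(
    model_version_id=model_version.id,
    only_data_artifacts=True,
)
print(links.total)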

list_model_version_pipeline_run_links(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, created=None, updated=None, model_version_id=None, pipeline_run_id=None, pipeline_run_name=None, user=None, hydrate=False)

Get all model version to pipeline run links by filter.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
model_version_id Optional[Union[UUID, str]]

Use the model version id for filtering

None
pipeline_run_id Optional[Union[UUID, str]]

Use the pipeline run id for filtering

None
pipeline_run_name Optional[str]

Use the pipeline run name for filtering

None
user Optional[Union[UUID, str]]

Filter by user name or ID.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response

False

Returns:

Type Description
Page[ModelVersionPipelineRunResponse]

A page of all model version to pipeline run links.

Source code in src/zenml/client.py
def list_model_version_pipeline_run_links(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    model_version_id: Optional[Union[UUID, str]] = None,
    pipeline_run_id: Optional[Union[UUID, str]] = None,
    pipeline_run_name: Optional[str] = None,
    user: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[ModelVersionPipelineRunResponse]:
    """Get all model version to pipeline run links by filter.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        model_version_id: Use the model version id for filtering
        pipeline_run_id: Use the pipeline run id for filtering
        pipeline_run_name: Use the pipeline run name for filtering
        user: Filter by user name or ID.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response

    Returns:
        A page of all model version to pipeline run links.
    """
    return self.zen_store.list_model_version_pipeline_run_links(
        ModelVersionPipelineRunFilter(
            sort_by=sort_by,
            logical_operator=logical_operator,
            page=page,
            size=size,
            created=created,
            updated=updated,
            model_version_id=model_version_id,
            pipeline_run_id=pipeline_run_id,
            pipeline_run_name=pipeline_run_name,
            user=user,
        ),
        hydrate=hydrate,
    )
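
A similar sketch for run links (again with placeholder model and stage names):

from zenml.client import Client

client = Client()

# List the pipeline runs linked to one model version, newest first.
model_version = client.get_model_version("my_model", "production")
run_links = client.list_model_version_pipeline_run_links(
    model_version_id=model_version.id,
    sort_by="desc:created",
)
for link in run_links.items:
    print(link.id)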

list_model_versions(model_name_or_id=None, sort_by='number', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, created=None, updated=None, name=None, number=None, stage=None, user=None, hydrate=False, tag=None)

Get model versions by filter from Model Control Plane.

Parameters:

Name Type Description Default
model_name_or_id Optional[Union[str, UUID]]

name or id of the model containing the model version.

None
sort_by str

The column to sort by

'number'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
name Optional[str]

name or id of the model version.

None
number Optional[int]

number of the model version.

None
stage Optional[Union[str, ModelStages]]

stage of the model version.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False
tag Optional[str]

The tag to filter by.

None

Returns:

Type Description
Page[ModelVersionResponse]

A page object with all model versions.

Source code in src/zenml/client.py
def list_model_versions(
    self,
    model_name_or_id: Optional[Union[str, UUID]] = None,
    sort_by: str = "number",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    number: Optional[int] = None,
    stage: Optional[Union[str, ModelStages]] = None,
    user: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
    tag: Optional[str] = None,
) -> Page[ModelVersionResponse]:
    """Get model versions by filter from Model Control Plane.

    Args:
        model_name_or_id: name or id of the model containing the model
            version.
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        name: name or id of the model version.
        number: number of the model version.
        stage: stage of the model version.
        user: Filter by user name/ID.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.
        tag: The tag to filter by.

    Returns:
        A page object with all model versions.
    """
    model_version_filter_model = ModelVersionFilter(
        page=page,
        size=size,
        sort_by=sort_by,
        logical_operator=logical_operator,
        created=created,
        updated=updated,
        name=name,
        number=number,
        stage=stage,
        tag=tag,
        user=user,
    )

    return self.zen_store.list_model_versions(
        model_name_or_id=model_name_or_id,
        model_version_filter_model=model_version_filter_model,
        hydrate=hydrate,
    )
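
For instance, to page through the versions of one model that are currently in the production stage (the model name is a placeholder):

from zenml.client import Client

client = Client()

versions = client.list_model_versions(
    model_name_or_id="my_model",
    stage="production",
    sort_by="desc:number",  # highest version number first
)
for version in versions.items:
    print(version.number, version.name)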

list_models(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, created=None, updated=None, name=None, user=None, hydrate=False, tag=None)

Get models by filter from Model Control Plane.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
name Optional[str]

The name of the model to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False
tag Optional[str]

The tag of the model to filter by.

None

Returns:

Type Description
Page[ModelResponse]

A page object with all models.

Source code in src/zenml/client.py
def list_models(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    user: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
    tag: Optional[str] = None,
) -> Page[ModelResponse]:
    """Get models by filter from Model Control Plane.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        name: The name of the model to filter by.
        user: Filter by user name/ID.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.
        tag: The tag of the model to filter by.

    Returns:
        A page object with all models.
    """
    filter = ModelFilter(
        name=name,
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        created=created,
        updated=updated,
        tag=tag,
        user=user,
    )

    return self.zen_store.list_models(
        model_filter_model=filter, hydrate=hydrate
    )
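
A minimal sketch (the "nlp" tag is illustrative):

from zenml.client import Client

client = Client()

# Models carrying a given tag, newest first.
models = client.list_models(tag="nlp", sort_by="desc:created")
for model in models.items:
    print(model.name)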

list_pipeline_runs(sort_by='desc:created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, workspace_id=None, pipeline_id=None, pipeline_name=None, user_id=None, stack_id=None, schedule_id=None, build_id=None, deployment_id=None, code_repository_id=None, template_id=None, model_version_id=None, orchestrator_run_id=None, status=None, start_time=None, end_time=None, num_steps=None, unlisted=None, templatable=None, tag=None, user=None, run_metadata=None, pipeline=None, code_repository=None, model=None, stack=None, stack_component=None, hydrate=False)

List all pipeline runs.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'desc:created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

The id of the runs to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
workspace_id Optional[Union[str, UUID]]

The id of the workspace to filter by.

None
pipeline_id Optional[Union[str, UUID]]

The id of the pipeline to filter by.

None
pipeline_name Optional[str]

DEPRECATED. Use pipeline instead to filter by pipeline name.

None
user_id Optional[Union[str, UUID]]

The id of the user to filter by.

None
stack_id Optional[Union[str, UUID]]

The id of the stack to filter by.

None
schedule_id Optional[Union[str, UUID]]

The id of the schedule to filter by.

None
build_id Optional[Union[str, UUID]]

The id of the build to filter by.

None
deployment_id Optional[Union[str, UUID]]

The id of the deployment to filter by.

None
code_repository_id Optional[Union[str, UUID]]

The id of the code repository to filter by.

None
template_id Optional[Union[str, UUID]]

The ID of the template to filter by.

None
model_version_id Optional[Union[str, UUID]]

The ID of the model version to filter by.

None
orchestrator_run_id Optional[str]

The run id of the orchestrator to filter by.

None
name Optional[str]

The name of the run to filter by.

None
status Optional[str]

The status of the pipeline run

None
start_time Optional[Union[datetime, str]]

The start_time for the pipeline run

None
end_time Optional[Union[datetime, str]]

The end_time for the pipeline run

None
num_steps Optional[Union[int, str]]

The number of steps for the pipeline run

None
unlisted Optional[bool]

If the runs should be unlisted or not.

None
templatable Optional[bool]

If the runs should be templatable or not.

None
tag Optional[str]

Tag to filter by.

None
user Optional[Union[UUID, str]]

The name/ID of the user to filter by.

None
run_metadata Optional[Dict[str, Any]]

The run_metadata of the run to filter by.

None
pipeline Optional[Union[UUID, str]]

The name/ID of the pipeline to filter by.

None
code_repository Optional[Union[UUID, str]]

Filter by code repository name/ID.

None
model Optional[Union[UUID, str]]

Filter by model name/ID.

None
stack Optional[Union[UUID, str]]

Filter by stack name/ID.

None
stack_component Optional[Union[UUID, str]]

Filter by stack component name/ID.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[PipelineRunResponse]

A page with Pipeline Runs fitting the filter description

Source code in src/zenml/client.py
def list_pipeline_runs(
    self,
    sort_by: str = "desc:created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    workspace_id: Optional[Union[str, UUID]] = None,
    pipeline_id: Optional[Union[str, UUID]] = None,
    pipeline_name: Optional[str] = None,
    user_id: Optional[Union[str, UUID]] = None,
    stack_id: Optional[Union[str, UUID]] = None,
    schedule_id: Optional[Union[str, UUID]] = None,
    build_id: Optional[Union[str, UUID]] = None,
    deployment_id: Optional[Union[str, UUID]] = None,
    code_repository_id: Optional[Union[str, UUID]] = None,
    template_id: Optional[Union[str, UUID]] = None,
    model_version_id: Optional[Union[str, UUID]] = None,
    orchestrator_run_id: Optional[str] = None,
    status: Optional[str] = None,
    start_time: Optional[Union[datetime, str]] = None,
    end_time: Optional[Union[datetime, str]] = None,
    num_steps: Optional[Union[int, str]] = None,
    unlisted: Optional[bool] = None,
    templatable: Optional[bool] = None,
    tag: Optional[str] = None,
    user: Optional[Union[UUID, str]] = None,
    run_metadata: Optional[Dict[str, Any]] = None,
    pipeline: Optional[Union[UUID, str]] = None,
    code_repository: Optional[Union[UUID, str]] = None,
    model: Optional[Union[UUID, str]] = None,
    stack: Optional[Union[UUID, str]] = None,
    stack_component: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[PipelineRunResponse]:
    """List all pipeline runs.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: The id of the runs to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        workspace_id: The id of the workspace to filter by.
        pipeline_id: The id of the pipeline to filter by.
        pipeline_name: DEPRECATED. Use `pipeline` instead to filter by
            pipeline name.
        user_id: The id of the user to filter by.
        stack_id: The id of the stack to filter by.
        schedule_id: The id of the schedule to filter by.
        build_id: The id of the build to filter by.
        deployment_id: The id of the deployment to filter by.
        code_repository_id: The id of the code repository to filter by.
        template_id: The ID of the template to filter by.
        model_version_id: The ID of the model version to filter by.
        orchestrator_run_id: The run id of the orchestrator to filter by.
        name: The name of the run to filter by.
        status: The status of the pipeline run
        start_time: The start_time for the pipeline run
        end_time: The end_time for the pipeline run
        num_steps: The number of steps for the pipeline run
        unlisted: If the runs should be unlisted or not.
        templatable: If the runs should be templatable or not.
        tag: Tag to filter by.
        user: The name/ID of the user to filter by.
        run_metadata: The run_metadata of the run to filter by.
        pipeline: The name/ID of the pipeline to filter by.
        code_repository: Filter by code repository name/ID.
        model: Filter by model name/ID.
        stack: Filter by stack name/ID.
        stack_component: Filter by stack component name/ID.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page with Pipeline Runs fitting the filter description
    """
    runs_filter_model = PipelineRunFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        created=created,
        updated=updated,
        name=name,
        workspace_id=workspace_id,
        pipeline_id=pipeline_id,
        pipeline_name=pipeline_name,
        schedule_id=schedule_id,
        build_id=build_id,
        deployment_id=deployment_id,
        code_repository_id=code_repository_id,
        template_id=template_id,
        model_version_id=model_version_id,
        orchestrator_run_id=orchestrator_run_id,
        user_id=user_id,
        stack_id=stack_id,
        status=status,
        start_time=start_time,
        end_time=end_time,
        num_steps=num_steps,
        tag=tag,
        unlisted=unlisted,
        user=user,
        run_metadata=run_metadata,
        pipeline=pipeline,
        code_repository=code_repository,
        stack=stack,
        model=model,
        stack_component=stack_component,
        templatable=templatable,
    )
    runs_filter_model.set_scope_workspace(self.active_workspace.id)
    return self.zen_store.list_runs(
        runs_filter_model=runs_filter_model,
        hydrate=hydrate,
    )
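
For example, to pull up recent failed runs of one pipeline (the pipeline name is a placeholder, and the "gte:" prefix follows ZenML's general filter operator syntax):

from zenml.client import Client

client = Client()

runs = client.list_pipeline_runs(
    pipeline="training_pipeline",
    status="failed",
    created="gte:2024-01-01",  # runs created on or after this date
    size=50,
)
for run in runs.items:
    print(run.name, run.status)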

list_pipelines(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, latest_run_status=None, workspace_id=None, user_id=None, user=None, tag=None, hydrate=False)

List all pipelines.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of pipeline to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
name Optional[str]

The name of the pipeline to filter by.

None
latest_run_status Optional[str]

Filter by the status of the latest run of a pipeline.

None
workspace_id Optional[Union[str, UUID]]

The id of the workspace to filter by.

None
user_id Optional[Union[str, UUID]]

The id of the user to filter by.

None
user Optional[Union[UUID, str]]

The name/ID of the user to filter by.

None
tag Optional[str]

Tag to filter by.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[PipelineResponse]

A page with Pipelines fitting the filter description

Source code in src/zenml/client.py
def list_pipelines(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    latest_run_status: Optional[str] = None,
    workspace_id: Optional[Union[str, UUID]] = None,
    user_id: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    tag: Optional[str] = None,
    hydrate: bool = False,
) -> Page[PipelineResponse]:
    """List all pipelines.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of pipeline to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        name: The name of the pipeline to filter by.
        latest_run_status: Filter by the status of the latest run of a
            pipeline.
        workspace_id: The id of the workspace to filter by.
        user_id: The id of the user to filter by.
        user: The name/ID of the user to filter by.
        tag: Tag to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page with Pipelines fitting the filter description
    """
    pipeline_filter_model = PipelineFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        created=created,
        updated=updated,
        name=name,
        latest_run_status=latest_run_status,
        workspace_id=workspace_id,
        user_id=user_id,
        user=user,
        tag=tag,
    )
    pipeline_filter_model.set_scope_workspace(self.active_workspace.id)
    return self.zen_store.list_pipelines(
        pipeline_filter_model=pipeline_filter_model,
        hydrate=hydrate,
    )
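
A quick sketch that surfaces pipelines whose most recent run failed:

from zenml.client import Client

client = Client()

pipelines = client.list_pipelines(latest_run_status="failed")
for pipeline in pipelines.items:
    print(pipeline.name)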

list_run_steps(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, cache_key=None, code_hash=None, status=None, start_time=None, end_time=None, pipeline_run_id=None, deployment_id=None, original_step_run_id=None, workspace_id=None, user_id=None, user=None, model_version_id=None, model=None, run_metadata=None, hydrate=False)

List all step runs.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of runs to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
start_time Optional[Union[datetime, str]]

Use to filter by the time when the step started running

None
end_time Optional[Union[datetime, str]]

Use to filter by the time when the step finished running

None
workspace_id Optional[Union[str, UUID]]

The id of the workspace to filter by.

None
user_id Optional[Union[str, UUID]]

The id of the user to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
pipeline_run_id Optional[Union[str, UUID]]

The id of the pipeline run to filter by.

None
deployment_id Optional[Union[str, UUID]]

The id of the deployment to filter by.

None
original_step_run_id Optional[Union[str, UUID]]

The id of the original step run to filter by.

None
model_version_id Optional[Union[str, UUID]]

The ID of the model version to filter by.

None
model Optional[Union[UUID, str]]

Filter by model name/ID.

None
name Optional[str]

The name of the step run to filter by.

None
cache_key Optional[str]

The cache key of the step run to filter by.

None
code_hash Optional[str]

The code hash of the step run to filter by.

None
status Optional[str]

The status of the step run to filter by.

None
run_metadata Optional[Dict[str, Any]]

Filter by run metadata.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[StepRunResponse]

A page with step runs fitting the filter description

Source code in src/zenml/client.py
def list_run_steps(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    cache_key: Optional[str] = None,
    code_hash: Optional[str] = None,
    status: Optional[str] = None,
    start_time: Optional[Union[datetime, str]] = None,
    end_time: Optional[Union[datetime, str]] = None,
    pipeline_run_id: Optional[Union[str, UUID]] = None,
    deployment_id: Optional[Union[str, UUID]] = None,
    original_step_run_id: Optional[Union[str, UUID]] = None,
    workspace_id: Optional[Union[str, UUID]] = None,
    user_id: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    model_version_id: Optional[Union[str, UUID]] = None,
    model: Optional[Union[UUID, str]] = None,
    run_metadata: Optional[Dict[str, Any]] = None,
    hydrate: bool = False,
) -> Page[StepRunResponse]:
    """List all pipelines.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of runs to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        start_time: Use to filter by the time when the step started running
        end_time: Use to filter by the time when the step finished running
        workspace_id: The id of the workspace to filter by.
        user_id: The id of the user to filter by.
        user: Filter by user name/ID.
        pipeline_run_id: The id of the pipeline run to filter by.
        deployment_id: The id of the deployment to filter by.
        original_step_run_id: The id of the original step run to filter by.
        model_version_id: The ID of the model version to filter by.
        model: Filter by model name/ID.
        name: The name of the step run to filter by.
        cache_key: The cache key of the step run to filter by.
        code_hash: The code hash of the step run to filter by.
        status: The status of the step run to filter by.
        run_metadata: Filter by run metadata.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page with step runs fitting the filter description
    """
    step_run_filter_model = StepRunFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        cache_key=cache_key,
        code_hash=code_hash,
        pipeline_run_id=pipeline_run_id,
        deployment_id=deployment_id,
        original_step_run_id=original_step_run_id,
        status=status,
        created=created,
        updated=updated,
        start_time=start_time,
        end_time=end_time,
        name=name,
        workspace_id=workspace_id,
        user_id=user_id,
        user=user,
        model_version_id=model_version_id,
        model=model,
        run_metadata=run_metadata,
    )
    step_run_filter_model.set_scope_workspace(self.active_workspace.id)
    return self.zen_store.list_run_steps(
        step_run_filter_model=step_run_filter_model,
        hydrate=hydrate,
    )
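
An illustrative sketch (replace the run name with one of your own pipeline run names):

from zenml.client import Client

client = Client()

# Fetch a run by name, then list its completed steps.
run = client.get_pipeline_run("training_pipeline-run-1")
steps = client.list_run_steps(pipeline_run_id=run.id, status="completed")
for step in steps.items:
    print(step.name, step.status)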

list_run_templates(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, created=None, updated=None, id=None, name=None, tag=None, workspace_id=None, user_id=None, pipeline_id=None, build_id=None, stack_id=None, code_repository_id=None, user=None, pipeline=None, stack=None, hydrate=False)

Get a page of run templates.

Parameters:

Name Type Description Default
sort_by str

The column to sort by.

'created'
page int

The page of items.

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages.

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or].

AND
created Optional[Union[datetime, str]]

Filter by the creation date.

None
updated Optional[Union[datetime, str]]

Filter by the last updated date.

None
id Optional[Union[UUID, str]]

Filter by run template ID.

None
name Optional[str]

Filter by run template name.

None
tag Optional[str]

Filter by run template tags.

None
workspace_id Optional[Union[str, UUID]]

Filter by workspace ID.

None
user_id Optional[Union[str, UUID]]

Filter by user ID.

None
pipeline_id Optional[Union[str, UUID]]

Filter by pipeline ID.

None
build_id Optional[Union[str, UUID]]

Filter by build ID.

None
stack_id Optional[Union[str, UUID]]

Filter by stack ID.

None
code_repository_id Optional[Union[str, UUID]]

Filter by code repository ID.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
pipeline Optional[Union[UUID, str]]

Filter by pipeline name/ID.

None
stack Optional[Union[UUID, str]]

Filter by stack name/ID.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[RunTemplateResponse]

A page of run templates.

Source code in src/zenml/client.py
def list_run_templates(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    id: Optional[Union[UUID, str]] = None,
    name: Optional[str] = None,
    tag: Optional[str] = None,
    workspace_id: Optional[Union[str, UUID]] = None,
    user_id: Optional[Union[str, UUID]] = None,
    pipeline_id: Optional[Union[str, UUID]] = None,
    build_id: Optional[Union[str, UUID]] = None,
    stack_id: Optional[Union[str, UUID]] = None,
    code_repository_id: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    pipeline: Optional[Union[UUID, str]] = None,
    stack: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[RunTemplateResponse]:
    """Get a page of run templates.

    Args:
        sort_by: The column to sort by.
        page: The page of items.
        size: The maximum size of all pages.
        logical_operator: Which logical operator to use [and, or].
        created: Filter by the creation date.
        updated: Filter by the last updated date.
        id: Filter by run template ID.
        name: Filter by run template name.
        tag: Filter by run template tags.
        workspace_id: Filter by workspace ID.
        user_id: Filter by user ID.
        pipeline_id: Filter by pipeline ID.
        build_id: Filter by build ID.
        stack_id: Filter by stack ID.
        code_repository_id: Filter by code repository ID.
        user: Filter by user name/ID.
        pipeline: Filter by pipeline name/ID.
        stack: Filter by stack name/ID.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of run templates.
    """
    filter = RunTemplateFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        created=created,
        updated=updated,
        id=id,
        name=name,
        tag=tag,
        workspace_id=workspace_id,
        user_id=user_id,
        pipeline_id=pipeline_id,
        build_id=build_id,
        stack_id=stack_id,
        code_repository_id=code_repository_id,
        user=user,
        pipeline=pipeline,
        stack=stack,
    )

    return self.zen_store.list_run_templates(
        template_filter_model=filter, hydrate=hydrate
    )
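
As a usage sketch (the stack name "production-stack" is a placeholder):

from zenml.client import Client

client = Client()

# Run templates created on a particular stack, newest first.
templates = client.list_run_templates(
    stack="production-stack",
    sort_by="desc:created",
)
for template in templates.items:
    print(template.name)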

list_schedules(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, workspace_id=None, user_id=None, user=None, pipeline_id=None, orchestrator_id=None, active=None, cron_expression=None, start_time=None, end_time=None, interval_second=None, catchup=None, hydrate=False, run_once_start_time=None)

List schedules.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of schedules to filter by.

None
created Optional[Union[datetime, str]]

Use to filter by time of creation

None
updated Optional[Union[datetime, str]]

Use the last updated date for filtering

None
name Optional[str]

The name of the schedule to filter by.

None
workspace_id Optional[Union[str, UUID]]

The id of the workspace to filter by.

None
user_id Optional[Union[str, UUID]]

The id of the user to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
pipeline_id Optional[Union[str, UUID]]

The id of the pipeline to filter by.

None
orchestrator_id Optional[Union[str, UUID]]

The id of the orchestrator to filter by.

None
active Optional[Union[str, bool]]

Use to filter by active status.

None
cron_expression Optional[str]

Use to filter by cron expression.

None
start_time Optional[Union[datetime, str]]

Use to filter by start time.

None
end_time Optional[Union[datetime, str]]

Use to filter by end time.

None
interval_second Optional[int]

Use to filter by interval second.

None
catchup Optional[Union[str, bool]]

Use to filter by catchup.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False
run_once_start_time Optional[Union[datetime, str]]

Use to filter by run once start time.

None

Returns:

Type Description
Page[ScheduleResponse]

A list of schedules.

Source code in src/zenml/client.py
def list_schedules(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    workspace_id: Optional[Union[str, UUID]] = None,
    user_id: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    pipeline_id: Optional[Union[str, UUID]] = None,
    orchestrator_id: Optional[Union[str, UUID]] = None,
    active: Optional[Union[str, bool]] = None,
    cron_expression: Optional[str] = None,
    start_time: Optional[Union[datetime, str]] = None,
    end_time: Optional[Union[datetime, str]] = None,
    interval_second: Optional[int] = None,
    catchup: Optional[Union[str, bool]] = None,
    hydrate: bool = False,
    run_once_start_time: Optional[Union[datetime, str]] = None,
) -> Page[ScheduleResponse]:
    """List schedules.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of schedules to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        name: The name of the schedule to filter by.
        workspace_id: The id of the workspace to filter by.
        user_id: The id of the user to filter by.
        user: Filter by user name/ID.
        pipeline_id: The id of the pipeline to filter by.
        orchestrator_id: The id of the orchestrator to filter by.
        active: Use to filter by active status.
        cron_expression: Use to filter by cron expression.
        start_time: Use to filter by start time.
        end_time: Use to filter by end time.
        interval_second: Use to filter by interval second.
        catchup: Use to filter by catchup.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.
        run_once_start_time: Use to filter by run once start time.

    Returns:
        A list of schedules.
    """
    schedule_filter_model = ScheduleFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        created=created,
        updated=updated,
        name=name,
        workspace_id=workspace_id,
        user_id=user_id,
        user=user,
        pipeline_id=pipeline_id,
        orchestrator_id=orchestrator_id,
        active=active,
        cron_expression=cron_expression,
        start_time=start_time,
        end_time=end_time,
        interval_second=interval_second,
        catchup=catchup,
        run_once_start_time=run_once_start_time,
    )
    schedule_filter_model.set_scope_workspace(self.active_workspace.id)
    return self.zen_store.list_schedules(
        schedule_filter_model=schedule_filter_model,
        hydrate=hydrate,
    )
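
For example, to list the active schedules attached to one pipeline (the pipeline name is a placeholder):

from zenml.client import Client

client = Client()

pipeline = client.get_pipeline("training_pipeline")
schedules = client.list_schedules(pipeline_id=pipeline.id, active=True)
for schedule in schedules.items:
    print(schedule.name, schedule.cron_expression)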

list_secrets(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, scope=None, workspace_id=None, user_id=None, user=None, hydrate=False)

Fetches all the secret models.

The returned secrets do not contain the secret values. To get the secret values, use get_secret individually for each secret.

Parameters:

Name Type Description Default
sort_by str

The column to sort by

'created'
page int

The page of items

PAGINATION_STARTING_PAGE
size int

The maximum size of all pages

PAGE_SIZE_DEFAULT
logical_operator LogicalOperators

Which logical operator to use [and, or]

AND
id Optional[Union[UUID, str]]

Use the id of secrets to filter by.

None
created Optional[datetime]

Use to filter secrets by time of creation

None
updated Optional[datetime]

Use the last updated date for filtering

None
name Optional[str]

The name of the secret to filter by.

None
scope Optional[SecretScope]

The scope of the secret to filter by.

None
workspace_id Optional[Union[str, UUID]]

The id of the workspace to filter by.

None
user_id Optional[Union[str, UUID]]

The id of the user to filter by.

None
user Optional[Union[UUID, str]]

Filter by user name/ID.

None
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[SecretResponse]

A list of all the secret models without the secret values.

Raises:

Type Description
NotImplementedError

If centralized secrets management is not enabled.

Source code in src/zenml/client.py
def list_secrets(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[datetime] = None,
    updated: Optional[datetime] = None,
    name: Optional[str] = None,
    scope: Optional[SecretScope] = None,
    workspace_id: Optional[Union[str, UUID]] = None,
    user_id: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[SecretResponse]:
    """Fetches all the secret models.

    The returned secrets do not contain the secret values. To get the
    secret values, use `get_secret` individually for each secret.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of secrets to filter by.
        created: Use to filter secrets by time of creation
        updated: Use the last updated date for filtering
        name: The name of the secret to filter by.
        scope: The scope of the secret to filter by.
        workspace_id: The id of the workspace to filter by.
        user_id: The id of the user to filter by.
        user: Filter by user name/ID.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A list of all the secret models without the secret values.

    Raises:
        NotImplementedError: If centralized secrets management is not
            enabled.
    """
    secret_filter_model = SecretFilter(
        page=page,
        size=size,
        sort_by=sort_by,
        logical_operator=logical_operator,
        user_id=user_id,
        user=user,
        workspace_id=workspace_id,
        name=name,
        scope=scope,
        id=id,
        created=created,
        updated=updated,
    )
    secret_filter_model.set_scope_workspace(self.active_workspace.id)
    try:
        return self.zen_store.list_secrets(
            secret_filter_model=secret_filter_model,
            hydrate=hydrate,
        )
    except NotImplementedError:
        raise NotImplementedError(
            "centralized secrets management is not supported or explicitly "
            "disabled in the target ZenML deployment."
        )
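
A hedged sketch using ZenML's string filter operators (the "aws_" prefix is illustrative); remember that the returned entries never contain secret values:

from zenml.client import Client

client = Client()

# Secrets whose names start with a given prefix.
secrets = client.list_secrets(name="startswith:aws_")
for secret in secrets.items:
    print(secret.name)  # fetch values separately via client.get_secret(...)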

list_secrets_in_scope(scope, hydrate=False)

Fetches the list of secrets in a given scope.

The returned secrets do not contain the secret values. To get the secret values, use get_secret individually for each secret.

Parameters:

Name Type Description Default
scope SecretScope

The secrets scope to search for.

required
hydrate bool

Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

False

Returns:

Type Description
Page[SecretResponse]

The list of secrets in the given scope without the secret values.

Source code in src/zenml/client.py
def list_secrets_in_scope(
    self,
    scope: SecretScope,
    hydrate: bool = False,
) -> Page[SecretResponse]:
    """Fetches the list of secret in a given scope.

    The returned secrets do not contain the secret values. To get the
    secret values, use `get_secret` individually for each secret.

    Args:
        scope: The secrets scope to search for.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The list of secrets in the given scope without the secret values.
    """
    logger.debug(f"Fetching the secrets in scope {scope.value}.")

    return self.list_secrets(scope=scope, hydrate=hydrate)

list_service_accounts(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, description=None, active=None, hydrate=False)

List all service accounts.

Parameters:

  • sort_by (str, default 'created'): The column to sort by.
  • page (int, default PAGINATION_STARTING_PAGE): The page of items.
  • size (int, default PAGE_SIZE_DEFAULT): The maximum size of all pages.
  • logical_operator (LogicalOperators, default AND): Which logical operator to use [and, or].
  • id (Optional[Union[UUID, str]], default None): Use the id of service accounts to filter by.
  • created (Optional[Union[datetime, str]], default None): Use to filter by time of creation.
  • updated (Optional[Union[datetime, str]], default None): Use the last updated date for filtering.
  • name (Optional[str], default None): Use the service account name for filtering.
  • description (Optional[str], default None): Use the service account description for filtering.
  • active (Optional[bool], default None): Use the service account active status for filtering.
  • hydrate (bool, default False): Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

Returns:

  Page[ServiceAccountResponse]: The list of service accounts matching the filter description.

Source code in src/zenml/client.py
def list_service_accounts(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    description: Optional[str] = None,
    active: Optional[bool] = None,
    hydrate: bool = False,
) -> Page[ServiceAccountResponse]:
    """List all service accounts.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of service accounts to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        name: Use the service account name for filtering
        description: Use the service account description for filtering
        active: Use the service account active status for filtering
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        The list of service accounts matching the filter description.
    """
    return self.zen_store.list_service_accounts(
        ServiceAccountFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            name=name,
            description=description,
            active=active,
        ),
        hydrate=hydrate,
    )
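
A minimal sketch of filtering this listing, assuming the returned Page exposes .items:

    from zenml.client import Client

    client = Client()
    # Only active service accounts, 20 per page.
    accounts = client.list_service_accounts(active=True, size=20)
    for account in accounts.items:
        print(account.name)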

list_service_connector_resources(connector_type=None, resource_type=None, resource_id=None)

List resources that can be accessed by service connectors.

Parameters:

  • connector_type (Optional[str], default None): The type of service connector to filter by.
  • resource_type (Optional[str], default None): The type of resource to filter by.
  • resource_id (Optional[str], default None): The ID of a particular resource instance to filter by.

Returns:

  List[ServiceConnectorResourcesModel]: The matching list of resources that available service connectors have access to.

Source code in src/zenml/client.py
def list_service_connector_resources(
    self,
    connector_type: Optional[str] = None,
    resource_type: Optional[str] = None,
    resource_id: Optional[str] = None,
) -> List[ServiceConnectorResourcesModel]:
    """List resources that can be accessed by service connectors.

    Args:
        connector_type: The type of service connector to filter by.
        resource_type: The type of resource to filter by.
        resource_id: The ID of a particular resource instance to filter by.

    Returns:
        The matching list of resources that available service
        connectors have access to.
    """
    return self.zen_store.list_service_connector_resources(
        workspace_name_or_id=self.active_workspace.id,
        connector_type=connector_type,
        resource_type=resource_type,
        resource_id=resource_id,
    )
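
As a hedged example of resource discovery (the connector and resource type strings below are illustrative, not a fixed contract):

    from zenml.client import Client

    client = Client()
    # Which S3 buckets can the registered AWS connectors reach?
    resources = client.list_service_connector_resources(
        connector_type="aws",
        resource_type="s3-bucket",
    )
    for entry in resources:
        print(entry)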

list_service_connector_types(connector_type=None, resource_type=None, auth_method=None)

Get a list of service connector types.

Parameters:

  • connector_type (Optional[str], default None): Filter by connector type.
  • resource_type (Optional[str], default None): Filter by resource type.
  • auth_method (Optional[str], default None): Filter by authentication method.

Returns:

  List[ServiceConnectorTypeModel]: List of service connector types.

Source code in src/zenml/client.py
def list_service_connector_types(
    self,
    connector_type: Optional[str] = None,
    resource_type: Optional[str] = None,
    auth_method: Optional[str] = None,
) -> List[ServiceConnectorTypeModel]:
    """Get a list of service connector types.

    Args:
        connector_type: Filter by connector type.
        resource_type: Filter by resource type.
        auth_method: Filter by authentication method.

    Returns:
        List of service connector types.
    """
    return self.zen_store.list_service_connector_types(
        connector_type=connector_type,
        resource_type=resource_type,
        auth_method=auth_method,
    )

list_service_connectors(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, connector_type=None, auth_method=None, resource_type=None, resource_id=None, workspace_id=None, user_id=None, user=None, labels=None, secret_id=None, hydrate=False)

Lists all registered service connectors.

Parameters:

  • sort_by (str, default 'created'): The column to sort by.
  • page (int, default PAGINATION_STARTING_PAGE): The page of items.
  • size (int, default PAGE_SIZE_DEFAULT): The maximum size of all pages.
  • logical_operator (LogicalOperators, default AND): Which logical operator to use [and, or].
  • id (Optional[Union[UUID, str]], default None): The id of the service connector to filter by.
  • created (Optional[datetime], default None): Filter service connectors by time of creation.
  • updated (Optional[datetime], default None): Use the last updated date for filtering.
  • connector_type (Optional[str], default None): Use the service connector type for filtering.
  • auth_method (Optional[str], default None): Use the service connector auth method for filtering.
  • resource_type (Optional[str], default None): Filter service connectors by the resource type that they can give access to.
  • resource_id (Optional[str], default None): Filter service connectors by the resource id that they can give access to.
  • workspace_id (Optional[Union[str, UUID]], default None): The id of the workspace to filter by.
  • user_id (Optional[Union[str, UUID]], default None): The id of the user to filter by.
  • user (Optional[Union[UUID, str]], default None): Filter by user name/ID.
  • name (Optional[str], default None): The name of the service connector to filter by.
  • labels (Optional[Dict[str, Optional[str]]], default None): The labels of the service connector to filter by.
  • secret_id (Optional[Union[str, UUID]], default None): Filter by the id of the secret that is referenced by the service connector.
  • hydrate (bool, default False): Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

Returns:

  Page[ServiceConnectorResponse]: A page of service connectors.

Source code in src/zenml/client.py
def list_service_connectors(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[datetime] = None,
    updated: Optional[datetime] = None,
    name: Optional[str] = None,
    connector_type: Optional[str] = None,
    auth_method: Optional[str] = None,
    resource_type: Optional[str] = None,
    resource_id: Optional[str] = None,
    workspace_id: Optional[Union[str, UUID]] = None,
    user_id: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    labels: Optional[Dict[str, Optional[str]]] = None,
    secret_id: Optional[Union[str, UUID]] = None,
    hydrate: bool = False,
) -> Page[ServiceConnectorResponse]:
    """Lists all registered service connectors.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: The id of the service connector to filter by.
        created: Filter service connectors by time of creation
        updated: Use the last updated date for filtering
        connector_type: Use the service connector type for filtering
        auth_method: Use the service connector auth method for filtering
        resource_type: Filter service connectors by the resource type that
            they can give access to.
        resource_id: Filter service connectors by the resource id that
            they can give access to.
        workspace_id: The id of the workspace to filter by.
        user_id: The id of the user to filter by.
        user: Filter by user name/ID.
        name: The name of the service connector to filter by.
        labels: The labels of the service connector to filter by.
        secret_id: Filter by the id of the secret that is referenced by the
            service connector.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of service connectors.
    """
    connector_filter_model = ServiceConnectorFilter(
        page=page,
        size=size,
        sort_by=sort_by,
        logical_operator=logical_operator,
        workspace_id=workspace_id or self.active_workspace.id,
        user_id=user_id,
        user=user,
        name=name,
        connector_type=connector_type,
        auth_method=auth_method,
        resource_type=resource_type,
        resource_id=resource_id,
        id=id,
        created=created,
        updated=updated,
        labels=labels,
        secret_id=secret_id,
    )
    connector_filter_model.set_scope_workspace(self.active_workspace.id)
    return self.zen_store.list_service_connectors(
        filter_model=connector_filter_model,
        hydrate=hydrate,
    )
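
A hedged sketch combining the type and label filters documented above (the connector type and label values are illustrative, and the Page is assumed to expose .items):

    from zenml.client import Client

    client = Client()
    # Connectors of a given type carrying a specific label.
    connectors = client.list_service_connectors(
        connector_type="gcp",
        labels={"team": "ml-platform"},
    )
    for connector in connectors.items:
        print(connector.name)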

list_services(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, type=None, flavor=None, user=None, workspace_id=None, user_id=None, hydrate=False, running=None, service_name=None, pipeline_name=None, pipeline_run_id=None, pipeline_step_name=None, model_version_id=None, config=None)

List all services.

Parameters:

  • sort_by (str, default 'created'): The column to sort by.
  • page (int, default PAGINATION_STARTING_PAGE): The page of items.
  • size (int, default PAGE_SIZE_DEFAULT): The maximum size of all pages.
  • logical_operator (LogicalOperators, default AND): Which logical operator to use [and, or].
  • id (Optional[Union[UUID, str]], default None): Use the id of services to filter by.
  • created (Optional[datetime], default None): Use to filter by time of creation.
  • updated (Optional[datetime], default None): Use the last updated date for filtering.
  • type (Optional[str], default None): Use the service type for filtering.
  • flavor (Optional[str], default None): Use the service flavor for filtering.
  • workspace_id (Optional[Union[str, UUID]], default None): The id of the workspace to filter by.
  • user_id (Optional[Union[str, UUID]], default None): The id of the user to filter by.
  • user (Optional[Union[UUID, str]], default None): Filter by user name/ID.
  • hydrate (bool, default False): Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.
  • running (Optional[bool], default None): Use the running status for filtering.
  • service_name (Optional[str], default None): Use the service name or model name for filtering.
  • pipeline_name (Optional[str], default None): Use the pipeline name for filtering.
  • pipeline_run_id (Optional[str], default None): Use the pipeline run id for filtering.
  • pipeline_step_name (Optional[str], default None): Use the pipeline step name for filtering.
  • model_version_id (Optional[Union[str, UUID]], default None): Use the model version id for filtering.
  • config (Optional[Dict[str, Any]], default None): Use the config for filtering.

Returns:

  Page[ServiceResponse]: The Service response page.

Source code in src/zenml/client.py
def list_services(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[datetime] = None,
    updated: Optional[datetime] = None,
    type: Optional[str] = None,
    flavor: Optional[str] = None,
    user: Optional[Union[UUID, str]] = None,
    workspace_id: Optional[Union[str, UUID]] = None,
    user_id: Optional[Union[str, UUID]] = None,
    hydrate: bool = False,
    running: Optional[bool] = None,
    service_name: Optional[str] = None,
    pipeline_name: Optional[str] = None,
    pipeline_run_id: Optional[str] = None,
    pipeline_step_name: Optional[str] = None,
    model_version_id: Optional[Union[str, UUID]] = None,
    config: Optional[Dict[str, Any]] = None,
) -> Page[ServiceResponse]:
    """List all services.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of services to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        type: Use the service type for filtering
        flavor: Use the service flavor for filtering
        workspace_id: The id of the workspace to filter by.
        user_id: The id of the user to filter by.
        user: Filter by user name/ID.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.
        running: Use the running status for filtering
        pipeline_name: Use the pipeline name for filtering
        service_name: Use the service name or model name
            for filtering
        pipeline_step_name: Use the pipeline step name for filtering
        model_version_id: Use the model version id for filtering
        config: Use the config for filtering
        pipeline_run_id: Use the pipeline run id for filtering

    Returns:
        The Service response page.
    """
    service_filter_model = ServiceFilter(
        sort_by=sort_by,
        page=page,
        size=size,
        logical_operator=logical_operator,
        id=id,
        created=created,
        updated=updated,
        type=type,
        flavor=flavor,
        workspace_id=workspace_id,
        user_id=user_id,
        user=user,
        running=running,
        name=service_name,
        pipeline_name=pipeline_name,
        pipeline_step_name=pipeline_step_name,
        model_version_id=model_version_id,
        pipeline_run_id=pipeline_run_id,
        config=dict_to_bytes(config) if config else None,
    )
    service_filter_model.set_scope_workspace(self.active_workspace.id)
    return self.zen_store.list_services(
        filter_model=service_filter_model, hydrate=hydrate
    )
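
For illustration, a sketch that filters on running status and pipeline name (the pipeline name is hypothetical; the Page is assumed to expose .items):

    from zenml.client import Client

    client = Client()
    # Currently running services started by a given pipeline.
    services = client.list_services(running=True, pipeline_name="training_pipeline")
    for service in services.items:
        print(service.id)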

list_stack_components(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, flavor=None, type=None, workspace_id=None, user_id=None, connector_id=None, stack_id=None, user=None, hydrate=False)

Lists all registered stack components.

Parameters:

  • sort_by (str, default 'created'): The column to sort by.
  • page (int, default PAGINATION_STARTING_PAGE): The page of items.
  • size (int, default PAGE_SIZE_DEFAULT): The maximum size of all pages.
  • logical_operator (LogicalOperators, default AND): Which logical operator to use [and, or].
  • id (Optional[Union[UUID, str]], default None): Use the id of the component to filter by.
  • created (Optional[datetime], default None): Use to filter components by time of creation.
  • updated (Optional[datetime], default None): Use the last updated date for filtering.
  • flavor (Optional[str], default None): Use the component flavor for filtering.
  • type (Optional[str], default None): Use the component type for filtering.
  • workspace_id (Optional[Union[str, UUID]], default None): The id of the workspace to filter by.
  • user_id (Optional[Union[str, UUID]], default None): The id of the user to filter by.
  • connector_id (Optional[Union[str, UUID]], default None): The id of the connector to filter by.
  • stack_id (Optional[Union[str, UUID]], default None): The id of the stack to filter by.
  • name (Optional[str], default None): The name of the component to filter by.
  • user (Optional[Union[UUID, str]], default None): The ID or name of the user to filter by.
  • hydrate (bool, default False): Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

Returns:

  Page[ComponentResponse]: A page of stack components.

Source code in src/zenml/client.py
def list_stack_components(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[datetime] = None,
    updated: Optional[datetime] = None,
    name: Optional[str] = None,
    flavor: Optional[str] = None,
    type: Optional[str] = None,
    workspace_id: Optional[Union[str, UUID]] = None,
    user_id: Optional[Union[str, UUID]] = None,
    connector_id: Optional[Union[str, UUID]] = None,
    stack_id: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[ComponentResponse]:
    """Lists all registered stack components.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of component to filter by.
        created: Use to filter components by time of creation
        updated: Use the last updated date for filtering
        flavor: Use the component flavor for filtering
        type: Use the component type for filtering
        workspace_id: The id of the workspace to filter by.
        user_id: The id of the user to filter by.
        connector_id: The id of the connector to filter by.
        stack_id: The id of the stack to filter by.
        name: The name of the component to filter by.
        user: The ID or name of the user to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of stack components.
    """
    component_filter_model = ComponentFilter(
        page=page,
        size=size,
        sort_by=sort_by,
        logical_operator=logical_operator,
        workspace_id=workspace_id or self.active_workspace.id,
        user_id=user_id,
        connector_id=connector_id,
        stack_id=stack_id,
        name=name,
        flavor=flavor,
        type=type,
        id=id,
        created=created,
        updated=updated,
        user=user,
    )
    component_filter_model.set_scope_workspace(self.active_workspace.id)

    return self.zen_store.list_stack_components(
        component_filter_model=component_filter_model, hydrate=hydrate
    )
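
A brief sketch of filtering components by type; "artifact_store" is assumed to be a valid component type string, and the Page is assumed to expose .items:

    from zenml.client import Client

    client = Client()
    # All artifact stores registered in the active workspace.
    artifact_stores = client.list_stack_components(type="artifact_store")
    for component in artifact_stores.items:
        print(component.name)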

list_stacks(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, description=None, workspace_id=None, user_id=None, component_id=None, user=None, component=None, hydrate=False)

Lists all stacks.

Parameters:

  • sort_by (str, default 'created'): The column to sort by.
  • page (int, default PAGINATION_STARTING_PAGE): The page of items.
  • size (int, default PAGE_SIZE_DEFAULT): The maximum size of all pages.
  • logical_operator (LogicalOperators, default AND): Which logical operator to use [and, or].
  • id (Optional[Union[UUID, str]], default None): Use the id of stacks to filter by.
  • created (Optional[Union[datetime, str]], default None): Use to filter by time of creation.
  • updated (Optional[Union[datetime, str]], default None): Use the last updated date for filtering.
  • name (Optional[str], default None): The name of the stack to filter by.
  • description (Optional[str], default None): Use the stack description for filtering.
  • workspace_id (Optional[Union[str, UUID]], default None): The id of the workspace to filter by.
  • user_id (Optional[Union[str, UUID]], default None): The id of the user to filter by.
  • component_id (Optional[Union[str, UUID]], default None): The id of the component to filter by.
  • user (Optional[Union[UUID, str]], default None): The name/ID of the user to filter by.
  • component (Optional[Union[UUID, str]], default None): The name/ID of the component to filter by.
  • hydrate (bool, default False): Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

Returns:

  Page[StackResponse]: A page of stacks.

Source code in src/zenml/client.py
def list_stacks(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    description: Optional[str] = None,
    workspace_id: Optional[Union[str, UUID]] = None,
    user_id: Optional[Union[str, UUID]] = None,
    component_id: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    component: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[StackResponse]:
    """Lists all stacks.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of stacks to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        description: Use the stack description for filtering
        workspace_id: The id of the workspace to filter by.
        user_id: The id of the user to filter by.
        component_id: The id of the component to filter by.
        user: The name/ID of the user to filter by.
        component: The name/ID of the component to filter by.
        name: The name of the stack to filter by.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of stacks.
    """
    stack_filter_model = StackFilter(
        page=page,
        size=size,
        sort_by=sort_by,
        logical_operator=logical_operator,
        workspace_id=workspace_id,
        user_id=user_id,
        component_id=component_id,
        user=user,
        component=component,
        name=name,
        description=description,
        id=id,
        created=created,
        updated=updated,
    )
    stack_filter_model.set_scope_workspace(self.active_workspace.id)
    return self.zen_store.list_stacks(stack_filter_model, hydrate=hydrate)
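
As a sketch, stacks can be looked up by the name of a component they contain (the component name below is purely illustrative):

    from zenml.client import Client

    client = Client()
    # Stacks that include a component matched by name.
    stacks = client.list_stacks(component="local_docker_orchestrator")
    for stack in stacks.items:
        print(stack.name)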

list_tags(tag_filter_model, hydrate=False)

Get tags by filter.

Parameters:

  • tag_filter_model (TagFilter, required): All filter parameters including pagination params.
  • hydrate (bool, default False): Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

Returns:

  Page[TagResponse]: A page of all tags.

Source code in src/zenml/client.py
def list_tags(
    self,
    tag_filter_model: TagFilter,
    hydrate: bool = False,
) -> Page[TagResponse]:
    """Get tags by filter.

    Args:
        tag_filter_model: All filter parameters including pagination
            params.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of all tags.
    """
    return self.zen_store.list_tags(
        tag_filter_model=tag_filter_model, hydrate=hydrate
    )
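
Unlike most list_* methods, this one takes a pre-built filter model. A hedged sketch, assuming TagFilter is importable from zenml.models and that the "contains:" filter prefix is supported here as it is for other string filters:

    from zenml.client import Client
    from zenml.models import TagFilter

    client = Client()
    tags = client.list_tags(TagFilter(name="contains:prod", size=50))
    for tag in tags.items:
        print(tag.name)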

list_trigger_executions(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, trigger_id=None, user=None, hydrate=False)

List all trigger executions matching the given filter criteria.

Parameters:

  • sort_by (str, default 'created'): The column to sort by.
  • page (int, default PAGINATION_STARTING_PAGE): The page of items.
  • size (int, default PAGE_SIZE_DEFAULT): The maximum size of all pages.
  • logical_operator (LogicalOperators, default AND): Which logical operator to use [and, or].
  • trigger_id (Optional[UUID], default None): ID of the trigger to filter by.
  • user (Optional[Union[UUID, str]], default None): Filter by user name/ID.
  • hydrate (bool, default False): Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

Returns:

  Page[TriggerExecutionResponse]: A list of all trigger executions matching the filter criteria.

Source code in src/zenml/client.py
def list_trigger_executions(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    trigger_id: Optional[UUID] = None,
    user: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[TriggerExecutionResponse]:
    """List all trigger executions matching the given filter criteria.

    Args:
        sort_by: The column to sort by.
        page: The page of items.
        size: The maximum size of all pages.
        logical_operator: Which logical operator to use [and, or].
        trigger_id: ID of the trigger to filter by.
        user: Filter by user name/ID.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A list of all trigger executions matching the filter criteria.
    """
    filter_model = TriggerExecutionFilter(
        trigger_id=trigger_id,
        sort_by=sort_by,
        page=page,
        size=size,
        user=user,
        logical_operator=logical_operator,
    )
    filter_model.set_scope_workspace(self.active_workspace.id)
    return self.zen_store.list_trigger_executions(
        trigger_execution_filter_model=filter_model, hydrate=hydrate
    )

list_triggers(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, event_source_id=None, action_id=None, event_source_flavor=None, event_source_subtype=None, action_flavor=None, action_subtype=None, workspace_id=None, user_id=None, user=None, hydrate=False)

Lists all triggers.

Parameters:

  • sort_by (str, default 'created'): The column to sort by.
  • page (int, default PAGINATION_STARTING_PAGE): The page of items.
  • size (int, default PAGE_SIZE_DEFAULT): The maximum size of all pages.
  • logical_operator (LogicalOperators, default AND): Which logical operator to use [and, or].
  • id (Optional[Union[UUID, str]], default None): Use the id of triggers to filter by.
  • created (Optional[datetime], default None): Use to filter by time of creation.
  • updated (Optional[datetime], default None): Use the last updated date for filtering.
  • workspace_id (Optional[Union[str, UUID]], default None): The id of the workspace to filter by.
  • user_id (Optional[Union[str, UUID]], default None): The id of the user to filter by.
  • user (Optional[Union[UUID, str]], default None): Filter by user name/ID.
  • name (Optional[str], default None): The name of the trigger to filter by.
  • event_source_id (Optional[UUID], default None): The event source associated with the trigger.
  • action_id (Optional[UUID], default None): The action associated with the trigger.
  • event_source_flavor (Optional[str], default None): Flavor of the event source associated with the trigger.
  • event_source_subtype (Optional[str], default None): Type of the event source associated with the trigger.
  • action_flavor (Optional[str], default None): Flavor of the action associated with the trigger.
  • action_subtype (Optional[str], default None): Type of the action associated with the trigger.
  • hydrate (bool, default False): Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

Returns:

  Page[TriggerResponse]: A page of triggers.

Source code in src/zenml/client.py
@_fail_for_sql_zen_store
def list_triggers(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[datetime] = None,
    updated: Optional[datetime] = None,
    name: Optional[str] = None,
    event_source_id: Optional[UUID] = None,
    action_id: Optional[UUID] = None,
    event_source_flavor: Optional[str] = None,
    event_source_subtype: Optional[str] = None,
    action_flavor: Optional[str] = None,
    action_subtype: Optional[str] = None,
    workspace_id: Optional[Union[str, UUID]] = None,
    user_id: Optional[Union[str, UUID]] = None,
    user: Optional[Union[UUID, str]] = None,
    hydrate: bool = False,
) -> Page[TriggerResponse]:
    """Lists all triggers.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of triggers to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        workspace_id: The id of the workspace to filter by.
        user_id: The id of the user to filter by.
        user: Filter by user name/ID.
        name: The name of the trigger to filter by.
        event_source_id: The event source associated with the trigger.
        action_id: The action associated with the trigger.
        event_source_flavor: Flavor of the event source associated with the
            trigger.
        event_source_subtype: Type of the event source associated with the
            trigger.
        action_flavor: Flavor of the action associated with the trigger.
        action_subtype: Type of the action associated with the trigger.
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of triggers.
    """
    trigger_filter_model = TriggerFilter(
        page=page,
        size=size,
        sort_by=sort_by,
        logical_operator=logical_operator,
        workspace_id=workspace_id,
        user_id=user_id,
        user=user,
        name=name,
        event_source_id=event_source_id,
        action_id=action_id,
        event_source_flavor=event_source_flavor,
        event_source_subtype=event_source_subtype,
        action_flavor=action_flavor,
        action_subtype=action_subtype,
        id=id,
        created=created,
        updated=updated,
    )
    trigger_filter_model.set_scope_workspace(self.active_workspace.id)
    return self.zen_store.list_triggers(
        trigger_filter_model, hydrate=hydrate
    )
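
A usage sketch for the flavor filter; the event source flavor string below is illustrative and depends on which event source plugins are installed on the server:

    from zenml.client import Client

    client = Client()
    # Triggers wired to a particular event source flavor.
    triggers = client.list_triggers(event_source_flavor="github")
    for trigger in triggers.items:
        print(trigger.name)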

list_users(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, external_user_id=None, created=None, updated=None, name=None, full_name=None, email=None, active=None, email_opted_in=None, hydrate=False)

List all users.

Parameters:

  • sort_by (str, default 'created'): The column to sort by.
  • page (int, default PAGINATION_STARTING_PAGE): The page of items.
  • size (int, default PAGE_SIZE_DEFAULT): The maximum size of all pages.
  • logical_operator (LogicalOperators, default AND): Which logical operator to use [and, or].
  • id (Optional[Union[UUID, str]], default None): Use the id of users to filter by.
  • external_user_id (Optional[str], default None): Use the external user id for filtering.
  • created (Optional[Union[datetime, str]], default None): Use to filter by time of creation.
  • updated (Optional[Union[datetime, str]], default None): Use the last updated date for filtering.
  • name (Optional[str], default None): Use the username for filtering.
  • full_name (Optional[str], default None): Use the user full name for filtering.
  • email (Optional[str], default None): Use the user email for filtering.
  • active (Optional[bool], default None): Use the user active status for filtering.
  • email_opted_in (Optional[bool], default None): Use the user opt-in status for filtering.
  • hydrate (bool, default False): Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

Returns:

  Page[UserResponse]: A page of users matching the filter.

Source code in src/zenml/client.py
def list_users(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    external_user_id: Optional[str] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    full_name: Optional[str] = None,
    email: Optional[str] = None,
    active: Optional[bool] = None,
    email_opted_in: Optional[bool] = None,
    hydrate: bool = False,
) -> Page[UserResponse]:
    """List all users.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the id of users to filter by.
        external_user_id: Use the external user id for filtering.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        name: Use the username for filtering
        full_name: Use the user full name for filtering
        email: Use the user email for filtering
        active: Use the user active status for filtering
        email_opted_in: Use the user opt in status for filtering
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        A page of users matching the filter.
    """
    return self.zen_store.list_users(
        UserFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            external_user_id=external_user_id,
            created=created,
            updated=updated,
            name=name,
            full_name=full_name,
            email=email,
            active=active,
            email_opted_in=email_opted_in,
        ),
        hydrate=hydrate,
    )
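
A short sketch combining the active flag with the "contains:" filter syntax on the email field (the domain is hypothetical):

    from zenml.client import Client

    client = Client()
    # Active users whose email contains a given domain.
    users = client.list_users(active=True, email="contains:@example.com")
    for user in users.items:
        print(user.name)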

list_workspaces(sort_by='created', page=PAGINATION_STARTING_PAGE, size=PAGE_SIZE_DEFAULT, logical_operator=LogicalOperators.AND, id=None, created=None, updated=None, name=None, hydrate=False)

List all workspaces.

Parameters:

  • sort_by (str, default 'created'): The column to sort by.
  • page (int, default PAGINATION_STARTING_PAGE): The page of items.
  • size (int, default PAGE_SIZE_DEFAULT): The maximum size of all pages.
  • logical_operator (LogicalOperators, default AND): Which logical operator to use [and, or].
  • id (Optional[Union[UUID, str]], default None): Use the workspace ID to filter by.
  • created (Optional[Union[datetime, str]], default None): Use to filter by time of creation.
  • updated (Optional[Union[datetime, str]], default None): Use the last updated date for filtering.
  • name (Optional[str], default None): Use the workspace name for filtering.
  • hydrate (bool, default False): Flag deciding whether to hydrate the output model(s) by including metadata fields in the response.

Returns:

  Page[WorkspaceResponse]: Page of workspaces.

Source code in src/zenml/client.py
def list_workspaces(
    self,
    sort_by: str = "created",
    page: int = PAGINATION_STARTING_PAGE,
    size: int = PAGE_SIZE_DEFAULT,
    logical_operator: LogicalOperators = LogicalOperators.AND,
    id: Optional[Union[UUID, str]] = None,
    created: Optional[Union[datetime, str]] = None,
    updated: Optional[Union[datetime, str]] = None,
    name: Optional[str] = None,
    hydrate: bool = False,
) -> Page[WorkspaceResponse]:
    """List all workspaces.

    Args:
        sort_by: The column to sort by
        page: The page of items
        size: The maximum size of all pages
        logical_operator: Which logical operator to use [and, or]
        id: Use the workspace ID to filter by.
        created: Use to filter by time of creation
        updated: Use the last updated date for filtering
        name: Use the workspace name for filtering
        hydrate: Flag deciding whether to hydrate the output model(s)
            by including metadata fields in the response.

    Returns:
        Page of workspaces
    """
    return self.zen_store.list_workspaces(
        WorkspaceFilter(
            sort_by=sort_by,
            page=page,
            size=size,
            logical_operator=logical_operator,
            id=id,
            created=created,
            updated=updated,
            name=name,
        ),
        hydrate=hydrate,
    )

login_service_connector(name_id_or_prefix, resource_type=None, resource_id=None, **kwargs)

Use a service connector to authenticate a local client/SDK.

Parameters:

  • name_id_or_prefix (Union[UUID, str], required): The name, id or prefix of the service connector to use.
  • resource_type (Optional[str], default None): The type of the resource to connect to. If not provided, the resource type from the service connector configuration will be used.
  • resource_id (Optional[str], default None): The ID of a particular resource instance to configure the local client to connect to. If the connector instance is already configured with a resource ID that is not the same or equivalent to the one requested, a ValueError exception is raised. May be omitted for connectors and resource types that do not support multiple resource instances.
  • kwargs (Any, default {}): Additional implementation specific keyword arguments to use to configure the client.

Returns:

  ServiceConnector: The service connector client instance that was used to configure the local client.

Source code in src/zenml/client.py
def login_service_connector(
    self,
    name_id_or_prefix: Union[UUID, str],
    resource_type: Optional[str] = None,
    resource_id: Optional[str] = None,
    **kwargs: Any,
) -> "ServiceConnector":
    """Use a service connector to authenticate a local client/SDK.

    Args:
        name_id_or_prefix: The name, id or prefix of the service connector
            to use.
        resource_type: The type of the resource to connect to. If not
            provided, the resource type from the service connector
            configuration will be used.
        resource_id: The ID of a particular resource instance to configure
            the local client to connect to. If the connector instance is
            already configured with a resource ID that is not the same or
            equivalent to the one requested, a `ValueError` exception is
            raised. May be omitted for connectors and resource types that do
            not support multiple resource instances.
        kwargs: Additional implementation specific keyword arguments to use
            to configure the client.

    Returns:
        The service connector client instance that was used to configure the
        local client.
    """
    connector_client = self.get_service_connector_client(
        name_id_or_prefix=name_id_or_prefix,
        resource_type=resource_type,
        resource_id=resource_id,
        verify=False,
    )

    connector_client.configure_local_client(
        **kwargs,
    )

    return connector_client
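
For illustration only, a sketch of configuring a local cloud SDK/CLI through a registered connector; the connector name and resource type strings are hypothetical:

    from zenml.client import Client

    client = Client()
    # Configure the local client (e.g. the AWS SDK) via a registered connector.
    client.login_service_connector(
        name_id_or_prefix="aws-prod",
        resource_type="s3-bucket",
    )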

prune_artifacts(only_versions=True, delete_from_artifact_store=False)

Delete all unused artifacts and artifact versions.

Parameters:

  • only_versions (bool, default True): Only delete artifact versions, keeping artifacts.
  • delete_from_artifact_store (bool, default False): Delete data from artifact metadata.

Source code in src/zenml/client.py
def prune_artifacts(
    self,
    only_versions: bool = True,
    delete_from_artifact_store: bool = False,
) -> None:
    """Delete all unused artifacts and artifact versions.

    Args:
        only_versions: Only delete artifact versions, keeping artifacts
        delete_from_artifact_store: Delete data from artifact metadata
    """
    if delete_from_artifact_store:
        unused_artifact_versions = depaginate(
            self.list_artifact_versions, only_unused=True
        )
        for unused_artifact_version in unused_artifact_versions:
            self._delete_artifact_from_artifact_store(
                unused_artifact_version
            )

    self.zen_store.prune_artifact_versions(only_versions)
    logger.info("All unused artifacts and artifact versions deleted.")
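
A minimal sketch using only the two documented flags:

    from zenml.client import Client

    client = Client()
    # Drop unused artifact versions and also delete their data from the
    # artifact store; omit delete_from_artifact_store to keep the files.
    client.prune_artifacts(only_versions=True, delete_from_artifact_store=True)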

restore_secrets(ignore_errors=False, delete_secrets=False)

Restore all secrets from the configured backup secrets store.

Parameters:

  • ignore_errors (bool, default False): Whether to ignore individual errors during the restore process and attempt to restore all secrets.
  • delete_secrets (bool, default False): Whether to delete the secrets that have been successfully restored from the backup secrets store. Setting this flag effectively moves all secrets from the backup secrets store to the primary secrets store.

Source code in src/zenml/client.py
def restore_secrets(
    self,
    ignore_errors: bool = False,
    delete_secrets: bool = False,
) -> None:
    """Restore all secrets from the configured backup secrets store.

    Args:
        ignore_errors: Whether to ignore individual errors during the
            restore process and attempt to restore all secrets.
        delete_secrets: Whether to delete the secrets that have been
            successfully restored from the backup secrets store. Setting
            this flag effectively moves all secrets from the backup secrets
            store to the primary secrets store.
    """
    self.zen_store.restore_secrets(
        ignore_errors=ignore_errors, delete_secrets=delete_secrets
    )

rotate_api_key(service_account_name_id_or_prefix, name_id_or_prefix, retain_period_minutes=0, set_key=False)

Rotate an API key.

Parameters:

  • service_account_name_id_or_prefix (Union[str, UUID], required): The name, ID or prefix of the service account to rotate the API key for.
  • name_id_or_prefix (Union[UUID, str], required): Name, ID or prefix of the API key to update.
  • retain_period_minutes (int, default 0): The number of minutes to retain the old API key for. If set to 0, the old API key will be invalidated.
  • set_key (bool, default False): Whether to set the rotated API key as the active API key.

Returns:

  APIKeyResponse: The updated API key.

Source code in src/zenml/client.py
def rotate_api_key(
    self,
    service_account_name_id_or_prefix: Union[str, UUID],
    name_id_or_prefix: Union[UUID, str],
    retain_period_minutes: int = 0,
    set_key: bool = False,
) -> APIKeyResponse:
    """Rotate an API key.

    Args:
        service_account_name_id_or_prefix: The name, ID or prefix of the
            service account to rotate the API key for.
        name_id_or_prefix: Name, ID or prefix of the API key to update.
        retain_period_minutes: The number of minutes to retain the old API
            key for. If set to 0, the old API key will be invalidated.
        set_key: Whether to set the rotated API key as the active API key.

    Returns:
        The updated API key.
    """
    api_key = self.get_api_key(
        service_account_name_id_or_prefix=service_account_name_id_or_prefix,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
    )
    rotate_request = APIKeyRotateRequest(
        retain_period_minutes=retain_period_minutes
    )
    new_key = self.zen_store.rotate_api_key(
        service_account_id=api_key.service_account.id,
        api_key_name_or_id=api_key.id,
        rotate_request=rotate_request,
    )
    assert new_key.key is not None
    if set_key:
        self.set_api_key(key=new_key.key)

    return new_key
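
A usage sketch based on the documented parameters (the service account and key names are hypothetical):

    from zenml.client import Client

    client = Client()
    # Rotate a key, keep the old value valid for one hour, and activate the
    # new key locally right away.
    new_key = client.rotate_api_key(
        service_account_name_id_or_prefix="ci-bot",
        name_id_or_prefix="deploy-key",
        retain_period_minutes=60,
        set_key=True,
    )
    print("New key (shown only once):", new_key.key)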

set_active_workspace(workspace_name_or_id)

Set the workspace for the local client.

Parameters:

  • workspace_name_or_id (Union[str, UUID], required): The name or ID of the workspace to set active.

Returns:

  WorkspaceResponse: The model of the active workspace.

Source code in src/zenml/client.py
def set_active_workspace(
    self, workspace_name_or_id: Union[str, UUID]
) -> "WorkspaceResponse":
    """Set the workspace for the local client.

    Args:
        workspace_name_or_id: The name or ID of the workspace to set active.

    Returns:
        The model of the active workspace.
    """
    workspace = self.zen_store.get_workspace(
        workspace_name_or_id=workspace_name_or_id
    )  # raises KeyError
    if self._config:
        self._config.set_active_workspace(workspace)
        # Sanitize the client configuration to reflect the current
        # settings
        self._sanitize_config()
    else:
        # set the active workspace globally only if the client doesn't use
        # a local configuration
        GlobalConfiguration().set_active_workspace(workspace)
    return workspace
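
A minimal sketch, assuming a workspace named "default" exists (as in a fresh deployment):

    from zenml.client import Client

    client = Client()
    workspace = client.set_active_workspace("default")
    print("Active workspace:", workspace.name)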

set_api_key(key)

Configure the client with an API key.

Parameters:

  • key (str, required): The API key to use.

Raises:

  NotImplementedError: If the client is not connected to a ZenML server.

Source code in src/zenml/client.py
def set_api_key(self, key: str) -> None:
    """Configure the client with an API key.

    Args:
        key: The API key to use.

    Raises:
        NotImplementedError: If the client is not connected to a ZenML
            server.
    """
    from zenml.login.credentials_store import get_credentials_store
    from zenml.zen_stores.rest_zen_store import RestZenStore

    zen_store = self.zen_store
    if not zen_store.TYPE == StoreType.REST:
        raise NotImplementedError(
            "API key configuration is only supported if connected to a "
            "ZenML server."
        )

    credentials_store = get_credentials_store()
    assert isinstance(zen_store, RestZenStore)

    credentials_store.set_api_key(server_url=zen_store.url, api_key=key)

    # Force a re-authentication to start using the new API key
    # right away.
    zen_store.authenticate(force=True)

trigger_pipeline(pipeline_name_or_id=None, run_configuration=None, config_path=None, template_id=None, stack_name_or_id=None, synchronous=False)

Trigger a pipeline from the server.

Usage examples:

  • Run the latest runnable template for a pipeline:

       Client().trigger_pipeline(pipeline_name_or_id=<NAME>)

  • Run the latest runnable template for a pipeline on a specific stack:

       Client().trigger_pipeline(
           pipeline_name_or_id=<NAME>,
           stack_name_or_id=<STACK_NAME_OR_ID>
       )

  • Run a specific template:

       Client().trigger_pipeline(template_id=<ID>)

Parameters:

  • pipeline_name_or_id (Union[str, UUID, None], default None): Name or ID of the pipeline. If this is specified, the latest runnable template for this pipeline will be used for the run (runnable here means that the build associated with the template is for a remote stack without any custom flavor stack components). If not given, a template ID that should be run needs to be specified.
  • run_configuration (Union[PipelineRunConfiguration, Dict[str, Any], None], default None): Configuration for the run. Either this or a path to a config file can be specified.
  • config_path (Optional[str], default None): Path to a YAML configuration file. This file will be parsed as a PipelineRunConfiguration object. Either this or the configuration in code can be specified.
  • template_id (Optional[UUID], default None): ID of the template to run. Either this or a pipeline can be specified.
  • stack_name_or_id (Union[str, UUID, None], default None): Name or ID of the stack on which to run the pipeline. If not specified, this method will try to find a runnable template on any stack.
  • synchronous (bool, default False): If True, this method will wait until the triggered run is finished.

Raises:

  RuntimeError: If triggering the pipeline failed.

Returns:

  PipelineRunResponse: Model of the pipeline run.

Source code in src/zenml/client.py
@_fail_for_sql_zen_store
def trigger_pipeline(
    self,
    pipeline_name_or_id: Union[str, UUID, None] = None,
    run_configuration: Union[
        PipelineRunConfiguration, Dict[str, Any], None
    ] = None,
    config_path: Optional[str] = None,
    template_id: Optional[UUID] = None,
    stack_name_or_id: Union[str, UUID, None] = None,
    synchronous: bool = False,
) -> PipelineRunResponse:
    """Trigger a pipeline from the server.

    Usage examples:
    * Run the latest runnable template for a pipeline:
    ```python
    Client().trigger_pipeline(pipeline_name_or_id=<NAME>)
    ```
    * Run the latest runnable template for a pipeline on a specific stack:
    ```python
    Client().trigger_pipeline(
        pipeline_name_or_id=<NAME>,
        stack_name_or_id=<STACK_NAME_OR_ID>
    )
    ```
    * Run a specific template:
    ```python
    Client().trigger_pipeline(template_id=<ID>)
    ```

    Args:
        pipeline_name_or_id: Name or ID of the pipeline. If this is
            specified, the latest runnable template for this pipeline will
            be used for the run (Runnable here means that the build
            associated with the template is for a remote stack without any
            custom flavor stack components). If not given, a template ID
            that should be run needs to be specified.
        run_configuration: Configuration for the run. Either this or a
            path to a config file can be specified.
        config_path: Path to a YAML configuration file. This file will be
            parsed as a `PipelineRunConfiguration` object. Either this or
            the configuration in code can be specified.
        template_id: ID of the template to run. Either this or a pipeline
            can be specified.
        stack_name_or_id: Name or ID of the stack on which to run the
            pipeline. If not specified, this method will try to find a
            runnable template on any stack.
        synchronous: If `True`, this method will wait until the triggered
            run is finished.

    Raises:
        RuntimeError: If triggering the pipeline failed.

    Returns:
        Model of the pipeline run.
    """
    from zenml.pipelines.run_utils import (
        validate_run_config_is_runnable_from_server,
        validate_stack_is_runnable_from_server,
        wait_for_pipeline_run_to_finish,
    )

    if Counter([template_id, pipeline_name_or_id])[None] != 1:
        raise RuntimeError(
            "You need to specify exactly one of pipeline or template "
            "to trigger."
        )

    if run_configuration and config_path:
        raise RuntimeError(
            "Only config path or runtime configuration can be specified."
        )

    if config_path:
        run_configuration = PipelineRunConfiguration.from_yaml(config_path)

    if isinstance(run_configuration, Dict):
        run_configuration = PipelineRunConfiguration.model_validate(
            run_configuration
        )

    if run_configuration:
        validate_run_config_is_runnable_from_server(run_configuration)

    if template_id:
        if stack_name_or_id:
            logger.warning(
                "Template ID and stack specified, ignoring the stack and "
                "using stack associated with the template instead."
            )

        run = self.zen_store.run_template(
            template_id=template_id,
            run_configuration=run_configuration,
        )
    else:
        assert pipeline_name_or_id
        pipeline = self.get_pipeline(name_id_or_prefix=pipeline_name_or_id)

        stack = None
        if stack_name_or_id:
            stack = self.get_stack(
                stack_name_or_id, allow_name_prefix_match=False
            )
            validate_stack_is_runnable_from_server(
                zen_store=self.zen_store, stack=stack
            )

        templates = depaginate(
            self.list_run_templates,
            pipeline_id=pipeline.id,
            stack_id=stack.id if stack else None,
        )

        for template in templates:
            if not template.build:
                continue

            stack = template.build.stack
            if not stack:
                continue

            try:
                validate_stack_is_runnable_from_server(
                    zen_store=self.zen_store, stack=stack
                )
            except ValueError:
                continue

            run = self.zen_store.run_template(
                template_id=template.id,
                run_configuration=run_configuration,
            )
            break
        else:
            raise RuntimeError(
                "Unable to find a runnable template for the given stack "
                "and pipeline."
            )

    if synchronous:
        run = wait_for_pipeline_run_to_finish(run_id=run.id)

    return run
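
As a complement to the usage examples in the docstring above, here is a minimal sketch of triggering a run with an in-code run configuration and waiting for it to finish. The pipeline name, stack name and run name are hypothetical placeholders.

```python
from zenml.client import Client

# "training_pipeline" and "remote-stack" are hypothetical names; replace with your own.
run = Client().trigger_pipeline(
    pipeline_name_or_id="training_pipeline",
    stack_name_or_id="remote-stack",
    # A dict is validated into a PipelineRunConfiguration by the method.
    run_configuration={"run_name": "triggered-from-client"},
    synchronous=True,  # block until the triggered run finishes
)
print(run.id, run.status)
```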

update_action(name_id_or_prefix, name=None, description=None, configuration=None, service_account_id=None, auth_window=None)

Update an action.

Parameters:

    name_id_or_prefix (Union[UUID, str], required): The name, id or prefix of the action to update.
    name (Optional[str], default None): The new name of the action.
    description (Optional[str], default None): The new description of the action.
    configuration (Optional[Dict[str, Any]], default None): The new configuration of the action.
    service_account_id (Optional[UUID], default None): The new service account that is used to execute the action.
    auth_window (Optional[int], default None): The new time window in minutes for which the service account is authorized to execute the action. Set this to 0 to authorize the service account indefinitely (not recommended).

Returns:

    ActionResponse: The updated action.

Source code in src/zenml/client.py
@_fail_for_sql_zen_store
def update_action(
    self,
    name_id_or_prefix: Union[UUID, str],
    name: Optional[str] = None,
    description: Optional[str] = None,
    configuration: Optional[Dict[str, Any]] = None,
    service_account_id: Optional[UUID] = None,
    auth_window: Optional[int] = None,
) -> ActionResponse:
    """Update an action.

    Args:
        name_id_or_prefix: The name, id or prefix of the action to update.
        name: The new name of the action.
        description: The new description of the action.
        configuration: The new configuration of the action.
        service_account_id: The new service account that is used to execute
            the action.
        auth_window: The new time window in minutes for which the service
            account is authorized to execute the action. Set this to 0 to
            authorize the service account indefinitely (not recommended).

    Returns:
        The updated action.
    """
    action = self.get_action(
        name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
    )

    update_model = ActionUpdate(
        name=name,
        description=description,
        configuration=configuration,
        service_account_id=service_account_id,
        auth_window=auth_window,
    )

    return self.zen_store.update_action(
        action_id=action.id,
        action_update=update_model,
    )
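
A minimal usage sketch of renaming an action and shortening its authorization window; the action name used here is hypothetical.

```python
from zenml.client import Client

# "my-action" is a hypothetical existing action name.
action = Client().update_action(
    name_id_or_prefix="my-action",
    name="my-renamed-action",
    auth_window=60,  # minutes the service account stays authorized
)
print(action.name)
```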

update_api_key(service_account_name_id_or_prefix, name_id_or_prefix, name=None, description=None, active=None)

Update an API key.

Parameters:

    service_account_name_id_or_prefix (Union[str, UUID], required): The name, ID or prefix of the service account to update the API key for.
    name_id_or_prefix (Union[UUID, str], required): Name, ID or prefix of the API key to update.
    name (Optional[str], default None): New name of the API key.
    description (Optional[str], default None): New description of the API key.
    active (Optional[bool], default None): Whether the API key is active or not.

Returns:

    APIKeyResponse: The updated API key.

Source code in src/zenml/client.py
def update_api_key(
    self,
    service_account_name_id_or_prefix: Union[str, UUID],
    name_id_or_prefix: Union[UUID, str],
    name: Optional[str] = None,
    description: Optional[str] = None,
    active: Optional[bool] = None,
) -> APIKeyResponse:
    """Update an API key.

    Args:
        service_account_name_id_or_prefix: The name, ID or prefix of the
            service account to update the API key for.
        name_id_or_prefix: Name, ID or prefix of the API key to update.
        name: New name of the API key.
        description: New description of the API key.
        active: Whether the API key is active or not.

    Returns:
        The updated API key.
    """
    api_key = self.get_api_key(
        service_account_name_id_or_prefix=service_account_name_id_or_prefix,
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
    )
    update = APIKeyUpdate(
        name=name, description=description, active=active
    )
    return self.zen_store.update_api_key(
        service_account_id=api_key.service_account.id,
        api_key_name_or_id=api_key.id,
        api_key_update=update,
    )
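
A minimal sketch of deactivating an API key on a service account; the service account and key names are hypothetical.

```python
from zenml.client import Client

api_key = Client().update_api_key(
    service_account_name_id_or_prefix="ci-bot",  # hypothetical service account
    name_id_or_prefix="deploy-key",              # hypothetical API key name
    description="Rotated out of use",
    active=False,                                # deactivate the key
)
print(api_key.name)
```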

update_artifact(name_id_or_prefix, new_name=None, add_tags=None, remove_tags=None, has_custom_name=None)

Update an artifact.

Parameters:

    name_id_or_prefix (Union[str, UUID], required): The name, ID or prefix of the artifact to update.
    new_name (Optional[str], default None): The new name of the artifact.
    add_tags (Optional[List[str]], default None): Tags to add to the artifact.
    remove_tags (Optional[List[str]], default None): Tags to remove from the artifact.
    has_custom_name (Optional[bool], default None): Whether the artifact has a custom name.

Returns:

    ArtifactResponse: The updated artifact.

Source code in src/zenml/client.py
def update_artifact(
    self,
    name_id_or_prefix: Union[str, UUID],
    new_name: Optional[str] = None,
    add_tags: Optional[List[str]] = None,
    remove_tags: Optional[List[str]] = None,
    has_custom_name: Optional[bool] = None,
) -> ArtifactResponse:
    """Update an artifact.

    Args:
        name_id_or_prefix: The name, ID or prefix of the artifact to update.
        new_name: The new name of the artifact.
        add_tags: Tags to add to the artifact.
        remove_tags: Tags to remove from the artifact.
        has_custom_name: Whether the artifact has a custom name.

    Returns:
        The updated artifact.
    """
    artifact = self.get_artifact(name_id_or_prefix=name_id_or_prefix)
    artifact_update = ArtifactUpdate(
        name=new_name,
        add_tags=add_tags,
        remove_tags=remove_tags,
        has_custom_name=has_custom_name,
    )
    return self.zen_store.update_artifact(
        artifact_id=artifact.id, artifact_update=artifact_update
    )
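
A minimal sketch of renaming an artifact and adjusting its tags; the artifact and tag names are hypothetical.

```python
from zenml.client import Client

artifact = Client().update_artifact(
    name_id_or_prefix="raw_dataset",  # hypothetical artifact name
    new_name="curated_dataset",
    add_tags=["curated"],
    remove_tags=["raw"],
)
print(artifact.name)
```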

update_artifact_version(name_id_or_prefix, version=None, add_tags=None, remove_tags=None)

Update an artifact version.

Parameters:

    name_id_or_prefix (Union[str, UUID], required): The name, ID or prefix of the artifact to update.
    version (Optional[str], default None): The version of the artifact to update. Only used if name_id_or_prefix is the name of the artifact. If not specified, the latest version is updated.
    add_tags (Optional[List[str]], default None): Tags to add to the artifact version.
    remove_tags (Optional[List[str]], default None): Tags to remove from the artifact version.

Returns:

    ArtifactVersionResponse: The updated artifact version.

Source code in src/zenml/client.py
def update_artifact_version(
    self,
    name_id_or_prefix: Union[str, UUID],
    version: Optional[str] = None,
    add_tags: Optional[List[str]] = None,
    remove_tags: Optional[List[str]] = None,
) -> ArtifactVersionResponse:
    """Update an artifact version.

    Args:
        name_id_or_prefix: The name, ID or prefix of the artifact to update.
        version: The version of the artifact to update. Only used if
            `name_id_or_prefix` is the name of the artifact. If not
            specified, the latest version is updated.
        add_tags: Tags to add to the artifact version.
        remove_tags: Tags to remove from the artifact version.

    Returns:
        The updated artifact version.
    """
    artifact_version = self.get_artifact_version(
        name_id_or_prefix=name_id_or_prefix,
        version=version,
    )
    artifact_version_update = ArtifactVersionUpdate(
        add_tags=add_tags, remove_tags=remove_tags
    )
    return self.zen_store.update_artifact_version(
        artifact_version_id=artifact_version.id,
        artifact_version_update=artifact_version_update,
    )
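
A minimal sketch of tagging one specific artifact version; the artifact name, version and tag are hypothetical.

```python
from zenml.client import Client

artifact_version = Client().update_artifact_version(
    name_id_or_prefix="curated_dataset",  # hypothetical artifact name
    version="42",                         # omit to update the latest version
    add_tags=["validated"],
)
print(artifact_version.version)
```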

update_authorized_device(id_or_prefix, locked=None)

Update an authorized device.

Parameters:

    id_or_prefix (Union[UUID, str], required): The ID or ID prefix of the authorized device.
    locked (Optional[bool], default None): Whether to lock or unlock the authorized device.

Returns:

    OAuthDeviceResponse: The updated authorized device.

Source code in src/zenml/client.py
def update_authorized_device(
    self,
    id_or_prefix: Union[UUID, str],
    locked: Optional[bool] = None,
) -> OAuthDeviceResponse:
    """Update an authorized device.

    Args:
        id_or_prefix: The ID or ID prefix of the authorized device.
        locked: Whether to lock or unlock the authorized device.

    Returns:
        The updated authorized device.
    """
    device = self.get_authorized_device(
        id_or_prefix=id_or_prefix, allow_id_prefix_match=False
    )
    return self.zen_store.update_authorized_device(
        device_id=device.id,
        update=OAuthDeviceUpdate(
            locked=locked,
        ),
    )

update_code_repository(name_id_or_prefix, name=None, description=None, logo_url=None, config=None)

Update a code repository.

Parameters:

    name_id_or_prefix (Union[UUID, str], required): Name, ID or prefix of the code repository to update.
    name (Optional[str], default None): New name of the code repository.
    description (Optional[str], default None): New description of the code repository.
    logo_url (Optional[str], default None): New logo URL of the code repository.
    config (Optional[Dict[str, Any]], default None): New configuration options for the code repository. Will be used to update the existing configuration values. To remove values from the existing configuration, set the value for that key to None.

Returns:

    CodeRepositoryResponse: The updated code repository.

Source code in src/zenml/client.py
def update_code_repository(
    self,
    name_id_or_prefix: Union[UUID, str],
    name: Optional[str] = None,
    description: Optional[str] = None,
    logo_url: Optional[str] = None,
    config: Optional[Dict[str, Any]] = None,
) -> CodeRepositoryResponse:
    """Update a code repository.

    Args:
        name_id_or_prefix: Name, ID or prefix of the code repository to
            update.
        name: New name of the code repository.
        description: New description of the code repository.
        logo_url: New logo URL of the code repository.
        config: New configuration options for the code repository. Will
            be used to update the existing configuration values. To remove
            values from the existing configuration, set the value for that
            key to `None`.

    Returns:
        The updated code repository.
    """
    repo = self.get_code_repository(
        name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
    )
    update = CodeRepositoryUpdate(
        name=name, description=description, logo_url=logo_url
    )
    if config is not None:
        combined_config = repo.config
        combined_config.update(config)
        combined_config = {
            k: v for k, v in combined_config.items() if v is not None
        }

        self._validate_code_repository_config(
            source=repo.source, config=combined_config
        )
        update.config = combined_config

    return self.zen_store.update_code_repository(
        code_repository_id=repo.id, update=update
    )
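
A minimal sketch of updating a code repository's metadata and removing a single configuration key by setting it to None; the repository name and configuration keys are hypothetical.

```python
from zenml.client import Client

repo = Client().update_code_repository(
    name_id_or_prefix="ml-monorepo",   # hypothetical repository name
    description="Main ML monorepo",
    config={
        "token": "new-token-value",    # hypothetical key: overwrite an existing value
        "legacy_setting": None,        # hypothetical key: remove it from the config
    },
)
print(repo.name)
```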

update_event_source(name_id_or_prefix, name=None, description=None, configuration=None, rotate_secret=None, is_active=None)

Updates an event_source.

Parameters:

    name_id_or_prefix (Union[UUID, str], required): The name, id or prefix of the event_source to update.
    name (Optional[str], default None): The new name of the event_source.
    description (Optional[str], default None): The new description of the event_source.
    configuration (Optional[Dict[str, Any]], default None): The event source configuration.
    rotate_secret (Optional[bool], default None): Allows rotating the secret; if True, the response will contain the new secret value.
    is_active (Optional[bool], default None): Allows activating or deactivating the event source.

Returns:

    EventSourceResponse: The model of the updated event_source.

Raises:

    EntityExistsError: If the event_source name is already taken.

Source code in src/zenml/client.py
@_fail_for_sql_zen_store
def update_event_source(
    self,
    name_id_or_prefix: Union[UUID, str],
    name: Optional[str] = None,
    description: Optional[str] = None,
    configuration: Optional[Dict[str, Any]] = None,
    rotate_secret: Optional[bool] = None,
    is_active: Optional[bool] = None,
) -> EventSourceResponse:
    """Updates a event_source.

    Args:
        name_id_or_prefix: The name, id or prefix of the event_source to update.
        name: the new name of the event_source.
        description: the new description of the event_source.
        configuration: The event source configuration.
        rotate_secret: Allows rotating the secret. If True, the response
            will contain the new secret value.
        is_active: Allows activating or deactivating the event source.

    Returns:
        The model of the updated event_source.

    Raises:
        EntityExistsError: If the event_source name is already taken.
    """
    # First, get the existing event source
    event_source = self.get_event_source(
        name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
    )

    # Create the update model
    update_model = EventSourceUpdate(
        name=name,
        description=description,
        configuration=configuration,
        rotate_secret=rotate_secret,
        is_active=is_active,
    )

    if name:
        if self.list_event_sources(name=name):
            raise EntityExistsError(
                "There are already existing event_sources with the name "
                f"'{name}'."
            )

    updated_event_source = self.zen_store.update_event_source(
        event_source_id=event_source.id,
        event_source_update=update_model,
    )
    return updated_event_source

update_model(model_name_or_id, name=None, license=None, description=None, audience=None, use_cases=None, limitations=None, trade_offs=None, ethics=None, add_tags=None, remove_tags=None, save_models_to_registry=None)

Updates an existing model in Model Control Plane.

Parameters:

    model_name_or_id (Union[str, UUID], required): The name or id of the model to be updated.
    name (Optional[str], default None): The name of the model.
    license (Optional[str], default None): The license under which the model is created.
    description (Optional[str], default None): The description of the model.
    audience (Optional[str], default None): The target audience of the model.
    use_cases (Optional[str], default None): The use cases of the model.
    limitations (Optional[str], default None): The known limitations of the model.
    trade_offs (Optional[str], default None): The tradeoffs of the model.
    ethics (Optional[str], default None): The ethical implications of the model.
    add_tags (Optional[List[str]], default None): Tags to add to the model.
    remove_tags (Optional[List[str]], default None): Tags to remove from the model.
    save_models_to_registry (Optional[bool], default None): Whether to save the model to the registry.

Returns:

    ModelResponse: The updated model.

Source code in src/zenml/client.py
def update_model(
    self,
    model_name_or_id: Union[str, UUID],
    name: Optional[str] = None,
    license: Optional[str] = None,
    description: Optional[str] = None,
    audience: Optional[str] = None,
    use_cases: Optional[str] = None,
    limitations: Optional[str] = None,
    trade_offs: Optional[str] = None,
    ethics: Optional[str] = None,
    add_tags: Optional[List[str]] = None,
    remove_tags: Optional[List[str]] = None,
    save_models_to_registry: Optional[bool] = None,
) -> ModelResponse:
    """Updates an existing model in Model Control Plane.

    Args:
        model_name_or_id: name or id of the model to be updated.
        name: The name of the model.
        license: The license under which the model is created.
        description: The description of the model.
        audience: The target audience of the model.
        use_cases: The use cases of the model.
        limitations: The known limitations of the model.
        trade_offs: The tradeoffs of the model.
        ethics: The ethical implications of the model.
        add_tags: Tags to add to the model.
        remove_tags: Tags to remove from the model.
        save_models_to_registry: Whether to save the model to the
            registry.

    Returns:
        The updated model.
    """
    if not is_valid_uuid(model_name_or_id):
        model_name_or_id = self.zen_store.get_model(model_name_or_id).id
    return self.zen_store.update_model(
        model_id=model_name_or_id,  # type:ignore[arg-type]
        model_update=ModelUpdate(
            name=name,
            license=license,
            description=description,
            audience=audience,
            use_cases=use_cases,
            limitations=limitations,
            trade_offs=trade_offs,
            ethics=ethics,
            add_tags=add_tags,
            remove_tags=remove_tags,
            save_models_to_registry=save_models_to_registry,
        ),
    )
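
A minimal sketch of filling in model card fields and re-tagging a model in the Model Control Plane; the model and tag names are hypothetical.

```python
from zenml.client import Client

model = Client().update_model(
    model_name_or_id="churn_classifier",  # hypothetical model name
    description="Predicts customer churn from usage data",
    audience="Growth and retention teams",
    add_tags=["production-candidate"],
    remove_tags=["experimental"],
)
print(model.name)
```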

update_model_version(model_name_or_id, version_name_or_id, stage=None, force=False, name=None, description=None, add_tags=None, remove_tags=None)

Updates an existing model version.

Parameters:

    model_name_or_id (Union[str, UUID], required): The name or ID of the model containing the model version.
    version_name_or_id (Union[str, UUID], required): The name or ID of the model version to be updated.
    stage (Optional[Union[str, ModelStages]], default None): Target model version stage to be set.
    force (bool, default False): Whether an existing model version in the target stage should be silently archived or an error should be raised.
    name (Optional[str], default None): Target model version name to be set.
    description (Optional[str], default None): Target model version description to be set.
    add_tags (Optional[List[str]], default None): Tags to add to the model version.
    remove_tags (Optional[List[str]], default None): Tags to remove from the model version.

Returns:

    ModelVersionResponse: The updated model version.

Source code in src/zenml/client.py
def update_model_version(
    self,
    model_name_or_id: Union[str, UUID],
    version_name_or_id: Union[str, UUID],
    stage: Optional[Union[str, ModelStages]] = None,
    force: bool = False,
    name: Optional[str] = None,
    description: Optional[str] = None,
    add_tags: Optional[List[str]] = None,
    remove_tags: Optional[List[str]] = None,
) -> ModelVersionResponse:
    """Get all model versions by filter.

    Args:
        model_name_or_id: The name or ID of the model containing model version.
        version_name_or_id: The name or ID of model version to be updated.
        stage: Target model version stage to be set.
        force: Whether existing model version in target stage should be
            silently archived or an error should be raised.
        name: Target model version name to be set.
        description: Target model version description to be set.
        add_tags: Tags to add to the model version.
        remove_tags: Tags to remove from the model version.

    Returns:
        An updated model version.
    """
    if not is_valid_uuid(model_name_or_id):
        model_name_or_id = self.get_model(model_name_or_id).id
    if not is_valid_uuid(version_name_or_id):
        version_name_or_id = self.get_model_version(
            model_name_or_id, version_name_or_id
        ).id

    return self.zen_store.update_model_version(
        model_version_id=version_name_or_id,  # type:ignore[arg-type]
        model_version_update_model=ModelVersionUpdate(
            model=model_name_or_id,
            stage=stage,
            force=force,
            name=name,
            description=description,
            add_tags=add_tags,
            remove_tags=remove_tags,
        ),
    )
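
A minimal sketch of promoting a model version to the production stage while archiving whatever currently occupies that stage; the model name and version are hypothetical, and the stage may also be passed as a ModelStages enum value.

```python
from zenml.client import Client

model_version = Client().update_model_version(
    model_name_or_id="churn_classifier",  # hypothetical model name
    version_name_or_id="7",               # hypothetical version name
    stage="production",                   # or ModelStages.PRODUCTION
    force=True,  # silently archive the current production version, if any
)
print(model_version.stage)
```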

update_run_template(name_id_or_prefix, name=None, description=None, add_tags=None, remove_tags=None)

Update a run template.

Parameters:

    name_id_or_prefix (Union[str, UUID], required): Name/ID/ID prefix of the template to update.
    name (Optional[str], default None): The new name of the run template.
    description (Optional[str], default None): The new description of the run template.
    add_tags (Optional[List[str]], default None): Tags to add to the run template.
    remove_tags (Optional[List[str]], default None): Tags to remove from the run template.

Returns:

    RunTemplateResponse: The updated run template.

Source code in src/zenml/client.py
def update_run_template(
    self,
    name_id_or_prefix: Union[str, UUID],
    name: Optional[str] = None,
    description: Optional[str] = None,
    add_tags: Optional[List[str]] = None,
    remove_tags: Optional[List[str]] = None,
) -> RunTemplateResponse:
    """Update a run template.

    Args:
        name_id_or_prefix: Name/ID/ID prefix of the template to update.
        name: The new name of the run template.
        description: The new description of the run template.
        add_tags: Tags to add to the run template.
        remove_tags: Tags to remove from the run template.

    Returns:
        The updated run template.
    """
    if is_valid_uuid(name_id_or_prefix):
        template_id = (
            UUID(name_id_or_prefix)
            if isinstance(name_id_or_prefix, str)
            else name_id_or_prefix
        )
    else:
        template_id = self.get_run_template(
            name_id_or_prefix, hydrate=False
        ).id

    return self.zen_store.update_run_template(
        template_id=template_id,
        template_update=RunTemplateUpdate(
            name=name,
            description=description,
            add_tags=add_tags,
            remove_tags=remove_tags,
        ),
    )

update_secret(name_id_or_prefix, scope=None, new_name=None, new_scope=None, add_or_update_values=None, remove_values=None)

Updates a secret.

Parameters:

    name_id_or_prefix (Union[str, UUID], required): The name, ID or ID prefix of the secret to update.
    scope (Optional[SecretScope], default None): The scope of the secret to update.
    new_name (Optional[str], default None): The new name of the secret.
    new_scope (Optional[SecretScope], default None): The new scope of the secret.
    add_or_update_values (Optional[Dict[str, str]], default None): The values to add or update.
    remove_values (Optional[List[str]], default None): The values to remove.

Returns:

    SecretResponse: The updated secret.

Raises:

    KeyError: If trying to remove a value that doesn't exist.
    ValueError: If a key is provided in both add_or_update_values and remove_values.

Source code in src/zenml/client.py
def update_secret(
    self,
    name_id_or_prefix: Union[str, UUID],
    scope: Optional[SecretScope] = None,
    new_name: Optional[str] = None,
    new_scope: Optional[SecretScope] = None,
    add_or_update_values: Optional[Dict[str, str]] = None,
    remove_values: Optional[List[str]] = None,
) -> SecretResponse:
    """Updates a secret.

    Args:
        name_id_or_prefix: The name, ID or ID prefix of the secret
            to update.
        scope: The scope of the secret to update.
        new_name: The new name of the secret.
        new_scope: The new scope of the secret.
        add_or_update_values: The values to add or update.
        remove_values: The values to remove.

    Returns:
        The updated secret.

    Raises:
        KeyError: If trying to remove a value that doesn't exist.
        ValueError: If a key is provided in both add_or_update_values and
            remove_values.
    """
    secret = self.get_secret(
        name_id_or_prefix=name_id_or_prefix,
        scope=scope,
        # Don't allow partial name matches, but allow partial ID matches
        allow_partial_name_match=False,
        allow_partial_id_match=True,
        hydrate=True,
    )

    secret_update = SecretUpdate(name=new_name or secret.name)

    if new_scope:
        secret_update.scope = new_scope
    values: Dict[str, Optional[SecretStr]] = {}
    if add_or_update_values:
        values.update(
            {
                key: SecretStr(value)
                for key, value in add_or_update_values.items()
            }
        )
    if remove_values:
        for key in remove_values:
            if key not in secret.values:
                raise KeyError(
                    f"Cannot remove value '{key}' from secret "
                    f"'{secret.name}' because it does not exist."
                )
            if key in values:
                raise ValueError(
                    f"Key '{key}' is supplied both in the values to add or "
                    f"update and the values to be removed."
                )
            values[key] = None
    if values:
        secret_update.values = values

    return Client().zen_store.update_secret(
        secret_id=secret.id, secret_update=secret_update
    )
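
A minimal sketch of rotating one value inside a secret and dropping another; the secret name and keys are hypothetical, and a key listed in remove_values must already exist in the secret.

```python
from zenml.client import Client

secret = Client().update_secret(
    name_id_or_prefix="db_credentials",  # hypothetical secret name
    add_or_update_values={"password": "new-password"},
    remove_values=["legacy_token"],      # hypothetical key to delete
)
print(secret.name)
```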

update_server_settings(updated_name=None, updated_logo_url=None, updated_enable_analytics=None, updated_enable_announcements=None, updated_enable_updates=None, updated_onboarding_state=None)

Update the server settings.

Parameters:

    updated_name (Optional[str], default None): Updated name for the server.
    updated_logo_url (Optional[str], default None): Updated logo URL for the server.
    updated_enable_analytics (Optional[bool], default None): Updated value for whether to enable analytics for the server.
    updated_enable_announcements (Optional[bool], default None): Updated value for whether to display announcements about ZenML.
    updated_enable_updates (Optional[bool], default None): Updated value for whether to display updates about ZenML.
    updated_onboarding_state (Optional[Dict[str, Any]], default None): Updated onboarding state for the server.

Returns:

    ServerSettingsResponse: The updated server settings.

Source code in src/zenml/client.py
def update_server_settings(
    self,
    updated_name: Optional[str] = None,
    updated_logo_url: Optional[str] = None,
    updated_enable_analytics: Optional[bool] = None,
    updated_enable_announcements: Optional[bool] = None,
    updated_enable_updates: Optional[bool] = None,
    updated_onboarding_state: Optional[Dict[str, Any]] = None,
) -> ServerSettingsResponse:
    """Update the server settings.

    Args:
        updated_name: Updated name for the server.
        updated_logo_url: Updated logo URL for the server.
        updated_enable_analytics: Updated value whether to enable
            analytics for the server.
        updated_enable_announcements: Updated value whether to display
            announcements about ZenML.
        updated_enable_updates: Updated value whether to display updates
            about ZenML.
        updated_onboarding_state: Updated onboarding state for the server.

    Returns:
        The updated server settings.
    """
    update_model = ServerSettingsUpdate(
        server_name=updated_name,
        logo_url=updated_logo_url,
        enable_analytics=updated_enable_analytics,
        display_announcements=updated_enable_announcements,
        display_updates=updated_enable_updates,
        onboarding_state=updated_onboarding_state,
    )
    return self.zen_store.update_server_settings(update_model)

update_service(id, name=None, service_source=None, admin_state=None, status=None, endpoint=None, labels=None, prediction_url=None, health_check_url=None, model_version_id=None)

Update a service.

Parameters:

    id (UUID, required): The ID of the service to update.
    name (Optional[str], default None): The new name of the service.
    admin_state (Optional[ServiceState], default None): The new admin state of the service.
    status (Optional[Dict[str, Any]], default None): The new status of the service.
    endpoint (Optional[Dict[str, Any]], default None): The new endpoint of the service.
    service_source (Optional[str], default None): The new service source of the service.
    labels (Optional[Dict[str, str]], default None): The new labels of the service.
    prediction_url (Optional[str], default None): The new prediction URL of the service.
    health_check_url (Optional[str], default None): The new health check URL of the service.
    model_version_id (Optional[UUID], default None): The new model version ID of the service.

Returns:

    ServiceResponse: The updated service.

Source code in src/zenml/client.py
def update_service(
    self,
    id: UUID,
    name: Optional[str] = None,
    service_source: Optional[str] = None,
    admin_state: Optional[ServiceState] = None,
    status: Optional[Dict[str, Any]] = None,
    endpoint: Optional[Dict[str, Any]] = None,
    labels: Optional[Dict[str, str]] = None,
    prediction_url: Optional[str] = None,
    health_check_url: Optional[str] = None,
    model_version_id: Optional[UUID] = None,
) -> ServiceResponse:
    """Update a service.

    Args:
        id: The ID of the service to update.
        name: The new name of the service.
        admin_state: The new admin state of the service.
        status: The new status of the service.
        endpoint: The new endpoint of the service.
        service_source: The new service source of the service.
        labels: The new labels of the service.
        prediction_url: The new prediction url of the service.
        health_check_url: The new health check url of the service.
        model_version_id: The new model version id of the service.

    Returns:
        The updated service.
    """
    service_update = ServiceUpdate()
    if name:
        service_update.name = name
    if service_source:
        service_update.service_source = service_source
    if admin_state:
        service_update.admin_state = admin_state
    if status:
        service_update.status = status
    if endpoint:
        service_update.endpoint = endpoint
    if labels:
        service_update.labels = labels
    if prediction_url:
        service_update.prediction_url = prediction_url
    if health_check_url:
        service_update.health_check_url = health_check_url
    if model_version_id:
        service_update.model_version_id = model_version_id
    return self.zen_store.update_service(
        service_id=id, update=service_update
    )

update_service_account(name_id_or_prefix, updated_name=None, description=None, active=None)

Update a service account.

Parameters:

    name_id_or_prefix (Union[str, UUID], required): The name or ID of the service account to update.
    updated_name (Optional[str], default None): The new name of the service account.
    description (Optional[str], default None): The new description of the service account.
    active (Optional[bool], default None): The new active status of the service account.

Returns:

    ServiceAccountResponse: The updated service account.

Source code in src/zenml/client.py
def update_service_account(
    self,
    name_id_or_prefix: Union[str, UUID],
    updated_name: Optional[str] = None,
    description: Optional[str] = None,
    active: Optional[bool] = None,
) -> ServiceAccountResponse:
    """Update a service account.

    Args:
        name_id_or_prefix: The name or ID of the service account to update.
        updated_name: The new name of the service account.
        description: The new description of the service account.
        active: The new active status of the service account.

    Returns:
        The updated service account.
    """
    service_account = self.get_service_account(
        name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
    )
    service_account_update = ServiceAccountUpdate(
        name=updated_name,
        description=description,
        active=active,
    )

    return self.zen_store.update_service_account(
        service_account_name_or_id=service_account.id,
        service_account_update=service_account_update,
    )

update_service_connector(name_id_or_prefix, name=None, auth_method=None, resource_type=None, configuration=None, resource_id=None, description=None, expires_at=None, expires_skew_tolerance=None, expiration_seconds=None, labels=None, verify=True, list_resources=True, update=True)

Validate and/or register an updated service connector.

If the resource_type, resource_id and expiration_seconds parameters are set to their "empty" values (empty string for resource type and resource ID, 0 for expiration seconds), the existing values will be removed from the service connector. Setting them to None or omitting them will not affect the existing values.

If supplied, the configuration parameter is a full replacement of the existing configuration rather than a partial update.

Labels can be updated or removed by setting the label value to None.

Parameters:

    name_id_or_prefix (Union[UUID, str], required): The name, id or prefix of the service connector to update.
    name (Optional[str], default None): The new name of the service connector.
    auth_method (Optional[str], default None): The new authentication method of the service connector.
    resource_type (Optional[str], default None): The new resource type for the service connector. If set to the empty string, the existing resource type will be removed.
    configuration (Optional[Dict[str, str]], default None): The new configuration of the service connector. If set, this needs to be a full replacement of the existing configuration rather than a partial update.
    resource_id (Optional[str], default None): The new resource id of the service connector. If set to the empty string, the existing resource ID will be removed.
    description (Optional[str], default None): The description of the service connector.
    expires_at (Optional[datetime], default None): The new UTC expiration time of the service connector.
    expires_skew_tolerance (Optional[int], default None): The allowed expiration skew for the service connector credentials.
    expiration_seconds (Optional[int], default None): The expiration time of the service connector. If set to 0, the existing expiration time will be removed.
    labels (Optional[Dict[str, Optional[str]]], default None): The labels to update or remove. If a label value is set to None, the label will be removed.
    verify (bool, default True): Whether to verify that the service connector configuration and credentials can be used to gain access to the resource.
    list_resources (bool, default True): Whether to also list the resources that the service connector can give access to (if verify is True).
    update (bool, default True): Whether to update the service connector or not.

Returns:

    A tuple of:
    - Optional[Union[ServiceConnectorResponse, ServiceConnectorUpdate]]: The model of the registered service connector.
    - Optional[ServiceConnectorResourcesModel]: The resources that the service connector can give access to (if verify is True).

Raises:

    AuthorizationException: If the service connector verification fails due to invalid credentials or insufficient permissions.

Source code in src/zenml/client.py
def update_service_connector(
    self,
    name_id_or_prefix: Union[UUID, str],
    name: Optional[str] = None,
    auth_method: Optional[str] = None,
    resource_type: Optional[str] = None,
    configuration: Optional[Dict[str, str]] = None,
    resource_id: Optional[str] = None,
    description: Optional[str] = None,
    expires_at: Optional[datetime] = None,
    expires_skew_tolerance: Optional[int] = None,
    expiration_seconds: Optional[int] = None,
    labels: Optional[Dict[str, Optional[str]]] = None,
    verify: bool = True,
    list_resources: bool = True,
    update: bool = True,
) -> Tuple[
    Optional[
        Union[
            ServiceConnectorResponse,
            ServiceConnectorUpdate,
        ]
    ],
    Optional[ServiceConnectorResourcesModel],
]:
    """Validate and/or register an updated service connector.

    If the `resource_type`, `resource_id` and `expiration_seconds`
    parameters are set to their "empty" values (empty string for resource
    type and resource ID, 0 for expiration seconds), the existing values
    will be removed from the service connector. Setting them to None or
    omitting them will not affect the existing values.

    If supplied, the `configuration` parameter is a full replacement of the
    existing configuration rather than a partial update.

    Labels can be updated or removed by setting the label value to None.

    Args:
        name_id_or_prefix: The name, id or prefix of the service connector
            to update.
        name: The new name of the service connector.
        auth_method: The new authentication method of the service connector.
        resource_type: The new resource type for the service connector.
            If set to the empty string, the existing resource type will be
            removed.
        configuration: The new configuration of the service connector. If
            set, this needs to be a full replacement of the existing
            configuration rather than a partial update.
        resource_id: The new resource id of the service connector.
            If set to the empty string, the existing resource ID will be
            removed.
        description: The description of the service connector.
        expires_at: The new UTC expiration time of the service connector.
        expires_skew_tolerance: The allowed expiration skew for the service
            connector credentials.
        expiration_seconds: The expiration time of the service connector.
            If set to 0, the existing expiration time will be removed.
        labels: The labels to update or remove. If a label value
            is set to None, the label will be removed.
        verify: Whether to verify that the service connector configuration
            and credentials can be used to gain access to the resource.
        list_resources: Whether to also list the resources that the service
            connector can give access to (if verify is True).
        update: Whether to update the service connector or not.

    Returns:
        The model of the registered service connector and the resources
        that the service connector can give access to (if verify is True).

    Raises:
        AuthorizationException: If the service connector verification
            fails due to invalid credentials or insufficient permissions.
    """
    from zenml.service_connectors.service_connector_registry import (
        service_connector_registry,
    )

    connector_model = self.get_service_connector(
        name_id_or_prefix,
        allow_name_prefix_match=False,
        load_secrets=True,
    )

    connector_instance: Optional[ServiceConnector] = None
    connector_resources: Optional[ServiceConnectorResourcesModel] = None

    if isinstance(connector_model.connector_type, str):
        connector = self.get_service_connector_type(
            connector_model.connector_type
        )
    else:
        connector = connector_model.connector_type

    resource_types: Optional[Union[str, List[str]]] = None
    if resource_type == "":
        resource_types = None
    elif resource_type is None:
        resource_types = connector_model.resource_types
    else:
        resource_types = resource_type

    if not resource_type and len(connector.resource_types) == 1:
        resource_types = connector.resource_types[0].resource_type

    if resource_id == "":
        resource_id = None
    elif resource_id is None:
        resource_id = connector_model.resource_id

    if expiration_seconds == 0:
        expiration_seconds = None
    elif expiration_seconds is None:
        expiration_seconds = connector_model.expiration_seconds

    connector_update = ServiceConnectorUpdate(
        name=name or connector_model.name,
        connector_type=connector.connector_type,
        description=description or connector_model.description,
        auth_method=auth_method or connector_model.auth_method,
        expires_at=expires_at,
        expires_skew_tolerance=expires_skew_tolerance,
        expiration_seconds=expiration_seconds,
    )

    # Validate and configure the resources
    if configuration is not None:
        # The supplied configuration is a drop-in replacement for the
        # existing configuration and secrets
        connector_update.validate_and_configure_resources(
            connector_type=connector,
            resource_types=resource_types,
            resource_id=resource_id,
            configuration=configuration,
        )
    else:
        connector_update.validate_and_configure_resources(
            connector_type=connector,
            resource_types=resource_types,
            resource_id=resource_id,
            configuration=connector_model.configuration,
            secrets=connector_model.secrets,
        )

    # Add the labels
    if labels is not None:
        # Apply the new label values, but don't keep any labels that
        # have been set to None in the update
        connector_update.labels = {
            **{
                label: value
                for label, value in connector_model.labels.items()
                if label not in labels
            },
            **{
                label: value
                for label, value in labels.items()
                if value is not None
            },
        }
    else:
        connector_update.labels = connector_model.labels

    if verify:
        # Prefer to verify the connector config server-side if the
        # implementation is available there, because it ensures
        # that the connector can be shared with other users or used
        # from other machines and because some auth methods rely on the
        # server-side authentication environment

        # Convert the update model to a request model for validation
        connector_request_dict = connector_update.model_dump()
        connector_request_dict.update(
            user=self.active_user.id,
            workspace=self.active_workspace.id,
        )
        connector_request = ServiceConnectorRequest.model_validate(
            connector_request_dict
        )

        if connector.remote:
            connector_resources = (
                self.zen_store.verify_service_connector_config(
                    service_connector=connector_request,
                    list_resources=list_resources,
                )
            )
        else:
            connector_instance = (
                service_connector_registry.instantiate_connector(
                    model=connector_request,
                )
            )
            connector_resources = connector_instance.verify(
                list_resources=list_resources
            )

        if connector_resources.error:
            raise AuthorizationException(connector_resources.error)

        # For resource types that don't support multi-instances, it's
        # better to save the default resource ID in the connector, if
        # available. Otherwise, we'll need to instantiate the connector
        # again to get the default resource ID.
        connector_update.resource_id = (
            connector_update.resource_id
            or connector_resources.get_default_resource_id()
        )

    if not update:
        return connector_update, connector_resources

    # Update the model
    connector_response = self.zen_store.update_service_connector(
        service_connector_id=connector_model.id,
        update=connector_update,
    )

    if connector_resources:
        connector_resources.id = connector_response.id
        connector_resources.name = connector_response.name
        connector_resources.connector_type = (
            connector_response.connector_type
        )

    return connector_response, connector_resources
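
A minimal sketch of a dry run: verify an updated configuration without persisting it by passing update=False, then apply it for real in a second call. The connector name and configuration keys below are hypothetical placeholders.

```python
from zenml.client import Client

client = Client()

# Dry run: validate and verify the new configuration without saving it.
update_model, resources = client.update_service_connector(
    name_id_or_prefix="aws-connector",  # hypothetical connector name
    configuration={                     # hypothetical full replacement config
        "aws_access_key_id": "...",
        "aws_secret_access_key": "...",
    },
    verify=True,
    update=False,
)
print(resources)

# Apply the same update once the verification looks good.
connector, _ = client.update_service_connector(
    name_id_or_prefix="aws-connector",
    configuration={
        "aws_access_key_id": "...",
        "aws_secret_access_key": "...",
    },
)
```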

update_stack(name_id_or_prefix=None, name=None, stack_spec_file=None, labels=None, description=None, component_updates=None)

Updates a stack and its components.

Parameters:

    name_id_or_prefix (Optional[Union[UUID, str]], default None): The name, id or prefix of the stack to update.
    name (Optional[str], default None): The new name of the stack.
    stack_spec_file (Optional[str], default None): Path to the stack spec file.
    labels (Optional[Dict[str, Any]], default None): The new labels of the stack.
    description (Optional[str], default None): The new description of the stack.
    component_updates (Optional[Dict[StackComponentType, List[Union[UUID, str]]]], default None): Dictionary which maps stack component types to lists of new stack component names or ids.

Returns:

    StackResponse: The model of the updated stack.

Raises:

    EntityExistsError: If the stack name is already taken.

Source code in src/zenml/client.py
def update_stack(
    self,
    name_id_or_prefix: Optional[Union[UUID, str]] = None,
    name: Optional[str] = None,
    stack_spec_file: Optional[str] = None,
    labels: Optional[Dict[str, Any]] = None,
    description: Optional[str] = None,
    component_updates: Optional[
        Dict[StackComponentType, List[Union[UUID, str]]]
    ] = None,
) -> StackResponse:
    """Updates a stack and its components.

    Args:
        name_id_or_prefix: The name, id or prefix of the stack to update.
        name: the new name of the stack.
        stack_spec_file: path to the stack spec file.
        labels: The new labels of the stack.
        description: the new description of the stack.
        component_updates: dictionary which maps stack component types to
            lists of new stack component names or ids.

    Returns:
        The model of the updated stack.

    Raises:
        EntityExistsError: If the stack name is already taken.
    """
    # First, get the stack
    stack = self.get_stack(
        name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
    )

    # Create the update model
    update_model = StackUpdate(
        workspace=self.active_workspace.id,
        user=self.active_user.id,
        stack_spec_path=stack_spec_file,
    )

    if name:
        if self.list_stacks(name=name):
            raise EntityExistsError(
                "There are already existing stacks with the name "
                f"'{name}'."
            )

        update_model.name = name

    if description:
        update_model.description = description

    # Get the current components
    if component_updates:
        components_dict = stack.components.copy()

        for component_type, component_id_list in component_updates.items():
            if component_id_list is not None:
                components_dict[component_type] = [
                    self.get_stack_component(
                        name_id_or_prefix=component_id,
                        component_type=component_type,
                    )
                    for component_id in component_id_list
                ]

        update_model.components = {
            c_type: [c.id for c in c_list]
            for c_type, c_list in components_dict.items()
        }

    if labels is not None:
        existing_labels = stack.labels or {}
        existing_labels.update(labels)

        existing_labels = {
            k: v for k, v in existing_labels.items() if v is not None
        }
        update_model.labels = existing_labels

    updated_stack = self.zen_store.update_stack(
        stack_id=stack.id,
        stack_update=update_model,
    )
    if updated_stack.id == self.active_stack_model.id:
        if self._config:
            self._config.set_active_stack(updated_stack)
        else:
            GlobalConfiguration().set_active_stack(updated_stack)
    return updated_stack
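
A minimal sketch of renaming a stack and swapping in a different orchestrator component; the stack and component names are hypothetical, and the referenced component must already be registered.

```python
from zenml.client import Client
from zenml.enums import StackComponentType

stack = Client().update_stack(
    name_id_or_prefix="local-stack",  # hypothetical stack name
    name="staging-stack",
    description="Stack used for staging runs",
    component_updates={
        # Replace the orchestrator with another registered component.
        StackComponentType.ORCHESTRATOR: ["kubeflow-orchestrator"],
    },
)
print(stack.name)
```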

update_stack_component(name_id_or_prefix, component_type, name=None, configuration=None, labels=None, disconnect=None, connector_id=None, connector_resource_id=None)

Updates a stack component.

Parameters:

    name_id_or_prefix (Optional[Union[UUID, str]], required): The name, id or prefix of the stack component to update.
    component_type (StackComponentType, required): The type of the stack component to update.
    name (Optional[str], default None): The new name of the stack component.
    configuration (Optional[Dict[str, Any]], default None): The new configuration of the stack component.
    labels (Optional[Dict[str, Any]], default None): The new labels of the stack component.
    disconnect (Optional[bool], default None): Whether to disconnect the stack component from its service connector.
    connector_id (Optional[UUID], default None): The new connector id of the stack component.
    connector_resource_id (Optional[str], default None): The new connector resource id of the stack component.

Returns:

    ComponentResponse: The updated stack component.

Raises:

    EntityExistsError: If the new name is already taken.

Source code in src/zenml/client.py
def update_stack_component(
    self,
    name_id_or_prefix: Optional[Union[UUID, str]],
    component_type: StackComponentType,
    name: Optional[str] = None,
    configuration: Optional[Dict[str, Any]] = None,
    labels: Optional[Dict[str, Any]] = None,
    disconnect: Optional[bool] = None,
    connector_id: Optional[UUID] = None,
    connector_resource_id: Optional[str] = None,
) -> ComponentResponse:
    """Updates a stack component.

    Args:
        name_id_or_prefix: The name, id or prefix of the stack component to
            update.
        component_type: The type of the stack component to update.
        name: The new name of the stack component.
        configuration: The new configuration of the stack component.
        labels: The new labels of the stack component.
        disconnect: Whether to disconnect the stack component from its
            service connector.
        connector_id: The new connector id of the stack component.
        connector_resource_id: The new connector resource id of the
            stack component.

    Returns:
        The updated stack component.

    Raises:
        EntityExistsError: If the new name is already taken.
    """
    # Get the existing component model
    component = self.get_stack_component(
        name_id_or_prefix=name_id_or_prefix,
        component_type=component_type,
        allow_name_prefix_match=False,
    )

    update_model = ComponentUpdate(
        workspace=self.active_workspace.id,
        user=self.active_user.id,
    )

    if name is not None:
        existing_components = self.list_stack_components(
            name=name,
            type=component_type,
        )
        if existing_components.total > 0:
            raise EntityExistsError(
                f"There are already existing components with the "
                f"name '{name}'."
            )
        update_model.name = name

    if configuration is not None:
        existing_configuration = component.configuration
        existing_configuration.update(configuration)
        existing_configuration = {
            k: v
            for k, v in existing_configuration.items()
            if v is not None
        }

        from zenml.stack.utils import (
            validate_stack_component_config,
            warn_if_config_server_mismatch,
        )

        validated_config = validate_stack_component_config(
            configuration_dict=existing_configuration,
            flavor=component.flavor,
            component_type=component.type,
            # Always enforce validation of custom flavors
            validate_custom_flavors=True,
        )
        # Guaranteed to not be None by setting
        # `validate_custom_flavors=True` above
        assert validated_config is not None
        warn_if_config_server_mismatch(validated_config)

        update_model.configuration = existing_configuration

    if labels is not None:
        existing_labels = component.labels or {}
        existing_labels.update(labels)

        existing_labels = {
            k: v for k, v in existing_labels.items() if v is not None
        }
        update_model.labels = existing_labels

    if disconnect:
        update_model.connector = None
        update_model.connector_resource_id = None
    else:
        existing_component = self.get_stack_component(
            name_id_or_prefix=name_id_or_prefix,
            component_type=component_type,
            allow_name_prefix_match=False,
        )
        update_model.connector = connector_id
        update_model.connector_resource_id = connector_resource_id
        if connector_id is None and existing_component.connector:
            update_model.connector = existing_component.connector.id
            update_model.connector_resource_id = (
                existing_component.connector_resource_id
            )

    # Send the updated component to the ZenStore
    return self.zen_store.update_stack_component(
        component_id=component.id,
        component_update=update_model,
    )
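
Example usage (a minimal sketch, assuming an initialized ZenML client and an existing artifact store registered as "local_store"; the new name and the "path" configuration key are illustrative):

from zenml.client import Client
from zenml.enums import StackComponentType

client = Client()

# Rename the component and merge a new key into its configuration.
# Keys set to None in `configuration` are removed after the merge.
updated = client.update_stack_component(
    name_id_or_prefix="local_store",
    component_type=StackComponentType.ARTIFACT_STORE,
    name="local_store_v2",
    configuration={"path": "/tmp/zenml_artifacts"},
)
print(updated.name)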

update_tag(tag_name_or_id, tag_update_model)

Updates an existing tag.

Parameters:

Name Type Description Default
tag_name_or_id Union[str, UUID]

name or UUID of the tag to be updated.

required
tag_update_model TagUpdate

the tag to be updated.

required

Returns:

Type Description
TagResponse

The updated tag.

Source code in src/zenml/client.py (lines 7534-7550)
def update_tag(
    self,
    tag_name_or_id: Union[str, UUID],
    tag_update_model: TagUpdate,
) -> TagResponse:
    """Updates an existing tag.

    Args:
        tag_name_or_id: name or UUID of the tag to be updated.
        tag_update_model: the tag to be updated.

    Returns:
        The updated tag.
    """
    return self.zen_store.update_tag(
        tag_name_or_id=tag_name_or_id, tag_update_model=tag_update_model
    )
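
Example usage (a minimal sketch; it assumes that TagUpdate can be imported from zenml.models and that a tag named "nightly" already exists):

from zenml.client import Client
from zenml.models import TagUpdate

# Rename an existing tag; only the fields set on the update model change.
updated_tag = Client().update_tag(
    tag_name_or_id="nightly",
    tag_update_model=TagUpdate(name="nightly-build"),
)
print(updated_tag.name)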

update_trigger(name_id_or_prefix, name=None, description=None, event_filter=None, is_active=None)

Updates a trigger.

Parameters:

Name Type Description Default
name_id_or_prefix Union[UUID, str]

The name, id or prefix of the trigger to update.

required
name Optional[str]

the new name of the trigger.

None
description Optional[str]

the new description of the trigger.

None
event_filter Optional[Dict[str, Any]]

The event filter configuration.

None
is_active Optional[bool]

Whether the trigger is active or not.

None

Returns:

Type Description
TriggerResponse

The model of the updated trigger.

Raises:

Type Description
EntityExistsError

If the trigger name is already taken.

Source code in src/zenml/client.py (lines 3256-3304)
@_fail_for_sql_zen_store
def update_trigger(
    self,
    name_id_or_prefix: Union[UUID, str],
    name: Optional[str] = None,
    description: Optional[str] = None,
    event_filter: Optional[Dict[str, Any]] = None,
    is_active: Optional[bool] = None,
) -> TriggerResponse:
    """Updates a trigger.

    Args:
        name_id_or_prefix: The name, id or prefix of the trigger to update.
        name: the new name of the trigger.
        description: the new description of the trigger.
        event_filter: The event filter configuration.
        is_active: Whether the trigger is active or not.

    Returns:
        The model of the updated trigger.

    Raises:
        EntityExistsError: If the trigger name is already taken.
    """
    # First, get the existing trigger
    trigger = self.get_trigger(
        name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
    )

    # Create the update model
    update_model = TriggerUpdate(
        name=name,
        description=description,
        event_filter=event_filter,
        is_active=is_active,
    )

    if name:
        if self.list_triggers(name=name):
            raise EntityExistsError(
                "There are already is an existing trigger with the name "
                f"'{name}'."
            )

    updated_trigger = self.zen_store.update_trigger(
        trigger_id=trigger.id,
        trigger_update=update_model,
    )
    return updated_trigger
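
Example usage (a minimal sketch; the trigger name is illustrative, and the @_fail_for_sql_zen_store decorator above means this call requires a deployed ZenML server rather than the default local store):

from zenml.client import Client

# Deactivate an existing trigger and give it a clearer name.
Client().update_trigger(
    name_id_or_prefix="on-new-data",
    name="on-new-data-disabled",
    is_active=False,
)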

update_user(name_id_or_prefix, updated_name=None, updated_full_name=None, updated_email=None, updated_email_opt_in=None, updated_password=None, old_password=None, updated_is_admin=None, updated_metadata=None, active=None)

Update a user.

Parameters:

Name Type Description Default
name_id_or_prefix Union[str, UUID]

The name or ID of the user to update.

required
updated_name Optional[str]

The new name of the user.

None
updated_full_name Optional[str]

The new full name of the user.

None
updated_email Optional[str]

The new email of the user.

None
updated_email_opt_in Optional[bool]

The new email opt-in status of the user.

None
updated_password Optional[str]

The new password of the user.

None
old_password Optional[str]

The old password of the user. Required for password update.

None
updated_is_admin Optional[bool]

Whether the user should be an admin.

None
updated_metadata Optional[Dict[str, Any]]

The new metadata for the user.

None
active Optional[bool]

Use to activate or deactivate the user.

None

Returns:

Type Description
UserResponse

The updated user.

Raises:

Type Description
ValidationError

If the old password is not provided when updating the password.

Source code in src/zenml/client.py (lines 864-929)
def update_user(
    self,
    name_id_or_prefix: Union[str, UUID],
    updated_name: Optional[str] = None,
    updated_full_name: Optional[str] = None,
    updated_email: Optional[str] = None,
    updated_email_opt_in: Optional[bool] = None,
    updated_password: Optional[str] = None,
    old_password: Optional[str] = None,
    updated_is_admin: Optional[bool] = None,
    updated_metadata: Optional[Dict[str, Any]] = None,
    active: Optional[bool] = None,
) -> UserResponse:
    """Update a user.

    Args:
        name_id_or_prefix: The name or ID of the user to update.
        updated_name: The new name of the user.
        updated_full_name: The new full name of the user.
        updated_email: The new email of the user.
        updated_email_opt_in: The new email opt-in status of the user.
        updated_password: The new password of the user.
        old_password: The old password of the user. Required for password
            update.
        updated_is_admin: Whether the user should be an admin.
        updated_metadata: The new metadata for the user.
        active: Use to activate or deactivate the user.

    Returns:
        The updated user.

    Raises:
        ValidationError: If the old password is not provided when updating
            the password.
    """
    user = self.get_user(
        name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
    )
    user_update = UserUpdate(name=updated_name or user.name)
    if updated_full_name:
        user_update.full_name = updated_full_name
    if updated_email is not None:
        user_update.email = updated_email
        user_update.email_opted_in = (
            updated_email_opt_in or user.email_opted_in
        )
    if updated_email_opt_in is not None:
        user_update.email_opted_in = updated_email_opt_in
    if updated_password is not None:
        user_update.password = updated_password
        if old_password is None:
            raise ValidationError(
                "Old password is required to update the password."
            )
        user_update.old_password = old_password
    if updated_is_admin is not None:
        user_update.is_admin = updated_is_admin
    if active is not None:
        user_update.active = active

    if updated_metadata is not None:
        user_update.user_metadata = updated_metadata

    return self.zen_store.update_user(
        user_id=user.id, user_update=user_update
    )
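
Example usage (a minimal sketch with purely illustrative user name and passwords):

from zenml.client import Client

client = Client()

# Update the display name and rotate the password of an existing user.
# The old password must be supplied alongside the new one.
client.update_user(
    name_id_or_prefix="alice",
    updated_full_name="Alice Example",
    updated_password="new-secret",
    old_password="old-secret",
)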

update_workspace(name_id_or_prefix, new_name=None, new_description=None)

Update a workspace.

Parameters:

Name Type Description Default
name_id_or_prefix Optional[Union[UUID, str]]

Name, ID or prefix of the workspace to update.

required
new_name Optional[str]

New name of the workspace.

None
new_description Optional[str]

New description of the workspace.

None

Returns:

Type Description
WorkspaceResponse

The updated workspace.

Source code in src/zenml/client.py (lines 1055-1080)
def update_workspace(
    self,
    name_id_or_prefix: Optional[Union[UUID, str]],
    new_name: Optional[str] = None,
    new_description: Optional[str] = None,
) -> WorkspaceResponse:
    """Update a workspace.

    Args:
        name_id_or_prefix: Name, ID or prefix of the workspace to update.
        new_name: New name of the workspace.
        new_description: New description of the workspace.

    Returns:
        The updated workspace.
    """
    workspace = self.get_workspace(
        name_id_or_prefix=name_id_or_prefix, allow_name_prefix_match=False
    )
    workspace_update = WorkspaceUpdate(name=new_name or workspace.name)
    if new_description:
        workspace_update.description = new_description
    return self.zen_store.update_workspace(
        workspace_id=workspace.id,
        workspace_update=workspace_update,
    )
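
Example usage (a minimal sketch; the workspace names and description are illustrative):

from zenml.client import Client

# Rename a workspace and give it a description.
Client().update_workspace(
    name_id_or_prefix="team_a",
    new_name="research",
    new_description="Workspace for research pipelines.",
)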

verify_service_connector(name_id_or_prefix, resource_type=None, resource_id=None, list_resources=True)

Verifies if a service connector has access to one or more resources.

Parameters:

Name Type Description Default
name_id_or_prefix Union[UUID, str]

The name, id or prefix of the service connector to verify.

required
resource_type Optional[str]

The type of the resource for which to verify access. If not provided, the resource type from the service connector configuration will be used.

None
resource_id Optional[str]

The ID of the resource for which to verify access. If not provided, the resource ID from the service connector configuration will be used.

None
list_resources bool

Whether to list the resources that the service connector has access to.

True

Returns:

Type Description
ServiceConnectorResourcesModel

The list of resources that the service connector has access to, scoped to the supplied resource type and ID, if provided.

Raises:

Type Description
AuthorizationException

If the service connector does not have access to the resources.

Source code in src/zenml/client.py (lines 5785-5855)
def verify_service_connector(
    self,
    name_id_or_prefix: Union[UUID, str],
    resource_type: Optional[str] = None,
    resource_id: Optional[str] = None,
    list_resources: bool = True,
) -> "ServiceConnectorResourcesModel":
    """Verifies if a service connector has access to one or more resources.

    Args:
        name_id_or_prefix: The name, id or prefix of the service connector
            to verify.
        resource_type: The type of the resource for which to verify access.
            If not provided, the resource type from the service connector
            configuration will be used.
        resource_id: The ID of the resource for which to verify access. If
            not provided, the resource ID from the service connector
            configuration will be used.
        list_resources: Whether to list the resources that the service
            connector has access to.

    Returns:
        The list of resources that the service connector has access to,
        scoped to the supplied resource type and ID, if provided.

    Raises:
        AuthorizationException: If the service connector does not have
            access to the resources.
    """
    from zenml.service_connectors.service_connector_registry import (
        service_connector_registry,
    )

    # Get the service connector model
    service_connector = self.get_service_connector(
        name_id_or_prefix=name_id_or_prefix,
        allow_name_prefix_match=False,
    )

    connector_type = self.get_service_connector_type(
        service_connector.type
    )

    # Prefer to verify the connector config server-side if the
    # implementation is available there, because it ensures
    # that the connector can be shared with other users or used
    # from other machines and because some auth methods rely on the
    # server-side authentication environment
    if connector_type.remote:
        connector_resources = self.zen_store.verify_service_connector(
            service_connector_id=service_connector.id,
            resource_type=resource_type,
            resource_id=resource_id,
            list_resources=list_resources,
        )
    else:
        connector_instance = (
            service_connector_registry.instantiate_connector(
                model=service_connector
            )
        )
        connector_resources = connector_instance.verify(
            resource_type=resource_type,
            resource_id=resource_id,
            list_resources=list_resources,
        )

    if connector_resources.error:
        raise AuthorizationException(connector_resources.error)

    return connector_resources
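
Example usage (a minimal sketch; the connector name "aws-dev" and the "s3-bucket" resource type are illustrative and depend on the connectors registered in your deployment):

from zenml.client import Client
from zenml.exceptions import AuthorizationException

try:
    # Check whether the connector can reach S3 buckets and list what it finds.
    resources = Client().verify_service_connector(
        name_id_or_prefix="aws-dev",
        resource_type="s3-bucket",
        list_resources=True,
    )
    print(resources)
except AuthorizationException as e:
    print(f"Verification failed: {e}")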

CodeRepositoryFilter

Bases: WorkspaceScopedFilter

Model to enable advanced filtering of all code repositories.

Source code in src/zenml/models/v2/core/code_repository.py (lines 184-190)
class CodeRepositoryFilter(WorkspaceScopedFilter):
    """Model to enable advanced filtering of all code repositories."""

    name: Optional[str] = Field(
        description="Name of the code repository.",
        default=None,
    )

ColorVariants

Bases: StrEnum

All possible color variants for frontend.

Source code in src/zenml/enums.py (lines 336-349)
class ColorVariants(StrEnum):
    """All possible color variants for frontend."""

    GREY = "grey"
    PURPLE = "purple"
    RED = "red"
    GREEN = "green"
    YELLOW = "yellow"
    ORANGE = "orange"
    LIME = "lime"
    TEAL = "teal"
    TURQUOISE = "turquoise"
    MAGENTA = "magenta"
    BLUE = "blue"

ComponentFilter

Bases: WorkspaceScopedFilter

Model to enable advanced filtering of all ComponentModels.

The Component Model needs additional scoping. As such the _scope_user field can be set to the user that is doing the filtering. The generate_filter() method of the baseclass is overwritten to include the scoping.

Source code in src/zenml/models/v2/core/component.py (lines 349-438)
class ComponentFilter(WorkspaceScopedFilter):
    """Model to enable advanced filtering of all ComponentModels.

    The Component Model needs additional scoping. As such the `_scope_user`
    field can be set to the user that is doing the filtering. The
    `generate_filter()` method of the baseclass is overwritten to include the
    scoping.
    """

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *WorkspaceScopedFilter.FILTER_EXCLUDE_FIELDS,
        "scope_type",
        "stack_id",
    ]
    CLI_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *WorkspaceScopedFilter.CLI_EXCLUDE_FIELDS,
        "scope_type",
    ]
    scope_type: Optional[str] = Field(
        default=None,
        description="The type to scope this query to.",
    )
    name: Optional[str] = Field(
        default=None,
        description="Name of the stack component",
    )
    flavor: Optional[str] = Field(
        default=None,
        description="Flavor of the stack component",
    )
    type: Optional[str] = Field(
        default=None,
        description="Type of the stack component",
    )
    connector_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Connector linked to the stack component",
        union_mode="left_to_right",
    )
    stack_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Stack of the stack component",
        union_mode="left_to_right",
    )

    def set_scope_type(self, component_type: str) -> None:
        """Set the type of component on which to perform the filtering to scope the response.

        Args:
            component_type: The type of component to scope the query to.
        """
        self.scope_type = component_type

    def generate_filter(
        self, table: Type["AnySchema"]
    ) -> Union["ColumnElement[bool]"]:
        """Generate the filter for the query.

        Stack components can be scoped by type to narrow the search.

        Args:
            table: The Table that is being queried from.

        Returns:
            The filter expression for the query.
        """
        from sqlmodel import and_, or_

        from zenml.zen_stores.schemas import (
            StackComponentSchema,
            StackCompositionSchema,
        )

        base_filter = super().generate_filter(table)
        if self.scope_type:
            type_filter = getattr(table, "type") == self.scope_type
            return and_(base_filter, type_filter)

        if self.stack_id:
            operator = (
                or_ if self.logical_operator == LogicalOperators.OR else and_
            )

            stack_filter = and_(
                StackCompositionSchema.stack_id == self.stack_id,
                StackCompositionSchema.component_id == StackComponentSchema.id,
            )
            base_filter = operator(base_filter, stack_filter)

        return base_filter
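
A sketch of how this filter model can be used (the "contains:" filter operator and the store-level listing call shown in the comment are assumptions based on the general ZenML filter API, not something defined in this class):

from zenml.models import ComponentFilter

# Match only orchestrators whose name contains "kube"; set_scope_type()
# narrows the filter produced by generate_filter() to that component type.
component_filter = ComponentFilter(name="contains:kube")
component_filter.set_scope_type(component_type="orchestrator")

# The populated filter model is then passed to a listing call, e.g.
# Client().zen_store.list_stack_components(component_filter_model=component_filter)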

generate_filter(table)

Generate the filter for the query.

Stack components can be scoped by type to narrow the search.

Parameters:

Name Type Description Default
table Type[AnySchema]

The Table that is being queried from.

required

Returns:

Type Description
Union[ColumnElement[bool]]

The filter expression for the query.

Source code in src/zenml/models/v2/core/component.py (lines 402-438)
def generate_filter(
    self, table: Type["AnySchema"]
) -> Union["ColumnElement[bool]"]:
    """Generate the filter for the query.

    Stack components can be scoped by type to narrow the search.

    Args:
        table: The Table that is being queried from.

    Returns:
        The filter expression for the query.
    """
    from sqlmodel import and_, or_

    from zenml.zen_stores.schemas import (
        StackComponentSchema,
        StackCompositionSchema,
    )

    base_filter = super().generate_filter(table)
    if self.scope_type:
        type_filter = getattr(table, "type") == self.scope_type
        return and_(base_filter, type_filter)

    if self.stack_id:
        operator = (
            or_ if self.logical_operator == LogicalOperators.OR else and_
        )

        stack_filter = and_(
            StackCompositionSchema.stack_id == self.stack_id,
            StackCompositionSchema.component_id == StackComponentSchema.id,
        )
        base_filter = operator(base_filter, stack_filter)

    return base_filter

set_scope_type(component_type)

Set the type of component on which to perform the filtering to scope the response.

Parameters:

Name Type Description Default
component_type str

The type of component to scope the query to.

required
Source code in src/zenml/models/v2/core/component.py (lines 394-400)
def set_scope_type(self, component_type: str) -> None:
    """Set the type of component on which to perform the filtering to scope the response.

    Args:
        component_type: The type of component to scope the query to.
    """
    self.scope_type = component_type

ComponentInfo

Bases: BaseModel

Information about each stack component when creating a full stack.

Source code in src/zenml/models/v2/misc/info_models.py (lines 34-46)
class ComponentInfo(BaseModel):
    """Information about each stack components when creating a full stack."""

    flavor: str
    service_connector_index: Optional[int] = Field(
        default=None,
        title="The id of the service connector from the list "
        "`service_connectors`.",
        description="The id of the service connector from the list "
        "`service_connectors` from `FullStackRequest`.",
    )
    service_connector_resource_id: Optional[str] = None
    configuration: Dict[str, Any] = {}

CredentialsNotValid

Bases: AuthorizationException

Raised when the credentials provided are invalid.

This is a subclass of AuthorizationException and should only be raised when the authentication credentials are invalid (e.g. expired API token, invalid username/password, invalid signature). If caught by the ZenML client, it will trigger an invalidation of the currently cached API token and a re-authentication flow.

Source code in src/zenml/exceptions.py (lines 50-58)
class CredentialsNotValid(AuthorizationException):
    """Raised when the credentials provided are invalid.

    This is a subclass of AuthorizationException and should only be raised when
    the authentication credentials are invalid (e.g. expired API token, invalid
    username/password, invalid signature). If caught by the ZenML client, it
    will trigger an invalidation of the currently cached API token and a
    re-authentication flow.
    """

DatabaseBackupStrategy

Bases: StrEnum

All available database backup strategies.

Source code in src/zenml/enums.py (lines 382-392)
class DatabaseBackupStrategy(StrEnum):
    """All available database backup strategies."""

    # Backup disabled
    DISABLED = "disabled"
    # In-memory backup
    IN_MEMORY = "in-memory"
    # Dump the database to a file
    DUMP_FILE = "dump-file"
    # Create a backup of the database in the remote database service
    DATABASE = "database"

EntityExistsError

Bases: ZenMLBaseException

Raised when trying to register an entity that already exists.

Source code in src/zenml/exceptions.py (lines 167-168)
class EntityExistsError(ZenMLBaseException):
    """Raised when trying to register an entity that already exists."""

Environment

Provides environment information.

Individual environment components can be registered separately to extend the global Environment object with additional information (see BaseEnvironmentComponent).

Source code in src/zenml/environment.py (lines 113-368)
class Environment(metaclass=SingletonMetaClass):
    """Provides environment information.

    Individual environment components can be registered separately to extend
    the global Environment object with additional information (see
    `BaseEnvironmentComponent`).
    """

    def __init__(self) -> None:
        """Initializes an Environment instance.

        Note: Environment is a singleton class, which means this method will
        only get called once. All following `Environment()` calls will return
        the previously initialized instance.
        """

    @staticmethod
    def get_system_info() -> Dict[str, str]:
        """Information about the operating system.

        Returns:
            A dictionary containing information about the operating system.
        """
        system = platform.system()

        if system == "Windows":
            release, version, csd, ptype = platform.win32_ver()

            return {
                "os": "windows",
                "windows_version_release": release,
                "windows_version": version,
                "windows_version_service_pack": csd,
                "windows_version_os_type": ptype,
            }

        if system == "Darwin":
            return {"os": "mac", "mac_version": platform.mac_ver()[0]}

        if system == "Linux":
            return {
                "os": "linux",
                "linux_distro": distro.id(),
                "linux_distro_like": distro.like(),
                "linux_distro_version": distro.version(),
            }

        # We don't collect data for any other system.
        return {"os": "unknown"}

    @staticmethod
    def python_version() -> str:
        """Returns the python version of the running interpreter.

        Returns:
            str: the python version
        """
        return platform.python_version()

    @staticmethod
    def in_container() -> bool:
        """If the current python process is running in a container.

        Returns:
            `True` if the current python process is running in a
            container, `False` otherwise.
        """
        # TODO [ENG-167]: Make this more reliable and add test.
        return INSIDE_ZENML_CONTAINER

    @staticmethod
    def in_docker() -> bool:
        """If the current python process is running in a docker container.

        Returns:
            `True` if the current python process is running in a docker
            container, `False` otherwise.
        """
        if os.path.exists("./dockerenv") or os.path.exists("/.dockerinit"):
            return True

        try:
            with open("/proc/1/cgroup", "rt") as ifh:
                info = ifh.read()
                return "docker" in info
        except (FileNotFoundError, Exception):
            return False

    @staticmethod
    def in_kubernetes() -> bool:
        """If the current python process is running in a kubernetes pod.

        Returns:
            `True` if the current python process is running in a kubernetes
            pod, `False` otherwise.
        """
        if "KUBERNETES_SERVICE_HOST" in os.environ:
            return True

        try:
            with open("/proc/1/cgroup", "rt") as ifh:
                info = ifh.read()
                return "kubepod" in info
        except (FileNotFoundError, Exception):
            return False

    @staticmethod
    def in_google_colab() -> bool:
        """If the current Python process is running in a Google Colab.

        Returns:
            `True` if the current Python process is running in a Google Colab,
            `False` otherwise.
        """
        try:
            import google.colab  # noqa

            return True

        except ModuleNotFoundError:
            return False

    @staticmethod
    def in_notebook() -> bool:
        """If the current Python process is running in a notebook.

        Returns:
            `True` if the current Python process is running in a notebook,
            `False` otherwise.
        """
        if Environment.in_google_colab():
            return True

        try:
            ipython = get_ipython()  # type: ignore[name-defined]
        except NameError:
            return False

        if ipython.__class__.__name__ in [
            "TerminalInteractiveShell",
            "ZMQInteractiveShell",
            "DatabricksShell",
        ]:
            return True
        return False

    @staticmethod
    def in_github_codespaces() -> bool:
        """If the current Python process is running in GitHub Codespaces.

        Returns:
            `True` if the current Python process is running in GitHub Codespaces,
            `False` otherwise.
        """
        return (
            "CODESPACES" in os.environ
            or "GITHUB_CODESPACE_TOKEN" in os.environ
            or "GITHUB_CODESPACES_PORT_FORWARDING_DOMAIN" in os.environ
        )

    @staticmethod
    def in_vscode_remote_container() -> bool:
        """If the current Python process is running in a VS Code Remote Container.

        Returns:
            `True` if the current Python process is running in a VS Code Remote Container,
            `False` otherwise.
        """
        return (
            "REMOTE_CONTAINERS" in os.environ
            or "VSCODE_REMOTE_CONTAINERS_SESSION" in os.environ
        )

    @staticmethod
    def in_paperspace_gradient() -> bool:
        """If the current Python process is running in Paperspace Gradient.

        Returns:
            `True` if the current Python process is running in Paperspace
            Gradient, `False` otherwise.
        """
        return "PAPERSPACE_NOTEBOOK_REPO_ID" in os.environ

    @staticmethod
    def in_github_actions() -> bool:
        """If the current Python process is running in GitHub Actions.

        Returns:
            `True` if the current Python process is running in GitHub
            Actions, `False` otherwise.
        """
        return "GITHUB_ACTIONS" in os.environ

    @staticmethod
    def in_gitlab_ci() -> bool:
        """If the current Python process is running in GitLab CI.

        Returns:
            `True` if the current Python process is running in GitLab
            CI, `False` otherwise.
        """
        return "GITLAB_CI" in os.environ

    @staticmethod
    def in_circle_ci() -> bool:
        """If the current Python process is running in Circle CI.

        Returns:
            `True` if the current Python process is running in Circle
            CI, `False` otherwise.
        """
        return "CIRCLECI" in os.environ

    @staticmethod
    def in_bitbucket_ci() -> bool:
        """If the current Python process is running in Bitbucket CI.

        Returns:
            `True` if the current Python process is running in Bitbucket
            CI, `False` otherwise.
        """
        return "BITBUCKET_BUILD_NUMBER" in os.environ

    @staticmethod
    def in_ci() -> bool:
        """If the current Python process is running in any CI.

        Returns:
            `True` if the current Python process is running in any
            CI, `False` otherwise.
        """
        return "CI" in os.environ

    @staticmethod
    def in_wsl() -> bool:
        """If the current process is running in Windows Subsystem for Linux.

        source: https://www.scivision.dev/python-detect-wsl/

        Returns:
            `True` if the current process is running in WSL, `False` otherwise.
        """
        return "microsoft-standard" in platform.uname().release

    @staticmethod
    def in_lightning_ai_studio() -> bool:
        """If the current Python process is running in Lightning.ai studios.

        Returns:
            `True` if the current Python process is running in Lightning.ai studios,
            `False` otherwise.
        """
        return (
            "LIGHTNING_CLOUD_URL" in os.environ
            and "LIGHTNING_CLOUDSPACE_HOST" in os.environ
        )
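
Example usage (a minimal sketch; all calls below are static methods shown in the source above, and Environment() always returns the same singleton instance):

from zenml.environment import Environment

env = Environment()  # repeated calls return the previously initialized instance

print(Environment.get_system_info())   # e.g. {"os": "linux", ...}
print(Environment.python_version())
print(Environment.in_docker(), Environment.in_ci(), Environment.in_notebook())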

__init__()

Initializes an Environment instance.

Note: Environment is a singleton class, which means this method will only get called once. All following Environment() calls will return the previously initialized instance.

Source code in src/zenml/environment.py (lines 121-127)
def __init__(self) -> None:
    """Initializes an Environment instance.

    Note: Environment is a singleton class, which means this method will
    only get called once. All following `Environment()` calls will return
    the previously initialized instance.
    """

get_system_info() staticmethod

Information about the operating system.

Returns:

Type Description
Dict[str, str]

A dictionary containing information about the operating system.

Source code in src/zenml/environment.py (lines 129-161)
@staticmethod
def get_system_info() -> Dict[str, str]:
    """Information about the operating system.

    Returns:
        A dictionary containing information about the operating system.
    """
    system = platform.system()

    if system == "Windows":
        release, version, csd, ptype = platform.win32_ver()

        return {
            "os": "windows",
            "windows_version_release": release,
            "windows_version": version,
            "windows_version_service_pack": csd,
            "windows_version_os_type": ptype,
        }

    if system == "Darwin":
        return {"os": "mac", "mac_version": platform.mac_ver()[0]}

    if system == "Linux":
        return {
            "os": "linux",
            "linux_distro": distro.id(),
            "linux_distro_like": distro.like(),
            "linux_distro_version": distro.version(),
        }

    # We don't collect data for any other system.
    return {"os": "unknown"}

in_bitbucket_ci() staticmethod

If the current Python process is running in Bitbucket CI.

Returns:

Type Description
bool

True if the current Python process is running in Bitbucket CI, False otherwise.

Source code in src/zenml/environment.py (lines 326-334)
@staticmethod
def in_bitbucket_ci() -> bool:
    """If the current Python process is running in Bitbucket CI.

    Returns:
        `True` if the current Python process is running in Bitbucket
        CI, `False` otherwise.
    """
    return "BITBUCKET_BUILD_NUMBER" in os.environ

in_ci() staticmethod

If the current Python process is running in any CI.

Returns:

Type Description
bool

True if the current Python process is running in any CI, False otherwise.

Source code in src/zenml/environment.py (lines 336-344)
@staticmethod
def in_ci() -> bool:
    """If the current Python process is running in any CI.

    Returns:
        `True` if the current Python process is running in any
        CI, `False` otherwise.
    """
    return "CI" in os.environ

in_circle_ci() staticmethod

If the current Python process is running in Circle CI.

Returns:

Type Description
bool

True if the current Python process is running in Circle CI, False otherwise.

Source code in src/zenml/environment.py (lines 316-324)
@staticmethod
def in_circle_ci() -> bool:
    """If the current Python process is running in Circle CI.

    Returns:
        `True` if the current Python process is running in Circle
        CI, `False` otherwise.
    """
    return "CIRCLECI" in os.environ

in_container() staticmethod

If the current python process is running in a container.

Returns:

Type Description
bool

True if the current python process is running in a container, False otherwise.

Source code in src/zenml/environment.py (lines 172-181)
@staticmethod
def in_container() -> bool:
    """If the current python process is running in a container.

    Returns:
        `True` if the current python process is running in a
        container, `False` otherwise.
    """
    # TODO [ENG-167]: Make this more reliable and add test.
    return INSIDE_ZENML_CONTAINER

in_docker() staticmethod

If the current python process is running in a docker container.

Returns:

Type Description
bool

True if the current python process is running in a docker container, False otherwise.

Source code in src/zenml/environment.py (lines 183-199)
@staticmethod
def in_docker() -> bool:
    """If the current python process is running in a docker container.

    Returns:
        `True` if the current python process is running in a docker
        container, `False` otherwise.
    """
    if os.path.exists("./dockerenv") or os.path.exists("/.dockerinit"):
        return True

    try:
        with open("/proc/1/cgroup", "rt") as ifh:
            info = ifh.read()
            return "docker" in info
    except (FileNotFoundError, Exception):
        return False

in_github_actions() staticmethod

If the current Python process is running in GitHub Actions.

Returns:

Type Description
bool

True if the current Python process is running in GitHub Actions, False otherwise.

Source code in src/zenml/environment.py (lines 296-304)
@staticmethod
def in_github_actions() -> bool:
    """If the current Python process is running in GitHub Actions.

    Returns:
        `True` if the current Python process is running in GitHub
        Actions, `False` otherwise.
    """
    return "GITHUB_ACTIONS" in os.environ

in_github_codespaces() staticmethod

If the current Python process is running in GitHub Codespaces.

Returns:

Type Description
bool

True if the current Python process is running in GitHub Codespaces, False otherwise.

Source code in src/zenml/environment.py (lines 259-271)
@staticmethod
def in_github_codespaces() -> bool:
    """If the current Python process is running in GitHub Codespaces.

    Returns:
        `True` if the current Python process is running in GitHub Codespaces,
        `False` otherwise.
    """
    return (
        "CODESPACES" in os.environ
        or "GITHUB_CODESPACE_TOKEN" in os.environ
        or "GITHUB_CODESPACES_PORT_FORWARDING_DOMAIN" in os.environ
    )

in_gitlab_ci() staticmethod

If the current Python process is running in GitLab CI.

Returns:

Type Description
bool

True if the current Python process is running in GitLab CI, False otherwise.

Source code in src/zenml/environment.py (lines 306-314)
@staticmethod
def in_gitlab_ci() -> bool:
    """If the current Python process is running in GitLab CI.

    Returns:
        `True` if the current Python process is running in GitLab
        CI, `False` otherwise.
    """
    return "GITLAB_CI" in os.environ

in_google_colab() staticmethod

If the current Python process is running in a Google Colab.

Returns:

Type Description
bool

True if the current Python process is running in a Google Colab, False otherwise.

Source code in src/zenml/environment.py (lines 219-233)
@staticmethod
def in_google_colab() -> bool:
    """If the current Python process is running in a Google Colab.

    Returns:
        `True` if the current Python process is running in a Google Colab,
        `False` otherwise.
    """
    try:
        import google.colab  # noqa

        return True

    except ModuleNotFoundError:
        return False

in_kubernetes() staticmethod

If the current python process is running in a kubernetes pod.

Returns:

Type Description
bool

True if the current python process is running in a kubernetes pod, False otherwise.

Source code in src/zenml/environment.py (lines 201-217)
@staticmethod
def in_kubernetes() -> bool:
    """If the current python process is running in a kubernetes pod.

    Returns:
        `True` if the current python process is running in a kubernetes
        pod, `False` otherwise.
    """
    if "KUBERNETES_SERVICE_HOST" in os.environ:
        return True

    try:
        with open("/proc/1/cgroup", "rt") as ifh:
            info = ifh.read()
            return "kubepod" in info
    except (FileNotFoundError, Exception):
        return False

in_lightning_ai_studio() staticmethod

If the current Python process is running in Lightning.ai studios.

Returns:

Type Description
bool

True if the current Python process is running in Lightning.ai studios, False otherwise.

Source code in src/zenml/environment.py (lines 357-368)
@staticmethod
def in_lightning_ai_studio() -> bool:
    """If the current Python process is running in Lightning.ai studios.

    Returns:
        `True` if the current Python process is running in Lightning.ai studios,
        `False` otherwise.
    """
    return (
        "LIGHTNING_CLOUD_URL" in os.environ
        and "LIGHTNING_CLOUDSPACE_HOST" in os.environ
    )

in_notebook() staticmethod

If the current Python process is running in a notebook.

Returns:

Type Description
bool

True if the current Python process is running in a notebook, False otherwise.

Source code in src/zenml/environment.py (lines 235-257)
@staticmethod
def in_notebook() -> bool:
    """If the current Python process is running in a notebook.

    Returns:
        `True` if the current Python process is running in a notebook,
        `False` otherwise.
    """
    if Environment.in_google_colab():
        return True

    try:
        ipython = get_ipython()  # type: ignore[name-defined]
    except NameError:
        return False

    if ipython.__class__.__name__ in [
        "TerminalInteractiveShell",
        "ZMQInteractiveShell",
        "DatabricksShell",
    ]:
        return True
    return False

in_paperspace_gradient() staticmethod

If the current Python process is running in Paperspace Gradient.

Returns:

Type Description
bool

True if the current Python process is running in Paperspace Gradient, False otherwise.

Source code in src/zenml/environment.py (lines 286-294)
@staticmethod
def in_paperspace_gradient() -> bool:
    """If the current Python process is running in Paperspace Gradient.

    Returns:
        `True` if the current Python process is running in Paperspace
        Gradient, `False` otherwise.
    """
    return "PAPERSPACE_NOTEBOOK_REPO_ID" in os.environ

in_vscode_remote_container() staticmethod

If the current Python process is running in a VS Code Remote Container.

Returns:

Type Description
bool

True if the current Python process is running in a VS Code Remote Container, False otherwise.

Source code in src/zenml/environment.py (lines 273-284)
@staticmethod
def in_vscode_remote_container() -> bool:
    """If the current Python process is running in a VS Code Remote Container.

    Returns:
        `True` if the current Python process is running in a VS Code Remote Container,
        `False` otherwise.
    """
    return (
        "REMOTE_CONTAINERS" in os.environ
        or "VSCODE_REMOTE_CONTAINERS_SESSION" in os.environ
    )

in_wsl() staticmethod

If the current process is running in Windows Subsystem for Linux.

source: https://www.scivision.dev/python-detect-wsl/

Returns:

Type Description
bool

True if the current process is running in WSL, False otherwise.

Source code in src/zenml/environment.py (lines 346-355)
@staticmethod
def in_wsl() -> bool:
    """If the current process is running in Windows Subsystem for Linux.

    source: https://www.scivision.dev/python-detect-wsl/

    Returns:
        `True` if the current process is running in WSL, `False` otherwise.
    """
    return "microsoft-standard" in platform.uname().release

python_version() staticmethod

Returns the python version of the running interpreter.

Returns:

Name Type Description
str str

the python version

Source code in src/zenml/environment.py (lines 163-170)
@staticmethod
def python_version() -> str:
    """Returns the python version of the running interpreter.

    Returns:
        str: the python version
    """
    return platform.python_version()

GitNotFoundError

Bases: ImportError

Raised when ZenML CLI is used to interact with examples on a machine with no git installation.

Source code in src/zenml/exceptions.py (lines 215-216)
class GitNotFoundError(ImportError):
    """Raised when ZenML CLI is used to interact with examples on a machine with no git installation."""

GlobalConfiguration

Bases: BaseModel

Stores global configuration options.

Configuration options are read from a config file, but can be overwritten by environment variables. See GlobalConfiguration.__getattribute__ for more details.

Attributes:

Name Type Description
user_id UUID

Unique user id.

user_email Optional[str]

Email address associated with this client.

user_email_opt_in Optional[bool]

Whether the user has opted in to email communication.

analytics_opt_in bool

If a user agreed to sending analytics or not.

version Optional[str]

Version of ZenML that was last used to create or update the global config.

store Optional[SerializeAsAny[StoreConfiguration]]

Store configuration.

active_stack_id Optional[UUID]

The ID of the active stack.

active_workspace_name Optional[str]

The name of the active workspace.
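
For illustration, a minimal sketch of the override behaviour described above (the environment variable prefix comes from CONFIG_ENV_VAR_PREFIX and is assumed here to be "ZENML_"; the attribute value shown is illustrative):

import os
from zenml.config.global_config import GlobalConfiguration

gc = GlobalConfiguration()   # singleton; values are read from the config file
print(gc.analytics_opt_in)   # value persisted in the config file

# An environment variable named <CONFIG_ENV_VAR_PREFIX><ATTRIBUTE_NAME> takes
# precedence without being written back to the config file:
os.environ["ZENML_ANALYTICS_OPT_IN"] = "false"   # assumed prefix
print(gc.analytics_opt_in)   # now reflects the environment variable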

Source code in src/zenml/config/global_config.py (lines 100-794)
class GlobalConfiguration(BaseModel, metaclass=GlobalConfigMetaClass):
    """Stores global configuration options.

    Configuration options are read from a config file, but can be overwritten
    by environment variables. See `GlobalConfiguration.__getattribute__` for
    more details.

    Attributes:
        user_id: Unique user id.
        user_email: Email address associated with this client.
        user_email_opt_in: Whether the user has opted in to email communication.
        analytics_opt_in: If a user agreed to sending analytics or not.
        version: Version of ZenML that was last used to create or update the
            global config.
        store: Store configuration.
        active_stack_id: The ID of the active stack.
        active_workspace_name: The name of the active workspace.
    """

    user_id: uuid.UUID = Field(default_factory=uuid.uuid4)
    user_email: Optional[str] = None
    user_email_opt_in: Optional[bool] = None
    analytics_opt_in: bool = True
    version: Optional[str] = None
    store: Optional[SerializeAsAny[StoreConfiguration]] = None
    active_stack_id: Optional[uuid.UUID] = None
    active_workspace_name: Optional[str] = None

    _zen_store: Optional["BaseZenStore"] = None
    _active_workspace: Optional["WorkspaceResponse"] = None
    _active_stack: Optional["StackResponse"] = None

    def __init__(self, **data: Any) -> None:
        """Initializes a GlobalConfiguration using values from the config file.

        GlobalConfiguration is a singleton class: only one instance can exist.
        Calling this constructor multiple times will always yield the same
        instance.

        Args:
            data: Custom configuration options.
        """
        config_values = self._read_config()
        config_values.update(data)

        super().__init__(**config_values)

        if not fileio.exists(self._config_file):
            self._write_config()

    @classmethod
    def get_instance(cls) -> Optional["GlobalConfiguration"]:
        """Return the GlobalConfiguration singleton instance.

        Returns:
            The GlobalConfiguration singleton instance or None, if the
            GlobalConfiguration hasn't been initialized yet.
        """
        return cls._global_config

    @classmethod
    def _reset_instance(
        cls, config: Optional["GlobalConfiguration"] = None
    ) -> None:
        """Reset the GlobalConfiguration singleton instance.

        This method is only meant for internal use and testing purposes.

        Args:
            config: The GlobalConfiguration instance to set as the global
                singleton. If None, the global GlobalConfiguration singleton is
                reset to an empty value.
        """
        cls._global_config = config
        if config:
            config._write_config()

    @field_validator("version")
    @classmethod
    def _validate_version(cls, value: Optional[str]) -> Optional[str]:
        """Validate the version attribute.

        Args:
            value: The version attribute value.

        Returns:
            The version attribute value.

        Raises:
            RuntimeError: If the version parsing fails.
        """
        if value is None:
            return value

        if not isinstance(version.parse(value), version.Version):
            # If the version parsing fails, it returns a `LegacyVersion`
            # instead. Check to make sure it's an actual `Version` object
            # which represents a valid version.
            raise RuntimeError(
                f"Invalid version in global configuration: {value}."
            )

        return value

    def __setattr__(self, key: str, value: Any) -> None:
        """Sets an attribute and persists it in the global configuration.

        Args:
            key: The attribute name.
            value: The attribute value.
        """
        super().__setattr__(key, value)
        if key.startswith("_"):
            return
        self._write_config()

    def __custom_getattribute__(self, key: str) -> Any:
        """Gets an attribute value for a specific key.

        If a value for this attribute was specified using an environment
        variable called `$(CONFIG_ENV_VAR_PREFIX)$(ATTRIBUTE_NAME)` and its
        value can be parsed to the attribute type, the value from this
        environment variable is returned instead.

        Args:
            key: The attribute name.

        Returns:
            The attribute value.
        """
        value = super().__getattribute__(key)
        if key.startswith("_") or key not in type(self).model_fields:
            return value

        environment_variable_name = f"{CONFIG_ENV_VAR_PREFIX}{key.upper()}"
        try:
            environment_variable_value = os.environ[environment_variable_name]
            # set the environment variable value to leverage Pydantic's type
            # conversion and validation
            super().__setattr__(key, environment_variable_value)
            return_value = super().__getattribute__(key)
            # set back the old value as we don't want to permanently store
            # the environment variable value here
            super().__setattr__(key, value)
            return return_value
        except (ValidationError, KeyError, TypeError):
            return value

    if not TYPE_CHECKING:
        # When defining __getattribute__, mypy allows accessing non-existent
        # attributes without failing
        # (see https://github.com/python/mypy/issues/13319).
        __getattribute__ = __custom_getattribute__

    def _migrate_config(self) -> None:
        """Migrates the global config to the latest version."""
        curr_version = version.parse(__version__)
        if self.version is None:
            logger.info(
                "Initializing the ZenML global configuration version to %s",
                curr_version,
            )
        else:
            config_version = version.parse(self.version)
            if config_version > curr_version:
                logger.error(
                    "The ZenML global configuration version (%s) is higher "
                    "than the version of ZenML currently being used (%s). "
                    "Read more about this issue and how to solve it here: "
                    "`https://docs.zenml.io/reference/global-settings#version-mismatch-downgrading`",
                    config_version,
                    curr_version,
                )
                # TODO [ENG-899]: Give more detailed instruction on how to
                #  resolve version mismatch.
                return

            if config_version == curr_version:
                return

            logger.info(
                "Migrating the ZenML global configuration from version %s "
                "to version %s...",
                config_version,
                curr_version,
            )

        # this will also trigger rewriting the config file to disk
        # to ensure the schema migration results are persisted
        self.version = __version__

    def _read_config(self) -> Dict[str, Any]:
        """Reads configuration options from disk.

        If the config file doesn't exist yet, this method returns an empty
        dictionary.

        Returns:
            A dictionary containing the configuration options.
        """
        config_file = self._config_file
        config_values = {}
        if fileio.exists(config_file):
            config_values = yaml_utils.read_yaml(config_file)

        if config_values is None:
            # This can happen for example if the config file is empty
            config_values = {}
        elif not isinstance(config_values, dict):
            logger.warning(
                "The global configuration file is not a valid YAML file. "
                "Creating a new global configuration file."
            )
            config_values = {}

        return config_values

    def _write_config(self) -> None:
        """Writes the global configuration options to disk."""
        # We never write the configuration file in a ZenML server environment
        # because this is a long-running process and the global configuration
        # variables are supplied via environment variables.
        if ENV_ZENML_SERVER in os.environ:
            logger.info(
                "Not writing the global configuration to disk in a ZenML "
                "server environment."
            )
            return

        config_file = self._config_file
        yaml_dict = self.model_dump(mode="json", exclude_none=True)
        logger.debug(f"Writing config to {config_file}")

        if not fileio.exists(config_file):
            io_utils.create_dir_recursive_if_not_exists(self.config_directory)

        yaml_utils.write_yaml(config_file, yaml_dict)

    def _configure_store(
        self,
        config: StoreConfiguration,
        skip_default_registrations: bool = False,
        **kwargs: Any,
    ) -> None:
        """Configure the global zen store.

        This method creates and initializes the global store according to the
        supplied configuration.

        Args:
            config: The new store configuration to use.
            skip_default_registrations: If `True`, the creation of the default
                stack and user in the store will be skipped.
            **kwargs: Additional keyword arguments to pass to the store
                constructor.
        """
        from zenml.zen_stores.base_zen_store import BaseZenStore

        if self.store == config and self._zen_store:
            # TODO: Do we actually need to create/initialize the store here
            #   or can we just return instead? We think this is just getting
            #   called for default registrations.
            BaseZenStore.create_store(
                config, skip_default_registrations, **kwargs
            )
            return

        # TODO: Revisit the flow regarding the registration of the default
        #  entities once the analytics v1 is removed.
        store = BaseZenStore.create_store(config, True, **kwargs)

        logger.debug(f"Configuring the global store to {store.config}")
        self.store = store.config
        self._zen_store = store

        if not skip_default_registrations:
            store._initialize_database()

        # Sanitize the global configuration to reflect the new store
        self._sanitize_config()
        self._write_config()

        local_stores_path = Path(self.local_stores_path)
        local_stores_path.mkdir(parents=True, exist_ok=True)

    def _sanitize_config(self) -> None:
        """Sanitize and save the global configuration.

        This method is called to ensure that the active stack and workspace
        are set to their default values, if possible.
        """
        # If running in a ZenML server environment, the active stack and
        # workspace are not relevant
        if ENV_ZENML_SERVER in os.environ:
            return
        active_workspace, active_stack = self.zen_store.validate_active_config(
            self.active_workspace_name,
            self.active_stack_id,
            config_name="global",
        )
        self.active_workspace_name = active_workspace.name
        self._active_workspace = active_workspace
        self.set_active_stack(active_stack)

    @property
    def _config_file(self) -> str:
        """Path to the file where global configuration options are stored.

        Returns:
            The path to the global configuration file.
        """
        return os.path.join(self.config_directory, "config.yaml")

    @property
    def config_directory(self) -> str:
        """Directory where the global configuration file is located.

        Returns:
            The directory where the global configuration file is located.
        """
        return io_utils.get_global_config_directory()

    @property
    def local_stores_path(self) -> str:
        """Path where local stores information is stored.

        Returns:
            The path where local stores information is stored.
        """
        if ENV_ZENML_LOCAL_STORES_PATH in os.environ:
            return os.environ[ENV_ZENML_LOCAL_STORES_PATH]

        return os.path.join(
            self.config_directory,
            LOCAL_STORES_DIRECTORY_NAME,
        )

    def get_config_environment_vars(self) -> Dict[str, str]:
        """Convert the global configuration to environment variables.

        Returns:
            Environment variables dictionary.
        """
        environment_vars = {}

        for key in self.model_fields.keys():
            if key == "store":
                # The store configuration uses its own environment variable
                # naming scheme
                continue

            value = getattr(self, key)
            if value is not None:
                environment_vars[CONFIG_ENV_VAR_PREFIX + key.upper()] = str(
                    value
                )

        store_dict = self.store_configuration.model_dump(exclude_none=True)

        # The secrets store and backup secrets store configurations use their
        # own environment variables naming scheme
        secrets_store_dict = store_dict.pop("secrets_store", None) or {}
        backup_secrets_store_dict = (
            store_dict.pop("backup_secrets_store", None) or {}
        )

        for key, value in store_dict.items():
            if key in ["username", "password"]:
                # Never include the username and password in the env vars. Use
                # the API token instead.
                continue

            environment_vars[ENV_ZENML_STORE_PREFIX + key.upper()] = str(value)

        for key, value in secrets_store_dict.items():
            environment_vars[ENV_ZENML_SECRETS_STORE_PREFIX + key.upper()] = (
                str(value)
            )

        for key, value in backup_secrets_store_dict.items():
            environment_vars[
                ENV_ZENML_BACKUP_SECRETS_STORE_PREFIX + key.upper()
            ] = str(value)

        return environment_vars

    def _get_store_configuration(
        self, baseline: Optional[StoreConfiguration] = None
    ) -> StoreConfiguration:
        """Get the store configuration.

        This method computes a store configuration starting from a baseline and
        applying the environment variables on top. If no baseline is provided,
        the following are used as a baseline:

        * the current store configuration, if it exists (e.g. if a store was
        configured in the global configuration file or explicitly set in the
        global configuration by calling `set_store`), or
        * the default store configuration, otherwise

        Args:
            baseline: Optional baseline store configuration to use.

        Returns:
            The store configuration.
        """
        from zenml.zen_stores.base_zen_store import BaseZenStore

        # Step 1: Create a baseline store configuration

        if baseline is not None:
            # Use the provided baseline store configuration
            store = baseline
        elif self.store is not None:
            # Use the current store configuration as a baseline
            store = self.store
        else:
            # Start with the default store configuration as a baseline
            store = self.get_default_store()

        # Step 2: Replace or update the baseline store configuration with the
        # environment variables

        env_store_config: Dict[str, str] = {}
        env_secrets_store_config: Dict[str, str] = {}
        env_backup_secrets_store_config: Dict[str, str] = {}
        for k, v in os.environ.items():
            if k.startswith(ENV_ZENML_STORE_PREFIX):
                env_store_config[k[len(ENV_ZENML_STORE_PREFIX) :].lower()] = v
            elif k.startswith(ENV_ZENML_SECRETS_STORE_PREFIX):
                env_secrets_store_config[
                    k[len(ENV_ZENML_SECRETS_STORE_PREFIX) :].lower()
                ] = v
            elif k.startswith(ENV_ZENML_BACKUP_SECRETS_STORE_PREFIX):
                env_backup_secrets_store_config[
                    k[len(ENV_ZENML_BACKUP_SECRETS_STORE_PREFIX) :].lower()
                ] = v

        if len(env_store_config):
            # As a convenience, we also infer the store type from the URL if
            # not explicitly set in the environment variables.
            if "type" not in env_store_config and "url" in env_store_config:
                env_store_config["type"] = BaseZenStore.get_store_type(
                    env_store_config["url"]
                )

            # We distinguish between two cases here: the environment variables
            # are used to completely replace the store configuration (i.e. when
            # the store type or URL is set using the environment variables), or
            # they are only used to update the store configuration. In the first
            # case, we replace the baseline store configuration with the
            # environment variables. In the second case, we only merge the
            # environment variables into the baseline store config.

            if "type" in env_store_config:
                logger.debug(
                    "Using environment variables to configure the store"
                )
                store = StoreConfiguration(
                    **env_store_config,
                )
            else:
                logger.debug(
                    "Using environment variables to update the default store"
                )
                store = store.model_copy(update=env_store_config, deep=True)

        # Step 3: Replace or update the baseline secrets store configuration
        # with the environment variables. This only applies to SQL stores.

        if store.type == StoreType.SQL:
            # We distinguish between two cases here: the environment
            # variables are used to completely replace the secrets store
            # configuration (i.e. when the secrets store type is set using
            # the environment variable), or they are only used to update the
            # store configuration. In the first case, we replace the
            # baseline secrets store configuration with the environment
            # variables. In the second case, we only merge the environment
            # variables into the baseline secrets store config (if any is
            # set).

            if len(env_secrets_store_config):
                if "type" in env_secrets_store_config:
                    logger.debug(
                        "Using environment variables to configure the secrets "
                        "store"
                    )
                    store.secrets_store = SecretsStoreConfiguration(
                        **env_secrets_store_config
                    )
                elif store.secrets_store:
                    logger.debug(
                        "Using environment variables to update the secrets "
                        "store"
                    )
                    store.secrets_store = store.secrets_store.model_copy(
                        update=env_secrets_store_config, deep=True
                    )

            if len(env_backup_secrets_store_config):
                if "type" in env_backup_secrets_store_config:
                    logger.debug(
                        "Using environment variables to configure the backup "
                        "secrets store"
                    )
                    store.backup_secrets_store = SecretsStoreConfiguration(
                        **env_backup_secrets_store_config
                    )
                elif store.backup_secrets_store:
                    logger.debug(
                        "Using environment variables to update the backup "
                        "secrets store"
                    )
                    store.backup_secrets_store = (
                        store.backup_secrets_store.model_copy(
                            update=env_backup_secrets_store_config, deep=True
                        )
                    )

        return store

    @property
    def store_configuration(self) -> StoreConfiguration:
        """Get the current store configuration.

        Returns:
            The store configuration.
        """
        # If the zen store is already initialized, we can get the store
        # configuration from there and disregard the global configuration.
        if self._zen_store is not None:
            return self._zen_store.config
        return self._get_store_configuration()

    def get_default_store(self) -> StoreConfiguration:
        """Get the default SQLite store configuration.

        Returns:
            The default SQLite store configuration.
        """
        from zenml.zen_stores.base_zen_store import BaseZenStore

        return BaseZenStore.get_default_store_config(
            path=os.path.join(
                self.local_stores_path,
                DEFAULT_STORE_DIRECTORY_NAME,
            )
        )

    def set_default_store(self) -> None:
        """Initializes and sets the default store configuration.

        Call this method to initialize or revert the store configuration to the
        default store.
        """
        # Apply the environment variables to the default store configuration
        default_store_cfg = self._get_store_configuration(
            baseline=self.get_default_store()
        )
        self._configure_store(default_store_cfg)
        logger.debug("Using the default store for the global config.")

    def uses_default_store(self) -> bool:
        """Check if the global configuration uses the default store.

        Returns:
            `True` if the global configuration uses the default store.
        """
        return self.store_configuration.url == self.get_default_store().url

    def set_store(
        self,
        config: StoreConfiguration,
        skip_default_registrations: bool = False,
        **kwargs: Any,
    ) -> None:
        """Update the active store configuration.

        Call this method to validate and update the active store configuration.

        Args:
            config: The new store configuration to use.
            skip_default_registrations: If `True`, the creation of the default
                stack and user in the store will be skipped.
            **kwargs: Additional keyword arguments to pass to the store
                constructor.
        """
        # Apply the environment variables to the custom store configuration
        config = self._get_store_configuration(baseline=config)
        self._configure_store(config, skip_default_registrations, **kwargs)
        logger.info("Updated the global store configuration.")

    @property
    def is_initialized(self) -> bool:
        """Check if the global configuration is initialized.

        Returns:
            `True` if the global configuration is initialized.
        """
        return self._zen_store is not None

    @property
    def zen_store(self) -> "BaseZenStore":
        """Initialize and/or return the global zen store.

        If the store hasn't been initialized yet, it is initialized when this
        property is first accessed according to the global store configuration.

        Returns:
            The current zen store.
        """
        if self._zen_store is None:
            self._configure_store(self.store_configuration)
        assert self._zen_store is not None

        return self._zen_store

    def set_active_workspace(
        self, workspace: "WorkspaceResponse"
    ) -> "WorkspaceResponse":
        """Set the workspace for the local client.

        Args:
            workspace: The workspace to set active.

        Returns:
            The workspace that was set active.
        """
        self.active_workspace_name = workspace.name
        self._active_workspace = workspace
        # Sanitize the global configuration to reflect the new workspace
        self._sanitize_config()
        return workspace

    def set_active_stack(self, stack: "StackResponse") -> None:
        """Set the active stack for the local client.

        Args:
            stack: The model of the stack to set active.
        """
        self.active_stack_id = stack.id
        self._active_stack = stack

    def get_active_workspace(self) -> "WorkspaceResponse":
        """Get a model of the active workspace for the local client.

        Returns:
            The model of the active workspace.
        """
        workspace_name = self.get_active_workspace_name()

        if self._active_workspace is not None:
            return self._active_workspace

        workspace = self.zen_store.get_workspace(
            workspace_name_or_id=workspace_name,
        )
        return self.set_active_workspace(workspace)

    def get_active_workspace_name(self) -> str:
        """Get the name of the active workspace.

        If the active workspace doesn't exist yet, the ZenStore is reinitialized.

        Returns:
            The name of the active workspace.
        """
        if self.active_workspace_name is None:
            _ = self.zen_store
            assert self.active_workspace_name is not None

        return self.active_workspace_name

    def get_active_stack_id(self) -> UUID:
        """Get the ID of the active stack.

        If the active stack doesn't exist yet, the ZenStore is reinitialized.

        Returns:
            The active stack ID.
        """
        if self.active_stack_id is None:
            _ = self.zen_store
            assert self.active_stack_id is not None

        return self.active_stack_id

    model_config = ConfigDict(
        # Validate attributes when assigning them. We need to set this in order
        # to have a mix of mutable and immutable attributes
        validate_assignment=True,
        # Allow extra attributes from configs of previous ZenML versions to
        # permit downgrading
        extra="allow",
    )
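
Because _get_store_configuration layers the ZENML_STORE_* (and corresponding secrets store) environment variables on top of the persisted configuration, the active store can be redirected without touching config.yaml. Below is a minimal sketch; the ZENML_STORE_URL variable name assumes the store prefix constant resolves to "ZENML_STORE_", and the SQLite URL is purely illustrative.

import os

from zenml.config.global_config import GlobalConfiguration

# Point the store at a different backend for this process only. When
# ZENML_STORE_TYPE is not set, the store type is inferred from the URL.
os.environ["ZENML_STORE_URL"] = "sqlite:////tmp/zenml-example/zenml.db"  # illustrative URL

# In a fresh process (before the zen store is initialized), this reflects
# the environment override applied on top of the persisted baseline.
store_config = GlobalConfiguration().store_configuration
print(store_config.type, store_config.url)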

config_directory property

Directory where the global configuration file is located.

Returns:

Type Description
str

The directory where the global configuration file is located.

is_initialized property

Check if the global configuration is initialized.

Returns:

Type Description
bool

True if the global configuration is initialized.

local_stores_path property

Path where local stores information is stored.

Returns:

Type Description
str

The path where local stores information is stored.

store_configuration property

Get the current store configuration.

Returns:

Type Description
StoreConfiguration

The store configuration.

zen_store property

Initialize and/or return the global zen store.

If the store hasn't been initialized yet, it is initialized when this property is first accessed according to the global store configuration.

Returns:

Type Description
BaseZenStore

The current zen store.

__custom_getattribute__(key)

Gets an attribute value for a specific key.

If a value for this attribute was specified using an environment variable called $(CONFIG_ENV_VAR_PREFIX)$(ATTRIBUTE_NAME) and its value can be parsed to the attribute type, the value from this environment variable is returned instead.

Parameters:

Name Type Description Default
key str

The attribute name.

required

Returns:

Type Description
Any

The attribute value.

Source code in src/zenml/config/global_config.py
def __custom_getattribute__(self, key: str) -> Any:
    """Gets an attribute value for a specific key.

    If a value for this attribute was specified using an environment
    variable called `$(CONFIG_ENV_VAR_PREFIX)$(ATTRIBUTE_NAME)` and its
    value can be parsed to the attribute type, the value from this
    environment variable is returned instead.

    Args:
        key: The attribute name.

    Returns:
        The attribute value.
    """
    value = super().__getattribute__(key)
    if key.startswith("_") or key not in type(self).model_fields:
        return value

    environment_variable_name = f"{CONFIG_ENV_VAR_PREFIX}{key.upper()}"
    try:
        environment_variable_value = os.environ[environment_variable_name]
        # set the environment variable value to leverage Pydantic's type
        # conversion and validation
        super().__setattr__(key, environment_variable_value)
        return_value = super().__getattribute__(key)
        # set back the old value as we don't want to permanently store
        # the environment variable value here
        super().__setattr__(key, value)
        return return_value
    except (ValidationError, KeyError, TypeError):
        return value
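
The override mechanism above is process-local: the environment value is parsed and validated through Pydantic, returned to the caller, and the persisted value is then restored in memory. A minimal sketch, assuming the config env var prefix is "ZENML_" and that analytics_opt_in is one of the public model fields (both assumptions, for illustration only):

import os

from zenml.config.global_config import GlobalConfiguration

gc = GlobalConfiguration()

print(gc.analytics_opt_in)  # value read from the persisted config file

# Override the attribute for this process only via an environment variable.
os.environ["ZENML_ANALYTICS_OPT_IN"] = "false"
print(gc.analytics_opt_in)  # now reflects the environment variable

# Nothing is written back to disk; removing the variable restores the
# persisted value.
del os.environ["ZENML_ANALYTICS_OPT_IN"]
print(gc.analytics_opt_in)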

__init__(**data)

Initializes a GlobalConfiguration using values from the config file.

GlobalConfiguration is a singleton class: only one instance can exist. Calling this constructor multiple times will always yield the same instance.

Parameters:

Name Type Description Default
data Any

Custom configuration options.

{}
Source code in src/zenml/config/global_config.py
def __init__(self, **data: Any) -> None:
    """Initializes a GlobalConfiguration using values from the config file.

    GlobalConfiguration is a singleton class: only one instance can exist.
    Calling this constructor multiple times will always yield the same
    instance.

    Args:
        data: Custom configuration options.
    """
    config_values = self._read_config()
    config_values.update(data)

    super().__init__(**config_values)

    if not fileio.exists(self._config_file):
        self._write_config()
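
As the docstring states, the constructor acts as a singleton accessor: repeated calls return the same instance, seeded from the config file on first use. A small sketch of that behaviour:

from zenml.config.global_config import GlobalConfiguration

gc_a = GlobalConfiguration()
gc_b = GlobalConfiguration()

# Both names refer to the same singleton, backed by the same config.yaml
# in the global configuration directory.
assert gc_a is gc_b
print(gc_a.config_directory)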

__setattr__(key, value)

Sets an attribute and persists it in the global configuration.

Parameters:

Name Type Description Default
key str

The attribute name.

required
value Any

The attribute value.

required
Source code in src/zenml/config/global_config.py
def __setattr__(self, key: str, value: Any) -> None:
    """Sets an attribute and persists it in the global configuration.

    Args:
        key: The attribute name.
        value: The attribute value.
    """
    super().__setattr__(key, value)
    if key.startswith("_"):
        return
    self._write_config()

get_active_stack_id()

Get the ID of the active stack.

If the active stack doesn't exist yet, the ZenStore is reinitialized.

Returns:

Type Description
UUID

The active stack ID.

Source code in src/zenml/config/global_config.py
def get_active_stack_id(self) -> UUID:
    """Get the ID of the active stack.

    If the active stack doesn't exist yet, the ZenStore is reinitialized.

    Returns:
        The active stack ID.
    """
    if self.active_stack_id is None:
        _ = self.zen_store
        assert self.active_stack_id is not None

    return self.active_stack_id

get_active_workspace()

Get a model of the active workspace for the local client.

Returns:

Type Description
WorkspaceResponse

The model of the active workspace.

Source code in src/zenml/config/global_config.py
def get_active_workspace(self) -> "WorkspaceResponse":
    """Get a model of the active workspace for the local client.

    Returns:
        The model of the active workspace.
    """
    workspace_name = self.get_active_workspace_name()

    if self._active_workspace is not None:
        return self._active_workspace

    workspace = self.zen_store.get_workspace(
        workspace_name_or_id=workspace_name,
    )
    return self.set_active_workspace(workspace)

get_active_workspace_name()

Get the name of the active workspace.

If the active workspace doesn't exist yet, the ZenStore is reinitialized.

Returns:

Type Description
str

The name of the active workspace.

Source code in src/zenml/config/global_config.py
def get_active_workspace_name(self) -> str:
    """Get the name of the active workspace.

    If the active workspace doesn't exist yet, the ZenStore is reinitialized.

    Returns:
        The name of the active workspace.
    """
    if self.active_workspace_name is None:
        _ = self.zen_store
        assert self.active_workspace_name is not None

    return self.active_workspace_name

get_config_environment_vars()

Convert the global configuration to environment variables.

Returns:

Type Description
Dict[str, str]

Environment variables dictionary.

Source code in src/zenml/config/global_config.py
def get_config_environment_vars(self) -> Dict[str, str]:
    """Convert the global configuration to environment variables.

    Returns:
        Environment variables dictionary.
    """
    environment_vars = {}

    for key in self.model_fields.keys():
        if key == "store":
            # The store configuration uses its own environment variable
            # naming scheme
            continue

        value = getattr(self, key)
        if value is not None:
            environment_vars[CONFIG_ENV_VAR_PREFIX + key.upper()] = str(
                value
            )

    store_dict = self.store_configuration.model_dump(exclude_none=True)

    # The secrets store and backup secrets store configurations use their
    # own environment variables naming scheme
    secrets_store_dict = store_dict.pop("secrets_store", None) or {}
    backup_secrets_store_dict = (
        store_dict.pop("backup_secrets_store", None) or {}
    )

    for key, value in store_dict.items():
        if key in ["username", "password"]:
            # Never include the username and password in the env vars. Use
            # the API token instead.
            continue

        environment_vars[ENV_ZENML_STORE_PREFIX + key.upper()] = str(value)

    for key, value in secrets_store_dict.items():
        environment_vars[ENV_ZENML_SECRETS_STORE_PREFIX + key.upper()] = (
            str(value)
        )

    for key, value in backup_secrets_store_dict.items():
        environment_vars[
            ENV_ZENML_BACKUP_SECRETS_STORE_PREFIX + key.upper()
        ] = str(value)

    return environment_vars
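
A typical use of get_config_environment_vars is forwarding the active configuration to a subprocess or container. The exact keys depend on the active store; the ZENML_STORE_URL name below is only an example of the prefix-based naming scheme described above.

from zenml.config.global_config import GlobalConfiguration

env = GlobalConfiguration().get_config_environment_vars()

# Keys follow the prefix-based naming scheme, e.g. ZENML_STORE_URL.
for key, value in sorted(env.items()):
    print(f"{key}={value}")

# e.g. pass {**os.environ, **env} as the environment of a child process
# or container to hand over the configuration.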

get_default_store()

Get the default SQLite store configuration.

Returns:

Type Description
StoreConfiguration

The default SQLite store configuration.

Source code in src/zenml/config/global_config.py
def get_default_store(self) -> StoreConfiguration:
    """Get the default SQLite store configuration.

    Returns:
        The default SQLite store configuration.
    """
    from zenml.zen_stores.base_zen_store import BaseZenStore

    return BaseZenStore.get_default_store_config(
        path=os.path.join(
            self.local_stores_path,
            DEFAULT_STORE_DIRECTORY_NAME,
        )
    )

get_instance() classmethod

Return the GlobalConfiguration singleton instance.

Returns:

Type Description
Optional[GlobalConfiguration]

The GlobalConfiguration singleton instance or None, if the GlobalConfiguration hasn't been initialized yet.

Source code in src/zenml/config/global_config.py
@classmethod
def get_instance(cls) -> Optional["GlobalConfiguration"]:
    """Return the GlobalConfiguration singleton instance.

    Returns:
        The GlobalConfiguration singleton instance or None, if the
        GlobalConfiguration hasn't been initialized yet.
    """
    return cls._global_config
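
get_instance is useful when code must not trigger initialization as a side effect; it simply reports whether the singleton already exists. A short sketch:

from zenml.config.global_config import GlobalConfiguration

# None if the singleton has not been constructed yet in this process.
existing = GlobalConfiguration.get_instance()

# Constructing the class creates (or returns) the singleton; afterwards
# get_instance returns that same object.
gc = GlobalConfiguration()
assert GlobalConfiguration.get_instance() is gc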

set_active_stack(stack)

Set the active stack for the local client.

Parameters:

Name Type Description Default
stack StackResponse

The model of the stack to set active.

required
Source code in src/zenml/config/global_config.py
def set_active_stack(self, stack: "StackResponse") -> None:
    """Set the active stack for the local client.

    Args:
        stack: The model of the stack to set active.
    """
    self.active_stack_id = stack.id
    self._active_stack = stack

set_active_workspace(workspace)

Set the workspace for the local client.

Parameters:

Name Type Description Default
workspace WorkspaceResponse

The workspace to set active.

required

Returns:

Type Description
WorkspaceResponse

The workspace that was set active.

Source code in src/zenml/config/global_config.py
def set_active_workspace(
    self, workspace: "WorkspaceResponse"
) -> "WorkspaceResponse":
    """Set the workspace for the local client.

    Args:
        workspace: The workspace to set active.

    Returns:
        The workspace that was set active.
    """
    self.active_workspace_name = workspace.name
    self._active_workspace = workspace
    # Sanitize the global configuration to reflect the new workspace
    self._sanitize_config()
    return workspace

set_default_store()

Initializes and sets the default store configuration.

Call this method to initialize or revert the store configuration to the default store.

Source code in src/zenml/config/global_config.py
def set_default_store(self) -> None:
    """Initializes and sets the default store configuration.

    Call this method to initialize or revert the store configuration to the
    default store.
    """
    # Apply the environment variables to the default store configuration
    default_store_cfg = self._get_store_configuration(
        baseline=self.get_default_store()
    )
    self._configure_store(default_store_cfg)
    logger.debug("Using the default store for the global config.")

set_store(config, skip_default_registrations=False, **kwargs)

Update the active store configuration.

Call this method to validate and update the active store configuration.

Parameters:

Name Type Description Default
config StoreConfiguration

The new store configuration to use.

required
skip_default_registrations bool

If True, the creation of the default stack and user in the store will be skipped.

False
**kwargs Any

Additional keyword arguments to pass to the store constructor.

{}
Source code in src/zenml/config/global_config.py
def set_store(
    self,
    config: StoreConfiguration,
    skip_default_registrations: bool = False,
    **kwargs: Any,
) -> None:
    """Update the active store configuration.

    Call this method to validate and update the active store configuration.

    Args:
        config: The new store configuration to use.
        skip_default_registrations: If `True`, the creation of the default
            stack and user in the store will be skipped.
        **kwargs: Additional keyword arguments to pass to the store
            constructor.
    """
    # Apply the environment variables to the custom store configuration
    config = self._get_store_configuration(baseline=config)
    self._configure_store(config, skip_default_registrations, **kwargs)
    logger.info("Updated the global store configuration.")

uses_default_store()

Check if the global configuration uses the default store.

Returns:

Type Description
bool

True if the global configuration uses the default store.

Source code in src/zenml/config/global_config.py
def uses_default_store(self) -> bool:
    """Check if the global configuration uses the default store.

    Returns:
        `True` if the global configuration uses the default store.
    """
    return self.store_configuration.url == self.get_default_store().url

IllegalOperationError

Bases: ZenMLBaseException

Raised when an illegal operation is attempted.

Source code in src/zenml/exceptions.py
class IllegalOperationError(ZenMLBaseException):
    """Raised when an illegal operation is attempted."""

InitializationException

Bases: ZenMLBaseException

Raised when an error occurred during initialization of a ZenML repository.

Source code in src/zenml/exceptions.py
class InitializationException(ZenMLBaseException):
    """Raised when an error occurred during initialization of a ZenML repository."""

LoggingLevels

Bases: Enum

Enum for logging levels.

Source code in src/zenml/enums.py
class LoggingLevels(Enum):
    """Enum for logging levels."""

    NOTSET = logging.NOTSET
    ERROR = logging.ERROR
    WARN = logging.WARN
    INFO = logging.INFO
    DEBUG = logging.DEBUG
    CRITICAL = logging.CRITICAL
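
Since every member wraps the corresponding constant from the standard library logging module, the values can be handed straight to stdlib loggers:

import logging

from zenml.enums import LoggingLevels

assert LoggingLevels.DEBUG.value == logging.DEBUG

# A level selected by name can be applied to any stdlib logger.
logging.getLogger("example").setLevel(LoggingLevels.INFO.value)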

ModelFilter

Bases: WorkspaceScopedFilter, TaggableFilter

Model to enable advanced filtering of all Models.

Source code in src/zenml/models/v2/core/model.py
class ModelFilter(WorkspaceScopedFilter, TaggableFilter):
    """Model to enable advanced filtering of all Workspaces."""

    name: Optional[str] = Field(
        default=None,
        description="Name of the Model",
    )

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *WorkspaceScopedFilter.FILTER_EXCLUDE_FIELDS,
        *TaggableFilter.FILTER_EXCLUDE_FIELDS,
    ]
    CUSTOM_SORTING_OPTIONS: ClassVar[List[str]] = [
        *WorkspaceScopedFilter.CUSTOM_SORTING_OPTIONS,
        *TaggableFilter.CUSTOM_SORTING_OPTIONS,
        SORT_BY_LATEST_VERSION_KEY,
    ]
    CLI_EXCLUDE_FIELDS = [
        *WorkspaceScopedFilter.CLI_EXCLUDE_FIELDS,
        *TaggableFilter.CLI_EXCLUDE_FIELDS,
    ]

    def apply_sorting(
        self,
        query: AnyQuery,
        table: Type["AnySchema"],
    ) -> AnyQuery:
        """Apply sorting to the query for Models.

        Args:
            query: The query to which to apply the sorting.
            table: The query table.

        Returns:
            The query with sorting applied.
        """
        from sqlmodel import asc, case, col, desc, func, select

        from zenml.enums import SorterOps
        from zenml.zen_stores.schemas import (
            ModelSchema,
            ModelVersionSchema,
        )

        sort_by, operand = self.sorting_params

        if sort_by == SORT_BY_LATEST_VERSION_KEY:
            # Subquery to find the latest version per model
            latest_version_subquery = (
                select(
                    ModelSchema.id,
                    case(
                        (
                            func.max(ModelVersionSchema.created).is_(None),
                            ModelSchema.created,
                        ),
                        else_=func.max(ModelVersionSchema.created),
                    ).label("latest_version_created"),
                )
                .outerjoin(
                    ModelVersionSchema,
                    ModelSchema.id == ModelVersionSchema.model_id,  # type: ignore[arg-type]
                )
                .group_by(col(ModelSchema.id))
                .subquery()
            )

            query = query.add_columns(
                latest_version_subquery.c.latest_version_created,
            ).where(ModelSchema.id == latest_version_subquery.c.id)

            # Apply sorting based on the operand
            if operand == SorterOps.ASCENDING:
                query = query.order_by(
                    asc(latest_version_subquery.c.latest_version_created),
                    asc(ModelSchema.id),
                )
            else:
                query = query.order_by(
                    desc(latest_version_subquery.c.latest_version_created),
                    desc(ModelSchema.id),
                )
            return query

        # For other sorting cases, delegate to the parent class
        return super().apply_sorting(query=query, table=table)

apply_sorting(query, table)

Apply sorting to the query for Models.

Parameters:

Name Type Description Default
query AnyQuery

The query to which to apply the sorting.

required
table Type[AnySchema]

The query table.

required

Returns:

Type Description
AnyQuery

The query with sorting applied.

Source code in src/zenml/models/v2/core/model.py
def apply_sorting(
    self,
    query: AnyQuery,
    table: Type["AnySchema"],
) -> AnyQuery:
    """Apply sorting to the query for Models.

    Args:
        query: The query to which to apply the sorting.
        table: The query table.

    Returns:
        The query with sorting applied.
    """
    from sqlmodel import asc, case, col, desc, func, select

    from zenml.enums import SorterOps
    from zenml.zen_stores.schemas import (
        ModelSchema,
        ModelVersionSchema,
    )

    sort_by, operand = self.sorting_params

    if sort_by == SORT_BY_LATEST_VERSION_KEY:
        # Subquery to find the latest version per model
        latest_version_subquery = (
            select(
                ModelSchema.id,
                case(
                    (
                        func.max(ModelVersionSchema.created).is_(None),
                        ModelSchema.created,
                    ),
                    else_=func.max(ModelVersionSchema.created),
                ).label("latest_version_created"),
            )
            .outerjoin(
                ModelVersionSchema,
                ModelSchema.id == ModelVersionSchema.model_id,  # type: ignore[arg-type]
            )
            .group_by(col(ModelSchema.id))
            .subquery()
        )

        query = query.add_columns(
            latest_version_subquery.c.latest_version_created,
        ).where(ModelSchema.id == latest_version_subquery.c.id)

        # Apply sorting based on the operand
        if operand == SorterOps.ASCENDING:
            query = query.order_by(
                asc(latest_version_subquery.c.latest_version_created),
                asc(ModelSchema.id),
            )
        else:
            query = query.order_by(
                desc(latest_version_subquery.c.latest_version_created),
                desc(ModelSchema.id),
            )
        return query

    # For other sorting cases, delegate to the parent class
    return super().apply_sorting(query=query, table=table)
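
The custom sort key lets callers order models by the creation time of their latest version rather than by a column on the model table itself. A hedged sketch of how such a filter might be built; the "desc:" prefix and the import location of SORT_BY_LATEST_VERSION_KEY are assumptions about the filter syntax, not confirmed by this page.

from zenml.models import ModelFilter
from zenml.models.v2.core.model import SORT_BY_LATEST_VERSION_KEY

# Models whose most recent version was created last come first.
model_filter = ModelFilter(sort_by=f"desc:{SORT_BY_LATEST_VERSION_KEY}")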

ModelRegistryModelMetadata

Bases: BaseModel

Base class for all ZenML model registry model metadata.

The ModelRegistryModelMetadata class represents metadata associated with a registered model version, including information such as the associated pipeline name, pipeline run ID, step name, ZenML version, and custom attributes. It serves as a blueprint for creating concrete model metadata implementations in a registry, and provides a record of the history of a model and its development process.

Source code in src/zenml/model_registries/base_model_registry.py
class ModelRegistryModelMetadata(BaseModel):
    """Base class for all ZenML model registry model metadata.

    The `ModelRegistryModelMetadata` class represents metadata associated with
    a registered model version, including information such as the associated
    pipeline name, pipeline run ID, step name, ZenML version, and custom
    attributes. It serves as a blueprint for creating concrete model metadata
    implementations in a registry, and provides a record of the history of a
    model and its development process.
    """

    zenml_version: Optional[str] = None
    zenml_run_name: Optional[str] = None
    zenml_pipeline_name: Optional[str] = None
    zenml_pipeline_uuid: Optional[str] = None
    zenml_pipeline_run_uuid: Optional[str] = None
    zenml_step_name: Optional[str] = None
    zenml_workspace: Optional[str] = None

    @property
    def custom_attributes(self) -> Dict[str, str]:
        """Returns a dictionary of custom attributes.

        Returns:
            A dictionary of custom attributes.
        """
        # Return all attributes that are not explicitly defined as Pydantic
        # fields in this class
        if self.model_extra:
            return {k: str(v) for k, v in self.model_extra.items()}
        return {}

    def model_dump(
        self,
        *,
        exclude_unset: bool = False,
        exclude_none: bool = True,
        **kwargs: Any,
    ) -> Dict[str, str]:
        """Returns a dictionary representation of the metadata.

        This method overrides the default Pydantic `model_dump` method to allow
        for the exclusion of fields with a value of None.

        Args:
            exclude_unset: Whether to exclude unset attributes.
            exclude_none: Whether to exclude None attributes.
            **kwargs: Additional keyword arguments.

        Returns:
            A dictionary representation of the metadata.
        """
        if exclude_none:
            return {
                k: v
                for k, v in super()
                .model_dump(exclude_unset=exclude_unset, **kwargs)
                .items()
                if v is not None
            }
        else:
            return super().model_dump(exclude_unset=exclude_unset, **kwargs)

    model_config = ConfigDict(extra="allow")

custom_attributes property

Returns a dictionary of custom attributes.

Returns:

Type Description
Dict[str, str]

A dictionary of custom attributes.

model_dump(*, exclude_unset=False, exclude_none=True, **kwargs)

Returns a dictionary representation of the metadata.

This method overrides the default Pydantic model_dump method to allow for the exclusion of fields with a value of None.

Parameters:

Name Type Description Default
exclude_unset bool

Whether to exclude unset attributes.

False
exclude_none bool

Whether to exclude None attributes.

True
**kwargs Any

Additional keyword arguments.

{}

Returns:

Type Description
Dict[str, str]

A dictionary representation of the metadata.

Source code in src/zenml/model_registries/base_model_registry.py
def model_dump(
    self,
    *,
    exclude_unset: bool = False,
    exclude_none: bool = True,
    **kwargs: Any,
) -> Dict[str, str]:
    """Returns a dictionary representation of the metadata.

    This method overrides the default Pydantic `model_dump` method to allow
    for the exclusion of fields with a value of None.

    Args:
        exclude_unset: Whether to exclude unset attributes.
        exclude_none: Whether to exclude None attributes.
        **kwargs: Additional keyword arguments.

    Returns:
        A dictionary representation of the metadata.
    """
    if exclude_none:
        return {
            k: v
            for k, v in super()
            .model_dump(exclude_unset=exclude_unset, **kwargs)
            .items()
            if v is not None
        }
    else:
        return super().model_dump(exclude_unset=exclude_unset, **kwargs)
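
Because the model allows extra fields, anything passed beyond the declared zenml_* attributes is surfaced through custom_attributes, and model_dump drops None-valued fields by default. A small sketch (the attribute values are placeholders):

from zenml.model_registries.base_model_registry import (
    ModelRegistryModelMetadata,
)

meta = ModelRegistryModelMetadata(
    zenml_pipeline_name="training_pipeline",
    zenml_step_name="trainer",
    accuracy="0.93",  # not a declared field -> becomes a custom attribute
)

print(meta.custom_attributes)  # {'accuracy': '0.93'}
print(meta.model_dump())       # None-valued fields are excluded by default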

ModelResponse

Bases: WorkspaceScopedResponse[ModelResponseBody, ModelResponseMetadata, ModelResponseResources]

Response model for models.

Source code in src/zenml/models/v2/core/model.py
class ModelResponse(
    WorkspaceScopedResponse[
        ModelResponseBody, ModelResponseMetadata, ModelResponseResources
    ]
):
    """Response model for models."""

    name: str = Field(
        title="The name of the model",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    def get_hydrated_version(self) -> "ModelResponse":
        """Get the hydrated version of this model.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_model(self.id)

    # Body and metadata properties
    @property
    def tags(self) -> List["TagResponse"]:
        """The `tags` property.

        Returns:
            the value of the property.
        """
        return self.get_body().tags

    @property
    def latest_version_name(self) -> Optional[str]:
        """The `latest_version_name` property.

        Returns:
            the value of the property.
        """
        return self.get_body().latest_version_name

    @property
    def latest_version_id(self) -> Optional[UUID]:
        """The `latest_version_id` property.

        Returns:
            the value of the property.
        """
        return self.get_body().latest_version_id

    @property
    def license(self) -> Optional[str]:
        """The `license` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().license

    @property
    def description(self) -> Optional[str]:
        """The `description` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().description

    @property
    def audience(self) -> Optional[str]:
        """The `audience` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().audience

    @property
    def use_cases(self) -> Optional[str]:
        """The `use_cases` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().use_cases

    @property
    def limitations(self) -> Optional[str]:
        """The `limitations` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().limitations

    @property
    def trade_offs(self) -> Optional[str]:
        """The `trade_offs` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().trade_offs

    @property
    def ethics(self) -> Optional[str]:
        """The `ethics` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().ethics

    @property
    def save_models_to_registry(self) -> bool:
        """The `save_models_to_registry` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().save_models_to_registry

    # Helper functions
    @property
    def versions(self) -> List["Model"]:
        """List all versions of the model.

        Returns:
            The list of all model versions.
        """
        from zenml.client import Client

        client = Client()
        model_versions = depaginate(
            client.list_model_versions, model_name_or_id=self.id
        )
        return [
            mv.to_model_class(suppress_class_validation_warnings=True)
            for mv in model_versions
        ]

audience property

The audience property.

Returns:

Type Description
Optional[str]

the value of the property.

description property

The description property.

Returns:

Type Description
Optional[str]

the value of the property.

ethics property

The ethics property.

Returns:

Type Description
Optional[str]

the value of the property.

latest_version_id property

The latest_version_id property.

Returns:

Type Description
Optional[UUID]

the value of the property.

latest_version_name property

The latest_version_name property.

Returns:

Type Description
Optional[str]

the value of the property.

license property

The license property.

Returns:

Type Description
Optional[str]

the value of the property.

limitations property

The limitations property.

Returns:

Type Description
Optional[str]

the value of the property.

save_models_to_registry property

The save_models_to_registry property.

Returns:

Type Description
bool

the value of the property.

tags property

The tags property.

Returns:

Type Description
List[TagResponse]

the value of the property.

trade_offs property

The trade_offs property.

Returns:

Type Description
Optional[str]

the value of the property.

use_cases property

The use_cases property.

Returns:

Type Description
Optional[str]

the value of the property.

versions property

List all versions of the model.

Returns:

Type Description
List[Model]

The list of all model versions.

get_hydrated_version()

Get the hydrated version of this model.

Returns:

Type Description
ModelResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/model.py
def get_hydrated_version(self) -> "ModelResponse":
    """Get the hydrated version of this model.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_model(self.id)
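
The response exposes body fields (such as tags) and metadata fields (such as description) as properties, hydrating the model from the zen store when needed. A usage sketch; "my_model" is a placeholder and the Client.get_model call is assumed to be available in your ZenML version.

from zenml.client import Client

model = Client().get_model("my_model")  # placeholder model name

print(model.name, [t.name for t in model.tags])
print(model.description)    # metadata access may trigger hydration
print(len(model.versions))  # depaginates all versions of the model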

ModelStages

Bases: StrEnum

All possible stages of a Model Version.

Source code in src/zenml/enums.py
class ModelStages(StrEnum):
    """All possible stages of a Model Version."""

    NONE = "none"
    STAGING = "staging"
    PRODUCTION = "production"
    ARCHIVED = "archived"
    LATEST = "latest"

ModelVersionArtifactFilter

Bases: BaseFilter

Model version artifact links filter model.

Source code in src/zenml/models/v2/core/model_version_artifact.py
class ModelVersionArtifactFilter(BaseFilter):
    """Model version pipeline run links filter model."""

    # Artifact name and type are not DB fields and need to be handled separately
    FILTER_EXCLUDE_FIELDS = [
        *BaseFilter.FILTER_EXCLUDE_FIELDS,
        "artifact_name",
        "only_data_artifacts",
        "only_model_artifacts",
        "only_deployment_artifacts",
        "has_custom_name",
        "user",
    ]
    CLI_EXCLUDE_FIELDS = [
        *BaseFilter.CLI_EXCLUDE_FIELDS,
        "only_data_artifacts",
        "only_model_artifacts",
        "only_deployment_artifacts",
        "has_custom_name",
        "model_version_id",
        "updated",
        "id",
    ]

    model_version_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Filter by model version ID",
        union_mode="left_to_right",
    )
    artifact_version_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Filter by artifact ID",
        union_mode="left_to_right",
    )
    artifact_name: Optional[str] = Field(
        default=None,
        description="Name of the artifact",
    )
    only_data_artifacts: Optional[bool] = False
    only_model_artifacts: Optional[bool] = False
    only_deployment_artifacts: Optional[bool] = False
    has_custom_name: Optional[bool] = None
    user: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of the user that created the artifact.",
    )

    # TODO: In Pydantic v2, the `model_` is a protected namespaces for all
    #  fields defined under base models. If not handled, this raises a warning.
    #  It is possible to suppress this warning message with the following
    #  configuration, however the ultimate solution is to rename these fields.
    #  Even though they do not cause any problems right now, if we are not
    #  careful we might overwrite some fields protected by pydantic.
    model_config = ConfigDict(protected_namespaces=())

    def get_custom_filters(
        self, table: Type["AnySchema"]
    ) -> List[Union["ColumnElement[bool]"]]:
        """Get custom filters.

        Args:
            table: The query table.

        Returns:
            A list of custom filters.
        """
        custom_filters = super().get_custom_filters(table)

        from sqlmodel import and_, col

        from zenml.zen_stores.schemas import (
            ArtifactSchema,
            ArtifactVersionSchema,
            ModelVersionArtifactSchema,
            UserSchema,
        )

        if self.artifact_name:
            value, filter_operator = self._resolve_operator(self.artifact_name)
            filter_ = StrFilter(
                operation=GenericFilterOps(filter_operator),
                column="name",
                value=value,
            )
            artifact_name_filter = and_(
                ModelVersionArtifactSchema.artifact_version_id
                == ArtifactVersionSchema.id,
                ArtifactVersionSchema.artifact_id == ArtifactSchema.id,
                filter_.generate_query_conditions(ArtifactSchema),
            )
            custom_filters.append(artifact_name_filter)

        if self.only_data_artifacts:
            data_artifact_filter = and_(
                ModelVersionArtifactSchema.artifact_version_id
                == ArtifactVersionSchema.id,
                col(ArtifactVersionSchema.type).not_in(
                    ["ServiceArtifact", "ModelArtifact"]
                ),
            )
            custom_filters.append(data_artifact_filter)

        if self.only_model_artifacts:
            model_artifact_filter = and_(
                ModelVersionArtifactSchema.artifact_version_id
                == ArtifactVersionSchema.id,
                ArtifactVersionSchema.type == "ModelArtifact",
            )
            custom_filters.append(model_artifact_filter)

        if self.only_deployment_artifacts:
            deployment_artifact_filter = and_(
                ModelVersionArtifactSchema.artifact_version_id
                == ArtifactVersionSchema.id,
                ArtifactVersionSchema.type == "ServiceArtifact",
            )
            custom_filters.append(deployment_artifact_filter)

        if self.has_custom_name is not None:
            custom_name_filter = and_(
                ModelVersionArtifactSchema.artifact_version_id
                == ArtifactVersionSchema.id,
                ArtifactVersionSchema.artifact_id == ArtifactSchema.id,
                ArtifactSchema.has_custom_name == self.has_custom_name,
            )
            custom_filters.append(custom_name_filter)

        if self.user:
            user_filter = and_(
                ModelVersionArtifactSchema.artifact_version_id
                == ArtifactVersionSchema.id,
                ArtifactVersionSchema.user_id == UserSchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.user,
                    table=UserSchema,
                    additional_columns=["full_name"],
                ),
            )
            custom_filters.append(user_filter)

        return custom_filters

get_custom_filters(table)

Get custom filters.

Parameters:

Name Type Description Default
table Type[AnySchema]

The query table.

required

Returns:

Type Description
List[Union[ColumnElement[bool]]]

A list of custom filters.

Source code in src/zenml/models/v2/core/model_version_artifact.py
def get_custom_filters(
    self, table: Type["AnySchema"]
) -> List[Union["ColumnElement[bool]"]]:
    """Get custom filters.

    Args:
        table: The query table.

    Returns:
        A list of custom filters.
    """
    custom_filters = super().get_custom_filters(table)

    from sqlmodel import and_, col

    from zenml.zen_stores.schemas import (
        ArtifactSchema,
        ArtifactVersionSchema,
        ModelVersionArtifactSchema,
        UserSchema,
    )

    if self.artifact_name:
        value, filter_operator = self._resolve_operator(self.artifact_name)
        filter_ = StrFilter(
            operation=GenericFilterOps(filter_operator),
            column="name",
            value=value,
        )
        artifact_name_filter = and_(
            ModelVersionArtifactSchema.artifact_version_id
            == ArtifactVersionSchema.id,
            ArtifactVersionSchema.artifact_id == ArtifactSchema.id,
            filter_.generate_query_conditions(ArtifactSchema),
        )
        custom_filters.append(artifact_name_filter)

    if self.only_data_artifacts:
        data_artifact_filter = and_(
            ModelVersionArtifactSchema.artifact_version_id
            == ArtifactVersionSchema.id,
            col(ArtifactVersionSchema.type).not_in(
                ["ServiceArtifact", "ModelArtifact"]
            ),
        )
        custom_filters.append(data_artifact_filter)

    if self.only_model_artifacts:
        model_artifact_filter = and_(
            ModelVersionArtifactSchema.artifact_version_id
            == ArtifactVersionSchema.id,
            ArtifactVersionSchema.type == "ModelArtifact",
        )
        custom_filters.append(model_artifact_filter)

    if self.only_deployment_artifacts:
        deployment_artifact_filter = and_(
            ModelVersionArtifactSchema.artifact_version_id
            == ArtifactVersionSchema.id,
            ArtifactVersionSchema.type == "ServiceArtifact",
        )
        custom_filters.append(deployment_artifact_filter)

    if self.has_custom_name is not None:
        custom_name_filter = and_(
            ModelVersionArtifactSchema.artifact_version_id
            == ArtifactVersionSchema.id,
            ArtifactVersionSchema.artifact_id == ArtifactSchema.id,
            ArtifactSchema.has_custom_name == self.has_custom_name,
        )
        custom_filters.append(custom_name_filter)

    if self.user:
        user_filter = and_(
            ModelVersionArtifactSchema.artifact_version_id
            == ArtifactVersionSchema.id,
            ArtifactVersionSchema.user_id == UserSchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.user,
                table=UserSchema,
                additional_columns=["full_name"],
            ),
        )
        custom_filters.append(user_filter)

    return custom_filters
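
For orientation, here is a hedged sketch of how these filter flags compose. It assumes the class documented above is exported as ModelVersionArtifactFilter from zenml.models and that operator prefixes such as contains: are accepted for string fields; treat both as assumptions rather than confirmed API.

# Hedged sketch: build the artifact-link filter directly.
# Assumption: the class above is `ModelVersionArtifactFilter` in `zenml.models`.
from zenml.models import ModelVersionArtifactFilter

link_filter = ModelVersionArtifactFilter(
    only_data_artifacts=True,           # exclude model and service artifacts
    artifact_name="contains:features",  # assumed operator-prefixed string filter
)
print(link_filter.only_data_artifacts, link_filter.artifact_name)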

ModelVersionFilter

Bases: WorkspaceScopedFilter, TaggableFilter

Filter model for model versions.

Source code in src/zenml/models/v2/core/model_version.py
class ModelVersionFilter(WorkspaceScopedFilter, TaggableFilter):
    """Filter model for model versions."""

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *WorkspaceScopedFilter.FILTER_EXCLUDE_FIELDS,
        *TaggableFilter.FILTER_EXCLUDE_FIELDS,
        "run_metadata",
    ]
    CUSTOM_SORTING_OPTIONS = [
        *WorkspaceScopedFilter.CUSTOM_SORTING_OPTIONS,
        *TaggableFilter.CUSTOM_SORTING_OPTIONS,
    ]
    CLI_EXCLUDE_FIELDS = [
        *WorkspaceScopedFilter.CLI_EXCLUDE_FIELDS,
        *TaggableFilter.CLI_EXCLUDE_FIELDS,
    ]

    name: Optional[str] = Field(
        default=None,
        description="The name of the Model Version",
    )
    number: Optional[int] = Field(
        default=None,
        description="The number of the Model Version",
    )
    stage: Optional[Union[str, ModelStages]] = Field(
        description="The model version stage",
        default=None,
        union_mode="left_to_right",
    )
    run_metadata: Optional[Dict[str, str]] = Field(
        default=None,
        description="The run_metadata to filter the model versions by.",
    )

    _model_id: UUID = PrivateAttr(None)

    def set_scope_model(self, model_name_or_id: Union[str, UUID]) -> None:
        """Set the model to scope this response.

        Args:
            model_name_or_id: The model to scope this response to.
        """
        try:
            model_id = UUID(str(model_name_or_id))
        except ValueError:
            from zenml.client import Client

            model_id = Client().get_model(model_name_or_id).id

        self._model_id = model_id

    def get_custom_filters(
        self, table: Type["AnySchema"]
    ) -> List["ColumnElement[bool]"]:
        """Get custom filters.

        Args:
            table: The query table.

        Returns:
            A list of custom filters.
        """
        custom_filters = super().get_custom_filters(table)

        from sqlmodel import and_

        from zenml.zen_stores.schemas import (
            ModelVersionSchema,
            RunMetadataResourceSchema,
            RunMetadataSchema,
        )

        if self.run_metadata is not None:
            from zenml.enums import MetadataResourceTypes

            for key, value in self.run_metadata.items():
                additional_filter = and_(
                    RunMetadataResourceSchema.resource_id
                    == ModelVersionSchema.id,
                    RunMetadataResourceSchema.resource_type
                    == MetadataResourceTypes.MODEL_VERSION,
                    RunMetadataResourceSchema.run_metadata_id
                    == RunMetadataSchema.id,
                    self.generate_custom_query_conditions_for_column(
                        value=value,
                        table=RunMetadataSchema,
                        column="value",
                    ),
                )
                custom_filters.append(additional_filter)

        return custom_filters

    def apply_filter(
        self,
        query: AnyQuery,
        table: Type["AnySchema"],
    ) -> AnyQuery:
        """Applies the filter to a query.

        Args:
            query: The query to which to apply the filter.
            table: The query table.

        Returns:
            The query with filter applied.
        """
        query = super().apply_filter(query=query, table=table)

        if self._model_id:
            query = query.where(getattr(table, "model_id") == self._model_id)

        return query

apply_filter(query, table)

Applies the filter to a query.

Parameters:

    query (AnyQuery, required): The query to which to apply the filter.
    table (Type[AnySchema], required): The query table.

Returns:

    AnyQuery: The query with filter applied.

Source code in src/zenml/models/v2/core/model_version.py
def apply_filter(
    self,
    query: AnyQuery,
    table: Type["AnySchema"],
) -> AnyQuery:
    """Applies the filter to a query.

    Args:
        query: The query to which to apply the filter.
        table: The query table.

    Returns:
        The query with filter applied.
    """
    query = super().apply_filter(query=query, table=table)

    if self._model_id:
        query = query.where(getattr(table, "model_id") == self._model_id)

    return query

get_custom_filters(table)

Get custom filters.

Parameters:

    table (Type[AnySchema], required): The query table.

Returns:

    List[ColumnElement[bool]]: A list of custom filters.

Source code in src/zenml/models/v2/core/model_version.py
def get_custom_filters(
    self, table: Type["AnySchema"]
) -> List["ColumnElement[bool]"]:
    """Get custom filters.

    Args:
        table: The query table.

    Returns:
        A list of custom filters.
    """
    custom_filters = super().get_custom_filters(table)

    from sqlmodel import and_

    from zenml.zen_stores.schemas import (
        ModelVersionSchema,
        RunMetadataResourceSchema,
        RunMetadataSchema,
    )

    if self.run_metadata is not None:
        from zenml.enums import MetadataResourceTypes

        for key, value in self.run_metadata.items():
            additional_filter = and_(
                RunMetadataResourceSchema.resource_id
                == ModelVersionSchema.id,
                RunMetadataResourceSchema.resource_type
                == MetadataResourceTypes.MODEL_VERSION,
                RunMetadataResourceSchema.run_metadata_id
                == RunMetadataSchema.id,
                self.generate_custom_query_conditions_for_column(
                    value=value,
                    table=RunMetadataSchema,
                    column="value",
                ),
            )
            custom_filters.append(additional_filter)

    return custom_filters

set_scope_model(model_name_or_id)

Set the model to scope this response.

Parameters:

    model_name_or_id (Union[str, UUID], required): The model to scope this
    response to.
Source code in src/zenml/models/v2/core/model_version.py
def set_scope_model(self, model_name_or_id: Union[str, UUID]) -> None:
    """Set the model to scope this response.

    Args:
        model_name_or_id: The model to scope this response to.
    """
    try:
        model_id = UUID(str(model_name_or_id))
    except ValueError:
        from zenml.client import Client

        model_id = Client().get_model(model_name_or_id).id

    self._model_id = model_id
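
As a hedged usage sketch, the filter fields above are typically exposed as keyword arguments on the client's listing method. The method name list_model_versions and its parameters are assumptions based on the public Client API, not confirmed by this page.

# Hedged sketch: list model versions by stage through the client.
# Assumptions: a ZenML deployment is configured, a model named "my_model"
# exists, and `Client.list_model_versions` accepts these keyword arguments.
from zenml.client import Client

versions = Client().list_model_versions(
    model_name_or_id="my_model",
    stage="production",
)
for mv in versions.items:
    print(mv.name, mv.stage)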

ModelVersionPipelineRunFilter

Bases: BaseFilter

Model version pipeline run links filter model.

Source code in src/zenml/models/v2/core/model_version_pipeline_run.py
class ModelVersionPipelineRunFilter(BaseFilter):
    """Model version pipeline run links filter model."""

    FILTER_EXCLUDE_FIELDS = [
        *BaseFilter.FILTER_EXCLUDE_FIELDS,
        "pipeline_run_name",
        "user",
    ]
    CLI_EXCLUDE_FIELDS = [
        *BaseFilter.CLI_EXCLUDE_FIELDS,
        "model_version_id",
        "updated",
        "id",
    ]

    model_version_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Filter by model version ID",
        union_mode="left_to_right",
    )
    pipeline_run_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Filter by pipeline run ID",
        union_mode="left_to_right",
    )
    pipeline_run_name: Optional[str] = Field(
        default=None,
        description="Name of the pipeline run",
    )
    user: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of the user that created the pipeline run.",
    )

    # TODO: In Pydantic v2, the `model_` prefix is a protected namespace for
    #  all fields defined under base models. If not handled, this raises a
    #  warning.
    #  It is possible to suppress this warning message with the following
    #  configuration, however the ultimate solution is to rename these fields.
    #  Even though they do not cause any problems right now, if we are not
    #  careful we might overwrite some fields protected by pydantic.
    model_config = ConfigDict(protected_namespaces=())

    def get_custom_filters(
        self, table: Type["AnySchema"]
    ) -> List["ColumnElement[bool]"]:
        """Get custom filters.

        Args:
            table: The query table.

        Returns:
            A list of custom filters.
        """
        custom_filters = super().get_custom_filters(table)

        from sqlmodel import and_

        from zenml.zen_stores.schemas import (
            ModelVersionPipelineRunSchema,
            PipelineRunSchema,
            UserSchema,
        )

        if self.pipeline_run_name:
            value, filter_operator = self._resolve_operator(
                self.pipeline_run_name
            )
            filter_ = StrFilter(
                operation=GenericFilterOps(filter_operator),
                column="name",
                value=value,
            )
            pipeline_run_name_filter = and_(
                ModelVersionPipelineRunSchema.pipeline_run_id
                == PipelineRunSchema.id,
                filter_.generate_query_conditions(PipelineRunSchema),
            )
            custom_filters.append(pipeline_run_name_filter)

        if self.user:
            user_filter = and_(
                ModelVersionPipelineRunSchema.pipeline_run_id
                == PipelineRunSchema.id,
                PipelineRunSchema.user_id == UserSchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.user,
                    table=UserSchema,
                    additional_columns=["full_name"],
                ),
            )
            custom_filters.append(user_filter)

        return custom_filters

get_custom_filters(table)

Get custom filters.

Parameters:

    table (Type[AnySchema], required): The query table.

Returns:

    List[ColumnElement[bool]]: A list of custom filters.

Source code in src/zenml/models/v2/core/model_version_pipeline_run.py
def get_custom_filters(
    self, table: Type["AnySchema"]
) -> List["ColumnElement[bool]"]:
    """Get custom filters.

    Args:
        table: The query table.

    Returns:
        A list of custom filters.
    """
    custom_filters = super().get_custom_filters(table)

    from sqlmodel import and_

    from zenml.zen_stores.schemas import (
        ModelVersionPipelineRunSchema,
        PipelineRunSchema,
        UserSchema,
    )

    if self.pipeline_run_name:
        value, filter_operator = self._resolve_operator(
            self.pipeline_run_name
        )
        filter_ = StrFilter(
            operation=GenericFilterOps(filter_operator),
            column="name",
            value=value,
        )
        pipeline_run_name_filter = and_(
            ModelVersionPipelineRunSchema.pipeline_run_id
            == PipelineRunSchema.id,
            filter_.generate_query_conditions(PipelineRunSchema),
        )
        custom_filters.append(pipeline_run_name_filter)

    if self.user:
        user_filter = and_(
            ModelVersionPipelineRunSchema.pipeline_run_id
            == PipelineRunSchema.id,
            PipelineRunSchema.user_id == UserSchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.user,
                table=UserSchema,
                additional_columns=["full_name"],
            ),
        )
        custom_filters.append(user_filter)

    return custom_filters
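
A hedged sketch of constructing this filter directly, assuming it is exported as ModelVersionPipelineRunFilter from zenml.models and that startswith: is an accepted operator prefix; the user name is hypothetical.

# Hedged sketch: filter pipeline-run links by run-name prefix and user.
# Assumption: `ModelVersionPipelineRunFilter` is importable from `zenml.models`.
from zenml.models import ModelVersionPipelineRunFilter

run_link_filter = ModelVersionPipelineRunFilter(
    pipeline_run_name="startswith:training_",  # assumed operator prefix
    user="alice",                               # hypothetical user name
)
print(run_link_filter.pipeline_run_name, run_link_filter.user)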

ModelVersionResponse

Bases: WorkspaceScopedResponse[ModelVersionResponseBody, ModelVersionResponseMetadata, ModelVersionResponseResources]

Response model for model versions.

Source code in src/zenml/models/v2/core/model_version.py
class ModelVersionResponse(
    WorkspaceScopedResponse[
        ModelVersionResponseBody,
        ModelVersionResponseMetadata,
        ModelVersionResponseResources,
    ]
):
    """Response model for model versions."""

    name: Optional[str] = Field(
        description="The name of the model version",
        max_length=STR_FIELD_MAX_LENGTH,
        default=None,
    )

    @property
    def stage(self) -> Optional[str]:
        """The `stage` property.

        Returns:
            the value of the property.
        """
        return self.get_body().stage

    @property
    def number(self) -> int:
        """The `number` property.

        Returns:
            the value of the property.
        """
        return self.get_body().number

    @property
    def model(self) -> "ModelResponse":
        """The `model` property.

        Returns:
            the value of the property.
        """
        return self.get_body().model

    @property
    def model_artifact_ids(self) -> Dict[str, Dict[str, UUID]]:
        """The `model_artifact_ids` property.

        Returns:
            the value of the property.
        """
        return self.get_body().model_artifact_ids

    @property
    def data_artifact_ids(self) -> Dict[str, Dict[str, UUID]]:
        """The `data_artifact_ids` property.

        Returns:
            the value of the property.
        """
        return self.get_body().data_artifact_ids

    @property
    def deployment_artifact_ids(self) -> Dict[str, Dict[str, UUID]]:
        """The `deployment_artifact_ids` property.

        Returns:
            the value of the property.
        """
        return self.get_body().deployment_artifact_ids

    @property
    def pipeline_run_ids(self) -> Dict[str, UUID]:
        """The `pipeline_run_ids` property.

        Returns:
            the value of the property.
        """
        return self.get_body().pipeline_run_ids

    @property
    def tags(self) -> List[TagResponse]:
        """The `tags` property.

        Returns:
            the value of the property.
        """
        return self.get_body().tags

    @property
    def description(self) -> Optional[str]:
        """The `description` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().description

    @property
    def run_metadata(self) -> Dict[str, MetadataType]:
        """The `run_metadata` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().run_metadata

    def get_hydrated_version(self) -> "ModelVersionResponse":
        """Get the hydrated version of this model version.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_model_version(self.id)

    # Helper functions
    def to_model_class(
        self,
        suppress_class_validation_warnings: bool = True,
    ) -> "Model":
        """Convert response model to Model object.

        Args:
            suppress_class_validation_warnings: internally used to suppress
                repeated warnings.

        Returns:
            Model object
        """
        from zenml.model.model import Model

        mv = Model(
            name=self.model.name,
            license=self.model.license,
            description=self.description,
            audience=self.model.audience,
            use_cases=self.model.use_cases,
            limitations=self.model.limitations,
            trade_offs=self.model.trade_offs,
            ethics=self.model.ethics,
            tags=[t.name for t in self.tags],
            version=self.name,
            suppress_class_validation_warnings=suppress_class_validation_warnings,
            model_version_id=self.id,
        )

        return mv

    @property
    def model_artifacts(
        self,
    ) -> Dict[str, Dict[str, "ArtifactVersionResponse"]]:
        """Get all model artifacts linked to this model version.

        Returns:
            Dictionary of model artifacts with versions as
            Dict[str, Dict[str, ArtifactResponse]]
        """
        from zenml.client import Client

        return {
            name: {
                version: Client().get_artifact_version(a)
                for version, a in self.model_artifact_ids[name].items()
            }
            for name in self.model_artifact_ids
        }

    @property
    def data_artifacts(
        self,
    ) -> Dict[str, Dict[str, "ArtifactVersionResponse"]]:
        """Get all data artifacts linked to this model version.

        Returns:
            Dictionary of data artifacts with versions as
            Dict[str, Dict[str, ArtifactResponse]]
        """
        from zenml.client import Client

        return {
            name: {
                version: Client().get_artifact_version(a)
                for version, a in self.data_artifact_ids[name].items()
            }
            for name in self.data_artifact_ids
        }

    @property
    def deployment_artifacts(
        self,
    ) -> Dict[str, Dict[str, "ArtifactVersionResponse"]]:
        """Get all deployment artifacts linked to this model version.

        Returns:
            Dictionary of deployment artifacts with versions as
            Dict[str, Dict[str, ArtifactResponse]]
        """
        from zenml.client import Client

        return {
            name: {
                version: Client().get_artifact_version(a)
                for version, a in self.deployment_artifact_ids[name].items()
            }
            for name in self.deployment_artifact_ids
        }

    @property
    def pipeline_runs(self) -> Dict[str, "PipelineRunResponse"]:
        """Get all pipeline runs linked to this version.

        Returns:
            Dictionary of Pipeline Runs as PipelineRunResponseModel
        """
        from zenml.client import Client

        return {
            name: Client().get_pipeline_run(pr)
            for name, pr in self.pipeline_run_ids.items()
        }

    def _get_linked_object(
        self,
        name: str,
        version: Optional[str] = None,
        type: Optional[ArtifactType] = None,
    ) -> Optional["ArtifactVersionResponse"]:
        """Get the artifact linked to this model version given type.

        Args:
            name: The name of the artifact to retrieve.
            version: The version of the artifact to retrieve (None for
                latest/non-versioned)
            type: The type of the artifact to filter by.

        Returns:
            Specific version of an artifact from collection or None
        """
        from zenml.client import Client

        artifact_versions = Client().list_artifact_versions(
            sort_by="desc:created",
            size=1,
            name=name,
            version=version,
            model_version_id=self.id,
            type=type,
            hydrate=True,
        )

        if not artifact_versions.items:
            return None
        return artifact_versions.items[0]

    def get_artifact(
        self,
        name: str,
        version: Optional[str] = None,
    ) -> Optional["ArtifactVersionResponse"]:
        """Get the artifact linked to this model version.

        Args:
            name: The name of the artifact to retrieve.
            version: The version of the artifact to retrieve (None for
                latest/non-versioned)

        Returns:
            Specific version of an artifact or None
        """
        return self._get_linked_object(name, version)

    def get_model_artifact(
        self,
        name: str,
        version: Optional[str] = None,
    ) -> Optional["ArtifactVersionResponse"]:
        """Get the model artifact linked to this model version.

        Args:
            name: The name of the model artifact to retrieve.
            version: The version of the model artifact to retrieve (None for
                latest/non-versioned)

        Returns:
            Specific version of the model artifact or None
        """
        return self._get_linked_object(name, version, ArtifactType.MODEL)

    def get_data_artifact(
        self,
        name: str,
        version: Optional[str] = None,
    ) -> Optional["ArtifactVersionResponse"]:
        """Get the data artifact linked to this model version.

        Args:
            name: The name of the data artifact to retrieve.
            version: The version of the data artifact to retrieve (None for
                latest/non-versioned)

        Returns:
            Specific version of the data artifact or None
        """
        return self._get_linked_object(name, version, ArtifactType.DATA)

    def get_deployment_artifact(
        self,
        name: str,
        version: Optional[str] = None,
    ) -> Optional["ArtifactVersionResponse"]:
        """Get the deployment artifact linked to this model version.

        Args:
            name: The name of the deployment artifact to retrieve.
            version: The version of the deployment artifact to retrieve (None for
                latest/non-versioned)

        Returns:
            Specific version of the deployment artifact or None
        """
        return self._get_linked_object(name, version, ArtifactType.SERVICE)

    def get_pipeline_run(self, name: str) -> "PipelineRunResponse":
        """Get pipeline run linked to this version.

        Args:
            name: The name of the pipeline run to retrieve.

        Returns:
            PipelineRun as PipelineRunResponseModel
        """
        from zenml.client import Client

        return Client().get_pipeline_run(self.pipeline_run_ids[name])

    def set_stage(
        self, stage: Union[str, ModelStages], force: bool = False
    ) -> None:
        """Sets this Model Version to a desired stage.

        Args:
            stage: the target stage for model version.
            force: whether to force archiving of current model version in
                target stage or raise.

        Raises:
            ValueError: if model_stage is not valid.
        """
        from zenml.client import Client

        stage = getattr(stage, "value", stage)
        if stage not in [stage.value for stage in ModelStages]:
            raise ValueError(f"`{stage}` is not a valid model stage.")

        Client().update_model_version(
            model_name_or_id=self.model.id,
            version_name_or_id=self.id,
            stage=stage,
            force=force,
        )
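
A hedged sketch of the helper accessors defined above, assuming a configured ZenML client, a model named my_model with a version "1", and a linked artifact named trained_model (all hypothetical names).

# Hedged sketch: use the response helpers on a fetched model version.
from zenml.client import Client

mv = Client().get_model_version("my_model", "1")  # hypothetical identifiers

print(mv.number, mv.stage)

artifact = mv.get_artifact("trained_model")  # hypothetical artifact name
if artifact is not None:
    obj = artifact.load()  # materialize the linked artifact version

for run_name, run in mv.pipeline_runs.items():
    print(run_name, run.status)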

data_artifact_ids property

The data_artifact_ids property.

Returns:

    Dict[str, Dict[str, UUID]]: the value of the property.

data_artifacts property

Get all data artifacts linked to this model version.

Returns:

    Dict[str, Dict[str, ArtifactVersionResponse]]: Dictionary of data
    artifacts, keyed by artifact name and version.

deployment_artifact_ids property

The deployment_artifact_ids property.

Returns:

    Dict[str, Dict[str, UUID]]: the value of the property.

deployment_artifacts property

Get all deployment artifacts linked to this model version.

Returns:

    Dict[str, Dict[str, ArtifactVersionResponse]]: Dictionary of deployment
    artifacts, keyed by artifact name and version.

description property

The description property.

Returns:

    Optional[str]: the value of the property.

model property

The model property.

Returns:

    ModelResponse: the value of the property.

model_artifact_ids property

The model_artifact_ids property.

Returns:

    Dict[str, Dict[str, UUID]]: the value of the property.

model_artifacts property

Get all model artifacts linked to this model version.

Returns:

    Dict[str, Dict[str, ArtifactVersionResponse]]: Dictionary of model
    artifacts, keyed by artifact name and version.

number property

The number property.

Returns:

    int: the value of the property.

pipeline_run_ids property

The pipeline_run_ids property.

Returns:

    Dict[str, UUID]: the value of the property.

pipeline_runs property

Get all pipeline runs linked to this version.

Returns:

    Dict[str, PipelineRunResponse]: Dictionary of pipeline runs, keyed by
    run name.

run_metadata property

The run_metadata property.

Returns:

    Dict[str, MetadataType]: the value of the property.

stage property

The stage property.

Returns:

    Optional[str]: the value of the property.

tags property

The tags property.

Returns:

    List[TagResponse]: the value of the property.

get_artifact(name, version=None)

Get the artifact linked to this model version.

Parameters:

    name (str, required): The name of the artifact to retrieve.
    version (Optional[str], default None): The version of the artifact to
    retrieve (None for latest/non-versioned).

Returns:

    Optional[ArtifactVersionResponse]: Specific version of an artifact or None.

Source code in src/zenml/models/v2/core/model_version.py
def get_artifact(
    self,
    name: str,
    version: Optional[str] = None,
) -> Optional["ArtifactVersionResponse"]:
    """Get the artifact linked to this model version.

    Args:
        name: The name of the artifact to retrieve.
        version: The version of the artifact to retrieve (None for
            latest/non-versioned)

    Returns:
        Specific version of an artifact or None
    """
    return self._get_linked_object(name, version)

get_data_artifact(name, version=None)

Get the data artifact linked to this model version.

Parameters:

    name (str, required): The name of the data artifact to retrieve.
    version (Optional[str], default None): The version of the data artifact to
    retrieve (None for latest/non-versioned).

Returns:

    Optional[ArtifactVersionResponse]: Specific version of the data artifact
    or None.

Source code in src/zenml/models/v2/core/model_version.py
def get_data_artifact(
    self,
    name: str,
    version: Optional[str] = None,
) -> Optional["ArtifactVersionResponse"]:
    """Get the data artifact linked to this model version.

    Args:
        name: The name of the data artifact to retrieve.
        version: The version of the data artifact to retrieve (None for
            latest/non-versioned)

    Returns:
        Specific version of the data artifact or None
    """
    return self._get_linked_object(name, version, ArtifactType.DATA)

get_deployment_artifact(name, version=None)

Get the deployment artifact linked to this model version.

Parameters:

    name (str, required): The name of the deployment artifact to retrieve.
    version (Optional[str], default None): The version of the deployment
    artifact to retrieve (None for latest/non-versioned).

Returns:

    Optional[ArtifactVersionResponse]: Specific version of the deployment
    artifact or None.

Source code in src/zenml/models/v2/core/model_version.py
def get_deployment_artifact(
    self,
    name: str,
    version: Optional[str] = None,
) -> Optional["ArtifactVersionResponse"]:
    """Get the deployment artifact linked to this model version.

    Args:
        name: The name of the deployment artifact to retrieve.
        version: The version of the deployment artifact to retrieve (None for
            latest/non-versioned)

    Returns:
        Specific version of the deployment artifact or None
    """
    return self._get_linked_object(name, version, ArtifactType.SERVICE)

get_hydrated_version()

Get the hydrated version of this model version.

Returns:

    ModelVersionResponse: an instance of the same entity with the metadata
    field attached.

Source code in src/zenml/models/v2/core/model_version.py
def get_hydrated_version(self) -> "ModelVersionResponse":
    """Get the hydrated version of this model version.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_model_version(self.id)

get_model_artifact(name, version=None)

Get the model artifact linked to this model version.

Parameters:

    name (str, required): The name of the model artifact to retrieve.
    version (Optional[str], default None): The version of the model artifact to
    retrieve (None for latest/non-versioned).

Returns:

    Optional[ArtifactVersionResponse]: Specific version of the model artifact
    or None.

Source code in src/zenml/models/v2/core/model_version.py
def get_model_artifact(
    self,
    name: str,
    version: Optional[str] = None,
) -> Optional["ArtifactVersionResponse"]:
    """Get the model artifact linked to this model version.

    Args:
        name: The name of the model artifact to retrieve.
        version: The version of the model artifact to retrieve (None for
            latest/non-versioned)

    Returns:
        Specific version of the model artifact or None
    """
    return self._get_linked_object(name, version, ArtifactType.MODEL)

get_pipeline_run(name)

Get pipeline run linked to this version.

Parameters:

    name (str, required): The name of the pipeline run to retrieve.

Returns:

    PipelineRunResponse: PipelineRun as PipelineRunResponseModel.

Source code in src/zenml/models/v2/core/model_version.py
def get_pipeline_run(self, name: str) -> "PipelineRunResponse":
    """Get pipeline run linked to this version.

    Args:
        name: The name of the pipeline run to retrieve.

    Returns:
        PipelineRun as PipelineRunResponseModel
    """
    from zenml.client import Client

    return Client().get_pipeline_run(self.pipeline_run_ids[name])

set_stage(stage, force=False)

Sets this Model Version to a desired stage.

Parameters:

    stage (Union[str, ModelStages], required): the target stage for the model
    version.
    force (bool, default False): whether to force archiving of the current
    model version in the target stage or raise.

Raises:

    ValueError: if the given stage is not a valid model stage.

Source code in src/zenml/models/v2/core/model_version.py
def set_stage(
    self, stage: Union[str, ModelStages], force: bool = False
) -> None:
    """Sets this Model Version to a desired stage.

    Args:
        stage: the target stage for model version.
        force: whether to force archiving of current model version in
            target stage or raise.

    Raises:
        ValueError: if model_stage is not valid.
    """
    from zenml.client import Client

    stage = getattr(stage, "value", stage)
    if stage not in [stage.value for stage in ModelStages]:
        raise ValueError(f"`{stage}` is not a valid model stage.")

    Client().update_model_version(
        model_name_or_id=self.model.id,
        version_name_or_id=self.id,
        stage=stage,
        force=force,
    )
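
A hedged sketch of promoting a version with set_stage, assuming the model version below exists; force=True archives whatever currently occupies the target stage instead of raising.

# Hedged sketch: promote a model version to production.
from zenml.client import Client
from zenml.enums import ModelStages

mv = Client().get_model_version("my_model", "1")  # hypothetical identifiers
mv.set_stage(ModelStages.PRODUCTION, force=True)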

to_model_class(suppress_class_validation_warnings=True)

Convert response model to Model object.

Parameters:

    suppress_class_validation_warnings (bool, default True): internally used
    to suppress repeated warnings.

Returns:

    Model: Model object.

Source code in src/zenml/models/v2/core/model_version.py
def to_model_class(
    self,
    suppress_class_validation_warnings: bool = True,
) -> "Model":
    """Convert response model to Model object.

    Args:
        suppress_class_validation_warnings: internally used to suppress
            repeated warnings.

    Returns:
        Model object
    """
    from zenml.model.model import Model

    mv = Model(
        name=self.model.name,
        license=self.model.license,
        description=self.description,
        audience=self.model.audience,
        use_cases=self.model.use_cases,
        limitations=self.model.limitations,
        trade_offs=self.model.trade_offs,
        ethics=self.model.ethics,
        tags=[t.name for t in self.tags],
        version=self.name,
        suppress_class_validation_warnings=suppress_class_validation_warnings,
        model_version_id=self.id,
    )

    return mv
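
A hedged sketch of to_model_class, which turns the response into the Model configuration object used inside pipelines; the identifiers below are hypothetical.

# Hedged sketch: convert a response into a `Model` object.
from zenml.client import Client

mv_response = Client().get_model_version("my_model", "1")  # hypothetical
model = mv_response.to_model_class()
print(model.name, model.version, model.model_version_id)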

ModelVersionStage

Bases: Enum

Enum of the possible stages of a registered model.

Source code in src/zenml/model_registries/base_model_registry.py
class ModelVersionStage(Enum):
    """Enum of the possible stages of a registered model."""

    NONE = "None"
    STAGING = "Staging"
    PRODUCTION = "Production"
    ARCHIVED = "Archived"
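
A small sketch of the enum above; note that this registry-level ModelVersionStage is distinct from zenml.enums.ModelStages used by the Model Control Plane.

# Sketch: the registry stage enum and its string values.
from zenml.model_registries.base_model_registry import ModelVersionStage

stage = ModelVersionStage("Production")
assert stage is ModelVersionStage.PRODUCTION
print([s.value for s in ModelVersionStage])  # ['None', 'Staging', 'Production', 'Archived']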

OAuthDeviceFilter

Bases: UserScopedFilter

Model to enable advanced filtering of OAuth2 devices.

Source code in src/zenml/models/v2/core/device.py
class OAuthDeviceFilter(UserScopedFilter):
    """Model to enable advanced filtering of OAuth2 devices."""

    expires: Optional[Union[datetime, str, None]] = Field(
        default=None,
        description="The expiration date of the OAuth2 device.",
        union_mode="left_to_right",
    )
    client_id: Union[UUID, str, None] = Field(
        default=None,
        description="The client ID of the OAuth2 device.",
        union_mode="left_to_right",
    )
    status: Union[OAuthDeviceStatus, str, None] = Field(
        default=None,
        description="The status of the OAuth2 device.",
        union_mode="left_to_right",
    )
    trusted_device: Union[bool, str, None] = Field(
        default=None,
        description="Whether the OAuth2 device was marked as trusted.",
        union_mode="left_to_right",
    )
    failed_auth_attempts: Union[int, str, None] = Field(
        default=None,
        description="The number of failed authentication attempts.",
        union_mode="left_to_right",
    )
    last_login: Optional[Union[datetime, str, None]] = Field(
        default=None,
        description="The date of the last successful login.",
        union_mode="left_to_right",
    )
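
A hedged sketch of constructing this filter directly; the import location from zenml.models is an assumption.

# Hedged sketch: devices that are trusted and have never failed auth.
from zenml.models import OAuthDeviceFilter  # assumed import location

device_filter = OAuthDeviceFilter(
    trusted_device=True,
    failed_auth_attempts=0,
)
print(device_filter.trusted_device, device_filter.failed_auth_attempts)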

OldSchoolMarkdownHeading

Bases: Heading

A traditional markdown heading.

Source code in src/zenml/cli/text_utils.py
class OldSchoolMarkdownHeading(Heading):
    """A traditional markdown heading."""

    def __rich_console__(
        self, console: Console, options: ConsoleOptions
    ) -> RenderResult:
        """Render the heading.

        Args:
            console: The console rendering the content.
            options: The console options.

        Yields:
            RenderResult: The rendered content.
        """
        text = self.text
        text.justify = "left"
        if self.tag == "h1":
            # Underline and bold h1s
            yield Text("", style="bold")
            yield text
            yield "=" * len(text)
        else:
            if self.tag == "h2":
                # Just underline h2s
                yield Text("", style="underline")
            else:
                # Just bold everything else
                yield Text("", style="bold")
            yield text

__rich_console__(console, options)

Render the heading.

Parameters:

    console (Console, required): The console rendering the content.
    options (ConsoleOptions, required): The console options.

Yields:

    RenderResult: The rendered content.

Source code in src/zenml/cli/text_utils.py
def __rich_console__(
    self, console: Console, options: ConsoleOptions
) -> RenderResult:
    """Render the heading.

    Args:
        console: The console rendering the content.
        options: The console options.

    Yields:
        RenderResult: The rendered content.
    """
    text = self.text
    text.justify = "left"
    if self.tag == "h1":
        # Underline and bold h1s
        yield Text("", style="bold")
        yield text
        yield "=" * len(text)
    else:
        if self.tag == "h2":
            # Just underline h2s
            yield Text("", style="underline")
        else:
            # Just bold everything else
            yield Text("", style="bold")
        yield text
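
For context, __rich_console__ is the standard rich render protocol: rich calls the method when the object is printed and renders whatever it yields. The class below is a minimal, illustrative example of that protocol and is not part of ZenML.

# Minimal sketch of the `__rich_console__` protocol (illustrative only).
from rich.console import Console, ConsoleOptions, RenderResult
from rich.text import Text


class UnderlinedTitle:
    def __init__(self, title: str) -> None:
        self.title = title

    def __rich_console__(
        self, console: Console, options: ConsoleOptions
    ) -> RenderResult:
        # Yield a bold title followed by an "old school" underline.
        yield Text(self.title, style="bold")
        yield "=" * len(self.title)


Console().print(UnderlinedTitle("Hello ZenML"))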

Pipeline

ZenML pipeline class.
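
In practice, Pipeline instances are usually created through the @pipeline decorator rather than instantiated directly. A minimal sketch, assuming ZenML is installed and a repository has been initialized:

# Minimal sketch: define and run a pipeline with the decorator API,
# which builds `Pipeline` instances under the hood.
from zenml import pipeline, step


@step
def load_number() -> int:
    return 42


@step
def double(value: int) -> int:
    return value * 2


@pipeline(enable_cache=False)
def my_pipeline() -> None:
    double(load_number())


if __name__ == "__main__":
    my_pipeline()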

Source code in src/zenml/pipelines/pipeline_definition.py
class Pipeline:
    """ZenML pipeline class."""

    # The active pipeline is the pipeline to which step invocations will be
    # added when a step is called. It is set using a context manager when a
    # pipeline is called (see Pipeline.__call__ for more context)
    ACTIVE_PIPELINE: ClassVar[Optional["Pipeline"]] = None

    def __init__(
        self,
        name: str,
        entrypoint: F,
        enable_cache: Optional[bool] = None,
        enable_artifact_metadata: Optional[bool] = None,
        enable_artifact_visualization: Optional[bool] = None,
        enable_step_logs: Optional[bool] = None,
        settings: Optional[Mapping[str, "SettingsOrDict"]] = None,
        tags: Optional[List[str]] = None,
        extra: Optional[Dict[str, Any]] = None,
        on_failure: Optional["HookSpecification"] = None,
        on_success: Optional["HookSpecification"] = None,
        model: Optional["Model"] = None,
        substitutions: Optional[Dict[str, str]] = None,
    ) -> None:
        """Initializes a pipeline.

        Args:
            name: The name of the pipeline.
            entrypoint: The entrypoint function of the pipeline.
            enable_cache: If caching should be enabled for this pipeline.
            enable_artifact_metadata: If artifact metadata should be enabled for
                this pipeline.
            enable_artifact_visualization: If artifact visualization should be
                enabled for this pipeline.
            enable_step_logs: If step logs should be enabled for this pipeline.
            settings: Settings for this pipeline.
            tags: Tags to apply to runs of this pipeline.
            extra: Extra configurations for this pipeline.
            on_failure: Callback function in event of failure of the step. Can
                be a function with a single argument of type `BaseException`, or
                a source path to such a function (e.g. `module.my_function`).
            on_success: Callback function in event of success of the step. Can
                be a function with no arguments, or a source path to such a
                function (e.g. `module.my_function`).
            model: configuration of the model in the Model Control Plane.
            substitutions: Extra placeholders to use in the name templates.
        """
        self._invocations: Dict[str, StepInvocation] = {}
        self._run_args: Dict[str, Any] = {}

        self._configuration = PipelineConfiguration(
            name=name,
        )
        self._from_config_file: Dict[str, Any] = {}
        with self.__suppress_configure_warnings__():
            self.configure(
                enable_cache=enable_cache,
                enable_artifact_metadata=enable_artifact_metadata,
                enable_artifact_visualization=enable_artifact_visualization,
                enable_step_logs=enable_step_logs,
                settings=settings,
                tags=tags,
                extra=extra,
                on_failure=on_failure,
                on_success=on_success,
                model=model,
                substitutions=substitutions,
            )
        self.entrypoint = entrypoint
        self._parameters: Dict[str, Any] = {}

        self.__suppress_warnings_flag__ = False

    @property
    def name(self) -> str:
        """The name of the pipeline.

        Returns:
            The name of the pipeline.
        """
        return self.configuration.name

    @property
    def enable_cache(self) -> Optional[bool]:
        """If caching is enabled for the pipeline.

        Returns:
            If caching is enabled for the pipeline.
        """
        return self.configuration.enable_cache

    @property
    def configuration(self) -> PipelineConfiguration:
        """The configuration of the pipeline.

        Returns:
            The configuration of the pipeline.
        """
        return self._configuration

    @property
    def invocations(self) -> Dict[str, StepInvocation]:
        """Returns the step invocations of this pipeline.

        This dictionary will only be populated once the pipeline has been
        called.

        Returns:
            The step invocations.
        """
        return self._invocations

    def resolve(self) -> "Source":
        """Resolves the pipeline.

        Returns:
            The pipeline source.
        """
        return source_utils.resolve(self.entrypoint, skip_validation=True)

    @property
    def source_object(self) -> Any:
        """The source object of this pipeline.

        Returns:
            The source object of this pipeline.
        """
        return self.entrypoint

    @property
    def source_code(self) -> str:
        """The source code of this pipeline.

        Returns:
            The source code of this pipeline.
        """
        return inspect.getsource(self.source_object)

    @property
    def model(self) -> "PipelineResponse":
        """Gets the registered pipeline model for this instance.

        Returns:
            The registered pipeline model.

        Raises:
            RuntimeError: If the pipeline has not been registered yet.
        """
        self._prepare_if_possible()

        pipelines = Client().list_pipelines(name=self.name)
        if len(pipelines) == 1:
            return pipelines.items[0]

        raise RuntimeError(
            f"Cannot get the model of pipeline '{self.name}' because it has "
            f"not been registered yet. Please ensure that the pipeline has "
            f"been run or built and try again."
        )

    @contextmanager
    def __suppress_configure_warnings__(self) -> Iterator[Any]:
        """Context manager to suppress warnings in `Pipeline.configure(...)`.

        Used to suppress warnings when called from inner code and not user-facing code.

        Yields:
            Nothing.
        """
        self.__suppress_warnings_flag__ = True
        yield
        self.__suppress_warnings_flag__ = False

    def configure(
        self,
        enable_cache: Optional[bool] = None,
        enable_artifact_metadata: Optional[bool] = None,
        enable_artifact_visualization: Optional[bool] = None,
        enable_step_logs: Optional[bool] = None,
        settings: Optional[Mapping[str, "SettingsOrDict"]] = None,
        tags: Optional[List[str]] = None,
        extra: Optional[Dict[str, Any]] = None,
        on_failure: Optional["HookSpecification"] = None,
        on_success: Optional["HookSpecification"] = None,
        model: Optional["Model"] = None,
        parameters: Optional[Dict[str, Any]] = None,
        merge: bool = True,
        substitutions: Optional[Dict[str, str]] = None,
    ) -> Self:
        """Configures the pipeline.

        Configuration merging example:
        * `merge==True`:
            pipeline.configure(extra={"key1": 1})
            pipeline.configure(extra={"key2": 2}, merge=True)
            pipeline.configuration.extra # {"key1": 1, "key2": 2}
        * `merge==False`:
            pipeline.configure(extra={"key1": 1})
            pipeline.configure(extra={"key2": 2}, merge=False)
            pipeline.configuration.extra # {"key2": 2}

        Args:
            enable_cache: If caching should be enabled for this pipeline.
            enable_artifact_metadata: If artifact metadata should be enabled for
                this pipeline.
            enable_artifact_visualization: If artifact visualization should be
                enabled for this pipeline.
            enable_step_logs: If step logs should be enabled for this pipeline.
            settings: settings for this pipeline.
            tags: Tags to apply to runs of this pipeline.
            extra: Extra configurations for this pipeline.
            on_failure: Callback function in event of failure of the step. Can
                be a function with a single argument of type `BaseException`, or
                a source path to such a function (e.g. `module.my_function`).
            on_success: Callback function in event of success of the step. Can
                be a function with no arguments, or a source path to such a
                function (e.g. `module.my_function`).
            merge: If `True`, will merge the given dictionary configurations
                like `extra` and `settings` with existing
                configurations. If `False` the given configurations will
                overwrite all existing ones. See the general description of this
                method for an example.
            model: configuration of the model version in the Model Control Plane.
            parameters: input parameters for the pipeline.
            substitutions: Extra placeholders to use in the name templates.

        Returns:
            The pipeline instance that this method was called on.
        """
        failure_hook_source = None
        if on_failure:
            # string of on_failure hook function to be used for this pipeline
            failure_hook_source = resolve_and_validate_hook(on_failure)

        success_hook_source = None
        if on_success:
            # string of on_success hook function to be used for this pipeline
            success_hook_source = resolve_and_validate_hook(on_success)

        if merge and tags and self._configuration.tags:
            # Merge tags explicitly here as the recursive update later only
            # merges dicts
            tags = self._configuration.tags + tags

        values = dict_utils.remove_none_values(
            {
                "enable_cache": enable_cache,
                "enable_artifact_metadata": enable_artifact_metadata,
                "enable_artifact_visualization": enable_artifact_visualization,
                "enable_step_logs": enable_step_logs,
                "settings": settings,
                "tags": tags,
                "extra": extra,
                "failure_hook_source": failure_hook_source,
                "success_hook_source": success_hook_source,
                "model": model,
                "parameters": parameters,
                "substitutions": substitutions,
            }
        )
        if not self.__suppress_warnings_flag__:
            to_be_reapplied = []
            for param_, value_ in values.items():
                if (
                    param_ in PipelineRunConfiguration.model_fields
                    and param_ in self._from_config_file
                    and value_ != self._from_config_file[param_]
                ):
                    to_be_reapplied.append(
                        (param_, self._from_config_file[param_], value_)
                    )
            if to_be_reapplied:
                msg = ""
                reapply_during_run_warning = (
                    "The value of parameter '{name}' has changed from "
                    "'{file_value}' to '{new_value}' set in your configuration "
                    "file.\n"
                )
                for name, file_value, new_value in to_be_reapplied:
                    msg += reapply_during_run_warning.format(
                        name=name, file_value=file_value, new_value=new_value
                    )
                msg += (
                    "The values from the configuration file will be used "
                    "during the pipeline run, so your changes will not take "
                    "effect. Consider updating your configuration file "
                    "instead."
                )
                logger.warning(msg)

        config = PipelineConfigurationUpdate(**values)
        self._apply_configuration(config, merge=merge)
        return self

    @property
    def required_parameters(self) -> List[str]:
        """List of required parameters for the pipeline entrypoint.

        Returns:
            List of required parameters for the pipeline entrypoint.
        """
        signature = inspect.signature(self.entrypoint, follow_wrapped=True)
        return [
            parameter.name
            for parameter in signature.parameters.values()
            if parameter.default is inspect.Parameter.empty
        ]

    @property
    def missing_parameters(self) -> List[str]:
        """List of missing parameters for the pipeline entrypoint.

        Returns:
            List of missing parameters for the pipeline entrypoint.
        """
        available_parameters = set(self.configuration.parameters or {})
        if params_from_file := self._from_config_file.get("parameters", None):
            available_parameters.update(params_from_file)

        return list(set(self.required_parameters) - available_parameters)

    @property
    def is_prepared(self) -> bool:
        """If the pipeline is prepared.

        Prepared means that the pipeline entrypoint has been called and the
        pipeline is fully defined.

        Returns:
            If the pipeline is prepared.
        """
        return len(self.invocations) > 0

    def prepare(self, *args: Any, **kwargs: Any) -> None:
        """Prepares the pipeline.

        Args:
            *args: Pipeline entrypoint input arguments.
            **kwargs: Pipeline entrypoint input keyword arguments.

        Raises:
            RuntimeError: If the pipeline has parameters configured differently
                in the configuration file and in code.
        """
        # Clear existing parameters and invocations
        self._parameters = {}
        self._invocations = {}

        conflicting_parameters = {}
        parameters_ = (self.configuration.parameters or {}).copy()
        if from_file_ := self._from_config_file.get("parameters", None):
            parameters_ = dict_utils.recursive_update(parameters_, from_file_)
        if parameters_:
            for k, v_runtime in kwargs.items():
                if k in parameters_:
                    v_config = parameters_[k]
                    if v_config != v_runtime:
                        conflicting_parameters[k] = (v_config, v_runtime)
            if conflicting_parameters:
                is_plural = "s" if len(conflicting_parameters) > 1 else ""
                msg = f"Configured parameter{is_plural} for the pipeline `{self.name}` conflict{'' if is_plural else 's'} with parameter{is_plural} passed at runtime:\n"
                for key, values in conflicting_parameters.items():
                    msg += f"`{key}`: config=`{values[0]}` | runtime=`{values[1]}`\n"
                msg += """This happens if you define values for pipeline parameters in the configuration file and also pass the same parameters in code. Example:
```
# config.yaml
    parameters:
        param_name: value1


# pipeline.py
@pipeline
def pipeline_(param_name: str):
    step_name()

if __name__=="__main__":
    pipeline_.with_options(config_path="config.yaml")(param_name="value2")
```
To avoid this, consider setting pipeline parameters in only one place (config or code).
"""
                raise RuntimeError(msg)
            for k, v_config in parameters_.items():
                if k not in kwargs:
                    kwargs[k] = v_config

        with self:
            # Enter the context manager, so we become the active pipeline. This
            # means that all steps that get called while the entrypoint function
            # is executed will be added as invocation to this pipeline instance.
            self._call_entrypoint(*args, **kwargs)

    def register(self) -> "PipelineResponse":
        """Register the pipeline in the server.

        Returns:
            The registered pipeline model.
        """
        # Activating the built-in integrations to load all materializers
        from zenml.integrations.registry import integration_registry

        self._prepare_if_possible()
        integration_registry.activate_integrations()

        if self.configuration.model_dump(
            exclude_defaults=True, exclude={"name"}
        ):
            logger.warning(
                f"The pipeline `{self.name}` that you're registering has "
                "custom configurations applied to it. These will not be "
                "registered with the pipeline and won't be set when you build "
                "images or run the pipeline from the CLI. To provide these "
                "configurations, use the `--config` option of the `zenml "
                "pipeline build/run` commands."
            )

        return self._register()

    def build(
        self,
        settings: Optional[Mapping[str, "SettingsOrDict"]] = None,
        step_configurations: Optional[
            Mapping[str, "StepConfigurationUpdateOrDict"]
        ] = None,
        config_path: Optional[str] = None,
    ) -> Optional["PipelineBuildResponse"]:
        """Builds Docker images for the pipeline.

        Args:
            settings: Settings for the pipeline.
            step_configurations: Configurations for steps of the pipeline.
            config_path: Path to a yaml configuration file. This file will
                be parsed as a
                `zenml.config.pipeline_configurations.PipelineRunConfiguration`
                object. Options provided in this file will be overwritten by
                options provided in code using the other arguments of this
                method.

        Returns:
            The build output.
        """
        with track_handler(event=AnalyticsEvent.BUILD_PIPELINE):
            self._prepare_if_possible()

            compile_args = self._run_args.copy()
            compile_args.pop("unlisted", None)
            compile_args.pop("prevent_build_reuse", None)
            if config_path:
                compile_args["config_path"] = config_path
            if step_configurations:
                compile_args["step_configurations"] = step_configurations
            if settings:
                compile_args["settings"] = settings

            deployment, _, _ = self._compile(**compile_args)
            pipeline_id = self._register().id

            local_repo = code_repository_utils.find_active_code_repository()
            code_repository = build_utils.verify_local_repository_context(
                deployment=deployment, local_repo_context=local_repo
            )

            return build_utils.create_pipeline_build(
                deployment=deployment,
                pipeline_id=pipeline_id,
                code_repository=code_repository,
            )

    def _create_deployment(
        self,
        *,
        run_name: Optional[str] = None,
        enable_cache: Optional[bool] = None,
        enable_artifact_metadata: Optional[bool] = None,
        enable_artifact_visualization: Optional[bool] = None,
        enable_step_logs: Optional[bool] = None,
        schedule: Optional[Schedule] = None,
        build: Union[str, "UUID", "PipelineBuildBase", None] = None,
        settings: Optional[Mapping[str, "SettingsOrDict"]] = None,
        step_configurations: Optional[
            Mapping[str, "StepConfigurationUpdateOrDict"]
        ] = None,
        extra: Optional[Dict[str, Any]] = None,
        config_path: Optional[str] = None,
        unlisted: bool = False,
        prevent_build_reuse: bool = False,
        skip_schedule_registration: bool = False,
    ) -> PipelineDeploymentResponse:
        """Create a pipeline deployment.

        Args:
            run_name: Name of the pipeline run.
            enable_cache: If caching should be enabled for this pipeline run.
            enable_artifact_metadata: If artifact metadata should be enabled
                for this pipeline run.
            enable_artifact_visualization: If artifact visualization should be
                enabled for this pipeline run.
            enable_step_logs: If step logs should be enabled for this pipeline.
            schedule: Optional schedule to use for the run.
            build: Optional build to use for the run.
            settings: Settings for this pipeline run.
            step_configurations: Configurations for steps of the pipeline.
            extra: Extra configurations for this pipeline run.
            config_path: Path to a yaml configuration file. This file will
                be parsed as a
                `zenml.config.pipeline_configurations.PipelineRunConfiguration`
                object. Options provided in this file will be overwritten by
                options provided in code using the other arguments of this
                method.
            unlisted: Whether the pipeline run should be unlisted (not assigned
                to any pipeline).
            prevent_build_reuse: DEPRECATED: Use
                `DockerSettings.prevent_build_reuse` instead.
            skip_schedule_registration: Whether to skip schedule registration.

        Returns:
            The pipeline deployment.

        Raises:
            ValueError: If the orchestrator doesn't support scheduling, but a
                schedule was given
        """
        deployment, schedule, build = self._compile(
            config_path=config_path,
            run_name=run_name,
            enable_cache=enable_cache,
            enable_artifact_metadata=enable_artifact_metadata,
            enable_artifact_visualization=enable_artifact_visualization,
            enable_step_logs=enable_step_logs,
            steps=step_configurations,
            settings=settings,
            schedule=schedule,
            build=build,
            extra=extra,
        )

        skip_pipeline_registration = constants.handle_bool_env_var(
            constants.ENV_ZENML_SKIP_PIPELINE_REGISTRATION,
            default=False,
        )

        register_pipeline = not (skip_pipeline_registration or unlisted)

        pipeline_id = None
        if register_pipeline:
            pipeline_id = self._register().id
        else:
            logger.debug(f"Pipeline {self.name} is unlisted.")

        stack = Client().active_stack
        stack.validate()

        schedule_id = None
        if schedule and not skip_schedule_registration:
            if not stack.orchestrator.config.is_schedulable:
                raise ValueError(
                    f"Stack {stack.name} does not support scheduling. "
                    "Not all orchestrator types support scheduling, "
                    "kindly consult with "
                    "https://docs.zenml.io/how-to/build-pipelines/schedule-a-pipeline "
                    "for details."
                )
            if schedule.name:
                schedule_name = schedule.name
            else:
                schedule_name = format_name_template(
                    deployment.run_name_template,
                    substitutions=deployment.pipeline_configuration.substitutions,
                )
            components = Client().active_stack_model.components
            orchestrator = components[StackComponentType.ORCHESTRATOR][0]
            schedule_model = ScheduleRequest(
                workspace=Client().active_workspace.id,
                user=Client().active_user.id,
                pipeline_id=pipeline_id,
                orchestrator_id=orchestrator.id,
                name=schedule_name,
                active=True,
                cron_expression=schedule.cron_expression,
                start_time=schedule.start_time,
                end_time=schedule.end_time,
                interval_second=schedule.interval_second,
                catchup=schedule.catchup,
                run_once_start_time=schedule.run_once_start_time,
            )
            schedule_id = Client().zen_store.create_schedule(schedule_model).id
            logger.info(
                f"Created schedule `{schedule_name}` for pipeline "
                f"`{deployment.pipeline_configuration.name}`."
            )

        stack = Client().active_stack
        stack.validate()
        upload_notebook_cell_code_if_necessary(
            deployment=deployment, stack=stack
        )

        local_repo_context = (
            code_repository_utils.find_active_code_repository()
        )
        code_repository = build_utils.verify_local_repository_context(
            deployment=deployment, local_repo_context=local_repo_context
        )
        can_download_from_code_repository = code_repository is not None
        if local_repo_context:
            build_utils.log_code_repository_usage(
                deployment=deployment, local_repo_context=local_repo_context
            )

        if prevent_build_reuse:
            logger.warning(
                "Passing `prevent_build_reuse=True` to "
                "`pipeline.with_opitions(...)` is deprecated. Use "
                "`DockerSettings.prevent_build_reuse` instead."
            )

        build_model = build_utils.reuse_or_create_pipeline_build(
            deployment=deployment,
            pipeline_id=pipeline_id,
            allow_build_reuse=not prevent_build_reuse,
            build=build,
            code_repository=code_repository,
        )
        build_id = build_model.id if build_model else None

        code_reference = None
        if local_repo_context and not local_repo_context.is_dirty:
            source_root = source_utils.get_source_root()
            subdirectory = (
                Path(source_root)
                .resolve()
                .relative_to(local_repo_context.root)
            )

            code_reference = CodeReferenceRequest(
                commit=local_repo_context.current_commit,
                subdirectory=subdirectory.as_posix(),
                code_repository=local_repo_context.code_repository.id,
            )

        code_path = None
        if build_utils.should_upload_code(
            deployment=deployment,
            build=build_model,
            can_download_from_code_repository=can_download_from_code_repository,
        ):
            source_root = source_utils.get_source_root()
            code_archive = code_utils.CodeArchive(root=source_root)
            logger.info(
                "Archiving pipeline code directory: `%s`. If this is taking "
                "longer than you expected, make sure your source root "
                "is set correctly by running `zenml init`, and that it "
                "does not contain unnecessarily huge files.",
                source_root,
            )

            code_path = code_utils.upload_code_if_necessary(code_archive)

        request = PipelineDeploymentRequest(
            user=Client().active_user.id,
            workspace=Client().active_workspace.id,
            stack=stack.id,
            pipeline=pipeline_id,
            build=build_id,
            schedule=schedule_id,
            code_reference=code_reference,
            code_path=code_path,
            **deployment.model_dump(),
        )
        return Client().zen_store.create_deployment(deployment=request)

    def _run(
        self,
    ) -> Optional[PipelineRunResponse]:
        """Runs the pipeline on the active stack.

        Returns:
            The pipeline run or `None` if running with a schedule.
        """
        if constants.SHOULD_PREVENT_PIPELINE_EXECUTION:
            # An environment variable was set to stop the execution of
            # pipelines. This is done to prevent execution of module-level
            # pipeline.run() calls when importing modules needed to run a step.
            logger.info(
                "Preventing execution of pipeline '%s'. If this is not "
                "intended behavior, make sure to unset the environment "
                "variable '%s'.",
                self.name,
                constants.ENV_ZENML_PREVENT_PIPELINE_EXECUTION,
            )
            return None

        logger.info(f"Initiating a new run for the pipeline: `{self.name}`.")

        with track_handler(AnalyticsEvent.RUN_PIPELINE) as analytics_handler:
            stack = Client().active_stack
            deployment = self._create_deployment(**self._run_args)

            self.log_pipeline_deployment_metadata(deployment)
            run = create_placeholder_run(deployment=deployment)

            analytics_handler.metadata = self._get_pipeline_analytics_metadata(
                deployment=deployment,
                stack=stack,
                run_id=run.id if run else None,
            )

            if run:
                run_url = dashboard_utils.get_run_url(run)
                if run_url:
                    logger.info(f"Dashboard URL for Pipeline Run: {run_url}")
                else:
                    logger.info(
                        "You can visualize your pipeline runs in the `ZenML "
                        "Dashboard`. In order to try it locally, please run "
                        "`zenml login --local`."
                    )

            deploy_pipeline(
                deployment=deployment, stack=stack, placeholder_run=run
            )
            if run:
                return Client().get_pipeline_run(run.id)
            return None

    @staticmethod
    def log_pipeline_deployment_metadata(
        deployment_model: PipelineDeploymentResponse,
    ) -> None:
        """Displays logs based on the deployment model upon running a pipeline.

        Args:
            deployment_model: The model for the pipeline deployment
        """
        try:
            # Log about the caching status
            if deployment_model.pipeline_configuration.enable_cache is False:
                logger.info(
                    f"Caching is disabled by default for "
                    f"`{deployment_model.pipeline_configuration.name}`."
                )

            # Log about the used builds
            if deployment_model.build:
                logger.info("Using a build:")
                logger.info(
                    " Image(s): "
                    f"{', '.join([i.image for i in deployment_model.build.images.values()])}"
                )

                # Log about version mismatches between local and build
                from zenml import __version__

                if deployment_model.build.zenml_version != __version__:
                    logger.info(
                        f"ZenML version (different than the local version): "
                        f"{deployment_model.build.zenml_version}"
                    )

                import platform

                if (
                    deployment_model.build.python_version
                    != platform.python_version()
                ):
                    logger.info(
                        f"Python version (different than the local version): "
                        f"{deployment_model.build.python_version}"
                    )

            # Log about the user, stack and components
            if deployment_model.user is not None:
                logger.info(f"Using user: `{deployment_model.user.name}`")

            if deployment_model.stack is not None:
                logger.info(f"Using stack: `{deployment_model.stack.name}`")

                for (
                    component_type,
                    component_models,
                ) in deployment_model.stack.components.items():
                    logger.info(
                        f"  {component_type.value}: `{component_models[0].name}`"
                    )
        except Exception as e:
            logger.debug(f"Logging pipeline deployment metadata failed: {e}")

    def write_run_configuration_template(
        self, path: str, stack: Optional["Stack"] = None
    ) -> None:
        """Writes a run configuration yaml template.

        Args:
            path: The path where the template will be written.
            stack: The stack for which the template should be generated. If
                not given, the active stack will be used.
        """
        from zenml.config.base_settings import ConfigurationLevel
        from zenml.config.step_configurations import (
            PartialArtifactConfiguration,
        )

        self._prepare_if_possible()

        stack = stack or Client().active_stack

        setting_classes = stack.setting_classes
        setting_classes.update(settings_utils.get_general_settings())

        pipeline_settings = {}
        step_settings = {}
        for key, setting_class in setting_classes.items():
            fields = pydantic_utils.TemplateGenerator(setting_class).run()
            if ConfigurationLevel.PIPELINE in setting_class.LEVEL:
                pipeline_settings[key] = fields
            if ConfigurationLevel.STEP in setting_class.LEVEL:
                step_settings[key] = fields

        steps = {}
        for step_name, invocation in self.invocations.items():
            step = invocation.step
            outputs = {
                name: PartialArtifactConfiguration()
                for name in step.entrypoint_definition.outputs
            }
            step_template = StepConfigurationUpdate(
                parameters={},
                settings=step_settings,
                outputs=outputs,
            )
            steps[step_name] = step_template

        run_config = PipelineRunConfiguration(
            settings=pipeline_settings, steps=steps
        )
        template = pydantic_utils.TemplateGenerator(run_config).run()
        yaml_string = yaml.dump(template)
        yaml_string = yaml_utils.comment_out_yaml(yaml_string)

        with open(path, "w") as f:
            f.write(yaml_string)

    def _apply_configuration(
        self,
        config: PipelineConfigurationUpdate,
        merge: bool = True,
    ) -> None:
        """Applies an update to the pipeline configuration.

        Args:
            config: The configuration update.
            merge: Whether to merge the updates with the existing configuration
                or not. See the `BasePipeline.configure(...)` method for a
                detailed explanation.
        """
        self._validate_configuration(config)
        self._configuration = pydantic_utils.update_model(
            self._configuration, update=config, recursive=merge
        )
        logger.debug("Updated pipeline configuration:")
        logger.debug(self._configuration)

    @staticmethod
    def _validate_configuration(config: PipelineConfigurationUpdate) -> None:
        """Validates a configuration update.

        Args:
            config: The configuration update to validate.
        """
        settings_utils.validate_setting_keys(list(config.settings))

    def _get_pipeline_analytics_metadata(
        self,
        deployment: "PipelineDeploymentResponse",
        stack: "Stack",
        run_id: Optional[UUID] = None,
    ) -> Dict[str, Any]:
        """Returns the pipeline deployment metadata.

        Args:
            deployment: The pipeline deployment to track.
            stack: The stack on which the pipeline will be deployed.
            run_id: The ID of the pipeline run.

        Returns:
            the metadata about the pipeline deployment
        """
        custom_materializer = False
        for step in deployment.step_configurations.values():
            for output in step.config.outputs.values():
                for source in output.materializer_source:
                    if not source.is_internal:
                        custom_materializer = True

        stack_creator = Client().get_stack(stack.id).user
        active_user = Client().active_user
        own_stack = stack_creator and stack_creator.id == active_user.id

        stack_metadata = {
            component_type.value: component.flavor
            for component_type, component in stack.components.items()
        }
        return {
            "store_type": Client().zen_store.type.value,
            **stack_metadata,
            "total_steps": len(self.invocations),
            "schedule": bool(deployment.schedule),
            "custom_materializer": custom_materializer,
            "own_stack": own_stack,
            "pipeline_run_id": str(run_id) if run_id else None,
        }

    def _compile(
        self, config_path: Optional[str] = None, **run_configuration_args: Any
    ) -> Tuple[
        "PipelineDeploymentBase",
        Optional["Schedule"],
        Union["PipelineBuildBase", UUID, None],
    ]:
        """Compiles the pipeline.

        Args:
            config_path: Path to a config file.
            **run_configuration_args: Configurations for the pipeline run.

        Returns:
            A tuple containing the deployment, schedule and build of
            the compiled pipeline.
        """
        # Activating the built-in integrations to load all materializers
        from zenml.integrations.registry import integration_registry

        integration_registry.activate_integrations()

        _from_config_file = self._parse_config_file(
            config_path=config_path,
            matcher=list(PipelineRunConfiguration.model_fields.keys()),
        )

        self._reconfigure_from_file_with_overrides(config_path=config_path)

        run_config = PipelineRunConfiguration(**_from_config_file)

        new_values = dict_utils.remove_none_values(run_configuration_args)
        update = PipelineRunConfiguration.model_validate(new_values)

        # Update with the values in code so they take precedence
        run_config = pydantic_utils.update_model(run_config, update=update)
        run_config = env_utils.substitute_env_variable_placeholders(run_config)

        deployment = Compiler().compile(
            pipeline=self,
            stack=Client().active_stack,
            run_configuration=run_config,
        )
        deployment = env_utils.substitute_env_variable_placeholders(deployment)

        return deployment, run_config.schedule, run_config.build

    def _register(self) -> "PipelineResponse":
        """Register the pipeline in the server.

        Returns:
            The registered pipeline model.
        """
        client = Client()

        def _get() -> PipelineResponse:
            matching_pipelines = client.list_pipelines(
                name=self.name,
                size=1,
                sort_by="desc:created",
            )

            if matching_pipelines.total:
                registered_pipeline = matching_pipelines.items[0]
                return registered_pipeline
            raise RuntimeError("No matching pipelines found.")

        try:
            return _get()
        except RuntimeError:
            request = PipelineRequest(
                workspace=client.active_workspace.id,
                user=client.active_user.id,
                name=self.name,
            )

            try:
                registered_pipeline = client.zen_store.create_pipeline(
                    pipeline=request
                )
                logger.info(
                    "Registered new pipeline: `%s`.",
                    registered_pipeline.name,
                )
                return registered_pipeline
            except EntityExistsError:
                return _get()

    def _compute_unique_identifier(self, pipeline_spec: PipelineSpec) -> str:
        """Computes a unique identifier from the pipeline spec and steps.

        Args:
            pipeline_spec: Compiled spec of the pipeline.

        Returns:
            The unique identifier of the pipeline.
        """
        from packaging import version

        hash_ = hashlib.md5()  # nosec
        hash_.update(pipeline_spec.json_with_string_sources.encode())

        if version.parse(pipeline_spec.version) >= version.parse("0.4"):
            # Only add this for newer versions to keep backwards compatibility
            hash_.update(self.source_code.encode())

        for step_spec in pipeline_spec.steps:
            invocation = self.invocations[step_spec.pipeline_parameter_name]
            step_source = invocation.step.source_code
            hash_.update(step_source.encode())

        return hash_.hexdigest()

    def add_step_invocation(
        self,
        step: "BaseStep",
        input_artifacts: Dict[str, StepArtifact],
        external_artifacts: Dict[
            str, Union["ExternalArtifact", "ArtifactVersionResponse"]
        ],
        model_artifacts_or_metadata: Dict[str, "ModelVersionDataLazyLoader"],
        client_lazy_loaders: Dict[str, "ClientLazyLoader"],
        parameters: Dict[str, Any],
        default_parameters: Dict[str, Any],
        upstream_steps: Set[str],
        custom_id: Optional[str] = None,
        allow_id_suffix: bool = True,
    ) -> str:
        """Adds a step invocation to the pipeline.

        Args:
            step: The step for which to add an invocation.
            input_artifacts: The input artifacts for the invocation.
            external_artifacts: The external artifacts for the invocation.
            model_artifacts_or_metadata: The model artifacts or metadata for
                the invocation.
            client_lazy_loaders: The client lazy loaders for the invocation.
            parameters: The parameters for the invocation.
            default_parameters: The default parameters for the invocation.
            upstream_steps: The upstream steps for the invocation.
            custom_id: Custom ID to use for the invocation.
            allow_id_suffix: Whether a suffix can be appended to the invocation
                ID.

        Raises:
            RuntimeError: If the method is called on an inactive pipeline.
            RuntimeError: If the invocation was called with an artifact from
                a different pipeline.

        Returns:
            The step invocation ID.
        """
        if Pipeline.ACTIVE_PIPELINE != self:
            raise RuntimeError(
                "A step invocation can only be added to an active pipeline."
            )

        for artifact in input_artifacts.values():
            if artifact.pipeline is not self:
                raise RuntimeError(
                    "Got invalid input artifact for invocation of step "
                    f"{step.name}: The input artifact was produced by a step "
                    f"inside a different pipeline {artifact.pipeline.name}."
                )

        invocation_id = self._compute_invocation_id(
            step=step, custom_id=custom_id, allow_suffix=allow_id_suffix
        )
        invocation = StepInvocation(
            id=invocation_id,
            step=step,
            input_artifacts=input_artifacts,
            external_artifacts=external_artifacts,
            model_artifacts_or_metadata=model_artifacts_or_metadata,
            client_lazy_loaders=client_lazy_loaders,
            parameters=parameters,
            default_parameters=default_parameters,
            upstream_steps=upstream_steps,
            pipeline=self,
        )
        self._invocations[invocation_id] = invocation
        return invocation_id

    def _compute_invocation_id(
        self,
        step: "BaseStep",
        custom_id: Optional[str] = None,
        allow_suffix: bool = True,
    ) -> str:
        """Compute the invocation ID.

        Args:
            step: The step for which to compute the ID.
            custom_id: Custom ID to use for the invocation.
            allow_suffix: Whether a suffix can be appended to the invocation
                ID.

        Raises:
            RuntimeError: If no ID suffix is allowed and an invocation for the
                same ID already exists.
            RuntimeError: If no unique invocation ID can be found.

        Returns:
            The invocation ID.
        """
        base_id = id_ = custom_id or step.name

        if id_ not in self.invocations:
            return id_

        if not allow_suffix:
            raise RuntimeError("Duplicate step ID")

        for index in range(2, 10000):
            id_ = f"{base_id}_{index}"
            if id_ not in self.invocations:
                return id_

        raise RuntimeError("Unable to find step ID")

    def __enter__(self) -> Self:
        """Activate the pipeline context.

        Raises:
            RuntimeError: If a different pipeline is already active.

        Returns:
            The pipeline instance.
        """
        if Pipeline.ACTIVE_PIPELINE:
            raise RuntimeError(
                "Unable to enter pipeline context. A different pipeline "
                f"{Pipeline.ACTIVE_PIPELINE.name} is already active."
            )

        Pipeline.ACTIVE_PIPELINE = self
        return self

    def __exit__(self, *args: Any) -> None:
        """Deactivates the pipeline context.

        Args:
            *args: The arguments passed to the context exit handler.
        """
        Pipeline.ACTIVE_PIPELINE = None

    def _parse_config_file(
        self, config_path: Optional[str], matcher: List[str]
    ) -> Dict[str, Any]:
        """Parses the given configuration file and sets `self._from_config_file`.

        Args:
            config_path: Path to a yaml configuration file.
            matcher: List of keys to match in the configuration file.

        Returns:
            Parsed config file according to matcher settings.
        """
        _from_config_file: Dict[str, Any] = {}
        if config_path:
            with open(config_path, "r") as f:
                _from_config_file = yaml.load(f, Loader=yaml.SafeLoader)

            _from_config_file = dict_utils.remove_none_values(
                {k: v for k, v in _from_config_file.items() if k in matcher}
            )

            if "model" in _from_config_file:
                if "model" in self._from_config_file:
                    _from_config_file["model"] = self._from_config_file[
                        "model"
                    ]
                else:
                    from zenml.model.model import Model

                    _from_config_file["model"] = Model.model_validate(
                        _from_config_file["model"]
                    )
        return _from_config_file

    def with_options(
        self,
        run_name: Optional[str] = None,
        schedule: Optional[Schedule] = None,
        build: Union[str, "UUID", "PipelineBuildBase", None] = None,
        step_configurations: Optional[
            Mapping[str, "StepConfigurationUpdateOrDict"]
        ] = None,
        steps: Optional[Mapping[str, "StepConfigurationUpdateOrDict"]] = None,
        config_path: Optional[str] = None,
        unlisted: bool = False,
        prevent_build_reuse: bool = False,
        **kwargs: Any,
    ) -> "Pipeline":
        """Copies the pipeline and applies the given configurations.

        Args:
            run_name: Name of the pipeline run.
            schedule: Optional schedule to use for the run.
            build: Optional build to use for the run.
            step_configurations: Configurations for steps of the pipeline.
            steps: Configurations for steps of the pipeline. This is equivalent
                to `step_configurations`, and will be ignored if
                `step_configurations` is set as well.
            config_path: Path to a yaml configuration file. This file will
                be parsed as a
                `zenml.config.pipeline_configurations.PipelineRunConfiguration`
                object. Options provided in this file will be overwritten by
                options provided in code using the other arguments of this
                method.
            unlisted: Whether the pipeline run should be unlisted (not assigned
                to any pipeline).
            prevent_build_reuse: DEPRECATED: Use
                `DockerSettings.prevent_build_reuse` instead.
            **kwargs: Pipeline configuration options. These will be passed
                to the `pipeline.configure(...)` method.

        Returns:
            The copied pipeline instance.
        """
        if steps and step_configurations:
            logger.warning(
                "Step configurations were passed using both the "
                "`step_configurations` and `steps` keywords, ignoring the "
                "values passed using the `steps` keyword."
            )

        pipeline_copy = self.copy()

        pipeline_copy._reconfigure_from_file_with_overrides(
            config_path=config_path, **kwargs
        )

        run_args = dict_utils.remove_none_values(
            {
                "run_name": run_name,
                "schedule": schedule,
                "build": build,
                "step_configurations": step_configurations or steps,
                "config_path": config_path,
                "unlisted": unlisted,
                "prevent_build_reuse": prevent_build_reuse,
            }
        )
        pipeline_copy._run_args.update(run_args)
        return pipeline_copy

    def copy(self) -> "Pipeline":
        """Copies the pipeline.

        Returns:
            The pipeline copy.
        """
        return copy.deepcopy(self)

    def __call__(
        self, *args: Any, **kwargs: Any
    ) -> Optional[PipelineRunResponse]:
        """Handle a call of the pipeline.

        This method does one of two things:
        * If there is an active pipeline context, it calls the pipeline
          entrypoint function within that context and the step invocations
          will be added to the active pipeline.
        * If no pipeline is active, it activates this pipeline before calling
          the entrypoint function.

        Args:
            *args: Entrypoint function arguments.
            **kwargs: Entrypoint function keyword arguments.

        Returns:
            If called within another pipeline, returns the outputs of the
            `entrypoint` method. Otherwise, returns the pipeline run or `None`
            if running with a schedule.
        """
        if Pipeline.ACTIVE_PIPELINE:
            # Calling a pipeline inside a pipeline, we return the potential
            # outputs of the entrypoint function

            # TODO: This currently ignores the configuration of the pipeline
            #   and instead applies the configuration of the previously active
            #   pipeline. Is this what we want?
            return self.entrypoint(*args, **kwargs)

        self.prepare(*args, **kwargs)
        return self._run()

    def _call_entrypoint(self, *args: Any, **kwargs: Any) -> None:
        """Calls the pipeline entrypoint function with the given arguments.

        Args:
            *args: Entrypoint function arguments.
            **kwargs: Entrypoint function keyword arguments.

        Raises:
            ValueError: If an input argument is missing or not JSON
                serializable.
        """
        try:
            validated_args = pydantic_utils.validate_function_args(
                self.entrypoint,
                ConfigDict(arbitrary_types_allowed=False),
                *args,
                **kwargs,
            )
        except ValidationError as e:
            raise ValueError(
                "Invalid or missing pipeline function entrypoint arguments. "
                "Only JSON serializable inputs are allowed as pipeline inputs. "
                "Check out the pydantic error above for more details."
            ) from e

        self._parameters = validated_args
        self.entrypoint(**validated_args)

    def _prepare_if_possible(self) -> None:
        """Prepares the pipeline if possible.

        Raises:
            RuntimeError: If the pipeline is not prepared and the preparation
                requires parameters.
        """
        if not self.is_prepared:
            if missing_parameters := self.missing_parameters:
                raise RuntimeError(
                    f"Failed while trying to prepare pipeline {self.name}. "
                    "The entrypoint function of the pipeline requires "
                    "arguments which have not been configured yet: "
                    f"{missing_parameters}. Please provide those parameters by "
                    "calling `pipeline_instance.configure(parameters=...)` or "
                    "by calling `pipeline_instance.prepare(...)` and try again."
                )

            self.prepare()

    def create_run_template(
        self, name: str, **kwargs: Any
    ) -> RunTemplateResponse:
        """Create a run template for the pipeline.

        Args:
            name: The name of the run template.
            **kwargs: Keyword arguments for the client method to create a run
                template.

        Returns:
            The created run template.
        """
        self._prepare_if_possible()
        deployment = self._create_deployment(
            **self._run_args, skip_schedule_registration=True
        )

        return Client().create_run_template(
            name=name, deployment_id=deployment.id, **kwargs
        )

    def _reconfigure_from_file_with_overrides(
        self,
        config_path: Optional[str] = None,
        **kwargs: Any,
    ) -> None:
        """Update the pipeline configuration from config file.

        Accepts overrides as kwargs.

        Args:
            config_path: Path to a yaml configuration file. This file will
                be parsed as a
                `zenml.config.pipeline_configurations.PipelineRunConfiguration`
                object. Options provided in this file will be overwritten by
                options provided in code using the other arguments of this
                method.
            **kwargs: Pipeline configuration options. These will be passed
                to the `pipeline.configure(...)` method.
        """
        self._from_config_file = {}
        if config_path:
            self._from_config_file = self._parse_config_file(
                config_path=config_path,
                matcher=inspect.getfullargspec(self.configure)[0],
            )

        _from_config_file = dict_utils.recursive_update(
            self._from_config_file, kwargs
        )

        with self.__suppress_configure_warnings__():
            self.configure(**_from_config_file)
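
The following sketch is illustrative and not part of the generated reference. It assumes ZenML is installed and a stack is active; the step and pipeline names (`trainer`, `training_pipeline`) and the file name `run_template.yaml` are placeholders. It shows how the public methods defined above (`configure`, `write_run_configuration_template`, `with_options` and the pipeline call itself) are typically combined:

```
from zenml import pipeline, step


@step
def trainer(learning_rate: float) -> float:
    """Placeholder step standing in for real training logic."""
    return learning_rate * 2


@pipeline
def training_pipeline(learning_rate: float = 0.001) -> None:
    trainer(learning_rate=learning_rate)


if __name__ == "__main__":
    # Apply additional configuration; by default it is merged with any
    # existing configuration (see `configure(..., merge=True)` above).
    training_pipeline.configure(enable_cache=False, tags=["example"])

    # Write a commented-out YAML template that can later be filled in and
    # passed back via `config_path` or the `--config` option of the CLI.
    training_pipeline.write_run_configuration_template("run_template.yaml")

    # Copy the pipeline, apply run-specific options and run it on the
    # active stack.
    training_pipeline.with_options(run_name="example_run")(learning_rate=0.01)
```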

configuration property

The configuration of the pipeline.

Returns:

Type Description
PipelineConfiguration

The configuration of the pipeline.

enable_cache property

If caching is enabled for the pipeline.

Returns:

Type Description
Optional[bool]

If caching is enabled for the pipeline.

invocations property

Returns the step invocations of this pipeline.

This dictionary will only be populated once the pipeline has been called.

Returns:

Type Description
Dict[str, StepInvocation]

The step invocations.

is_prepared property

If the pipeline is prepared.

Prepared means that the pipeline entrypoint has been called and the pipeline is fully defined.

Returns:

Type Description
bool

If the pipeline is prepared.

missing_parameters property

List of missing parameters for the pipeline entrypoint.

Returns:

Type Description
List[str]

List of missing parameters for the pipeline entrypoint.

model property

Gets the registered pipeline model for this instance.

Returns:

Type Description
PipelineResponse

The registered pipeline model.

Raises:

Type Description
RuntimeError

If the pipeline has not been registered yet.
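
For example, once the pipeline has been run or built at least once, the registered model can be looked up directly. A small sketch; `training_pipeline` stands in for a pipeline created with the `@pipeline` decorator:

```
# Assumes `training_pipeline` has already been run or built.
registered = training_pipeline.model
print(registered.id, registered.name)
```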

name property

The name of the pipeline.

Returns:

Type Description
str

The name of the pipeline.

required_parameters property

List of required parameters for the pipeline entrypoint.

Returns:

Type Description
List[str]

List of required parameters for the pipeline entrypoint.
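
A short illustrative sketch of how `required_parameters` and `missing_parameters` relate to the configured parameters; the pipeline name and parameters are hypothetical:

```
from zenml import pipeline


@pipeline
def example_pipeline(gamma: float, seed: int = 42) -> None:
    ...


print(example_pipeline.required_parameters)  # ['gamma'] (no default value)
example_pipeline.configure(parameters={"gamma": 0.9})
print(example_pipeline.missing_parameters)   # [] once 'gamma' is configured
```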

source_code property

The source code of this pipeline.

Returns:

Type Description
str

The source code of this pipeline.

source_object property

The source object of this pipeline.

Returns:

Type Description
Any

The source object of this pipeline.

__call__(*args, **kwargs)

Handle a call of the pipeline.

This method does one of two things:

  • If there is an active pipeline context, it calls the pipeline entrypoint function within that context and the step invocations will be added to the active pipeline.
  • If no pipeline is active, it activates this pipeline before calling the entrypoint function.

Parameters:

Name Type Description Default
*args Any

Entrypoint function arguments.

()
**kwargs Any

Entrypoint function keyword arguments.

{}

Returns:

Type Description
Optional[PipelineRunResponse]

If called within another pipeline, returns the outputs of the entrypoint method. Otherwise, returns the pipeline run or None if running with a schedule.

Source code in src/zenml/pipelines/pipeline_definition.py
def __call__(
    self, *args: Any, **kwargs: Any
) -> Optional[PipelineRunResponse]:
    """Handle a call of the pipeline.

    This method does one of two things:
    * If there is an active pipeline context, it calls the pipeline
      entrypoint function within that context and the step invocations
      will be added to the active pipeline.
    * If no pipeline is active, it activates this pipeline before calling
      the entrypoint function.

    Args:
        *args: Entrypoint function arguments.
        **kwargs: Entrypoint function keyword arguments.

    Returns:
        If called within another pipeline, returns the outputs of the
        `entrypoint` method. Otherwise, returns the pipeline run or `None`
        if running with a schedule.
    """
    if Pipeline.ACTIVE_PIPELINE:
        # Calling a pipeline inside a pipeline, we return the potential
        # outputs of the entrypoint function

        # TODO: This currently ignores the configuration of the pipeline
        #   and instead applies the configuration of the previously active
        #   pipeline. Is this what we want?
        return self.entrypoint(*args, **kwargs)

    self.prepare(*args, **kwargs)
    return self._run()
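
A hedged sketch of both call styles; `my_pipeline`, `combined_pipeline` and the `gamma` argument are placeholders:

```
from zenml import pipeline

# `my_pipeline` is assumed to be an existing pipeline instance created with
# the `@pipeline` decorator and taking a `gamma` argument.

# Standalone call: prepares the pipeline and runs it on the active stack,
# returning a PipelineRunResponse (or None when running on a schedule).
run = my_pipeline(gamma=0.1)


# Call inside another pipeline: only the entrypoint is executed and its step
# invocations are added to the outer, active pipeline.
@pipeline
def combined_pipeline() -> None:
    my_pipeline(gamma=0.1)
```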

__enter__()

Activate the pipeline context.

Raises:

Type Description
RuntimeError

If a different pipeline is already active.

Returns:

Type Description
Self

The pipeline instance.

Source code in src/zenml/pipelines/pipeline_definition.py
def __enter__(self) -> Self:
    """Activate the pipeline context.

    Raises:
        RuntimeError: If a different pipeline is already active.

    Returns:
        The pipeline instance.
    """
    if Pipeline.ACTIVE_PIPELINE:
        raise RuntimeError(
            "Unable to enter pipeline context. A different pipeline "
            f"{Pipeline.ACTIVE_PIPELINE.name} is already active."
        )

    Pipeline.ACTIVE_PIPELINE = self
    return self

__exit__(*args)

Deactivates the pipeline context.

Parameters:

Name Type Description Default
*args Any

The arguments passed to the context exit handler.

()
Source code in src/zenml/pipelines/pipeline_definition.py
def __exit__(self, *args: Any) -> None:
    """Deactivates the pipeline context.

    Args:
        *args: The arguments passed to the context exit handler.
    """
    Pipeline.ACTIVE_PIPELINE = None

__init__(name, entrypoint, enable_cache=None, enable_artifact_metadata=None, enable_artifact_visualization=None, enable_step_logs=None, settings=None, tags=None, extra=None, on_failure=None, on_success=None, model=None, substitutions=None)

Initializes a pipeline.

Parameters:

Name Type Description Default
name str

The name of the pipeline.

required
entrypoint F

The entrypoint function of the pipeline.

required
enable_cache Optional[bool]

If caching should be enabled for this pipeline.

None
enable_artifact_metadata Optional[bool]

If artifact metadata should be enabled for this pipeline.

None
enable_artifact_visualization Optional[bool]

If artifact visualization should be enabled for this pipeline.

None
enable_step_logs Optional[bool]

If step logs should be enabled for this pipeline.

None
settings Optional[Mapping[str, SettingsOrDict]]

Settings for this pipeline.

None
tags Optional[List[str]]

Tags to apply to runs of this pipeline.

None
extra Optional[Dict[str, Any]]

Extra configurations for this pipeline.

None
on_failure Optional[HookSpecification]

Callback function in event of failure of the step. Can be a function with a single argument of type BaseException, or a source path to such a function (e.g. module.my_function).

None
on_success Optional[HookSpecification]

Callback function in event of success of the step. Can be a function with no arguments, or a source path to such a function (e.g. module.my_function).

None
model Optional[Model]

configuration of the model in the Model Control Plane.

None
substitutions Optional[Dict[str, str]]

Extra placeholders to use in the name templates.

None
Source code in src/zenml/pipelines/pipeline_definition.py
def __init__(
    self,
    name: str,
    entrypoint: F,
    enable_cache: Optional[bool] = None,
    enable_artifact_metadata: Optional[bool] = None,
    enable_artifact_visualization: Optional[bool] = None,
    enable_step_logs: Optional[bool] = None,
    settings: Optional[Mapping[str, "SettingsOrDict"]] = None,
    tags: Optional[List[str]] = None,
    extra: Optional[Dict[str, Any]] = None,
    on_failure: Optional["HookSpecification"] = None,
    on_success: Optional["HookSpecification"] = None,
    model: Optional["Model"] = None,
    substitutions: Optional[Dict[str, str]] = None,
) -> None:
    """Initializes a pipeline.

    Args:
        name: The name of the pipeline.
        entrypoint: The entrypoint function of the pipeline.
        enable_cache: If caching should be enabled for this pipeline.
        enable_artifact_metadata: If artifact metadata should be enabled for
            this pipeline.
        enable_artifact_visualization: If artifact visualization should be
            enabled for this pipeline.
        enable_step_logs: If step logs should be enabled for this pipeline.
        settings: Settings for this pipeline.
        tags: Tags to apply to runs of this pipeline.
        extra: Extra configurations for this pipeline.
        on_failure: Callback function in event of failure of the step. Can
            be a function with a single argument of type `BaseException`, or
            a source path to such a function (e.g. `module.my_function`).
        on_success: Callback function in event of success of the step. Can
            be a function with no arguments, or a source path to such a
            function (e.g. `module.my_function`).
        model: configuration of the model in the Model Control Plane.
        substitutions: Extra placeholders to use in the name templates.
    """
    self._invocations: Dict[str, StepInvocation] = {}
    self._run_args: Dict[str, Any] = {}

    self._configuration = PipelineConfiguration(
        name=name,
    )
    self._from_config_file: Dict[str, Any] = {}
    with self.__suppress_configure_warnings__():
        self.configure(
            enable_cache=enable_cache,
            enable_artifact_metadata=enable_artifact_metadata,
            enable_artifact_visualization=enable_artifact_visualization,
            enable_step_logs=enable_step_logs,
            settings=settings,
            tags=tags,
            extra=extra,
            on_failure=on_failure,
            on_success=on_success,
            model=model,
            substitutions=substitutions,
        )
    self.entrypoint = entrypoint
    self._parameters: Dict[str, Any] = {}

    self.__suppress_warnings_flag__ = False
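
In practice this constructor is rarely called directly: the `@pipeline` decorator builds the `Pipeline` instance and forwards these arguments. A minimal sketch, assuming a project with ZenML installed (the pipeline name and option values are placeholders):

```
from zenml import pipeline


@pipeline(enable_cache=False, tags=["example"], extra={"owner": "ml-team"})
def my_pipeline() -> None:
    # Step calls would go here; they are recorded as invocations on the
    # Pipeline instance created by the decorator.
    ...
```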

__suppress_configure_warnings__()

Context manager to suppress warnings in Pipeline.configure(...).

Used to suppress warnings when called from inner code and not user-facing code.

Yields:

Type Description
Any

Nothing.

Source code in src/zenml/pipelines/pipeline_definition.py
@contextmanager
def __suppress_configure_warnings__(self) -> Iterator[Any]:
    """Context manager to suppress warnings in `Pipeline.configure(...)`.

    Used to suppress warnings when called from inner code and not user-facing code.

    Yields:
        Nothing.
    """
    self.__suppress_warnings_flag__ = True
    yield
    self.__suppress_warnings_flag__ = False

add_step_invocation(step, input_artifacts, external_artifacts, model_artifacts_or_metadata, client_lazy_loaders, parameters, default_parameters, upstream_steps, custom_id=None, allow_id_suffix=True)

Adds a step invocation to the pipeline.

Parameters:

Name Type Description Default
step BaseStep

The step for which to add an invocation.

required
input_artifacts Dict[str, StepArtifact]

The input artifacts for the invocation.

required
external_artifacts Dict[str, Union[ExternalArtifact, ArtifactVersionResponse]]

The external artifacts for the invocation.

required
model_artifacts_or_metadata Dict[str, ModelVersionDataLazyLoader]

The model artifacts or metadata for the invocation.

required
client_lazy_loaders Dict[str, ClientLazyLoader]

The client lazy loaders for the invocation.

required
parameters Dict[str, Any]

The parameters for the invocation.

required
default_parameters Dict[str, Any]

The default parameters for the invocation.

required
upstream_steps Set[str]

The upstream steps for the invocation.

required
custom_id Optional[str]

Custom ID to use for the invocation.

None
allow_id_suffix bool

Whether a suffix can be appended to the invocation ID.

True

Raises:

Type Description
RuntimeError

If the method is called on an inactive pipeline.

RuntimeError

If the invocation was called with an artifact from a different pipeline.

Returns:

Type Description
str

The step invocation ID.

Source code in src/zenml/pipelines/pipeline_definition.py
def add_step_invocation(
    self,
    step: "BaseStep",
    input_artifacts: Dict[str, StepArtifact],
    external_artifacts: Dict[
        str, Union["ExternalArtifact", "ArtifactVersionResponse"]
    ],
    model_artifacts_or_metadata: Dict[str, "ModelVersionDataLazyLoader"],
    client_lazy_loaders: Dict[str, "ClientLazyLoader"],
    parameters: Dict[str, Any],
    default_parameters: Dict[str, Any],
    upstream_steps: Set[str],
    custom_id: Optional[str] = None,
    allow_id_suffix: bool = True,
) -> str:
    """Adds a step invocation to the pipeline.

    Args:
        step: The step for which to add an invocation.
        input_artifacts: The input artifacts for the invocation.
        external_artifacts: The external artifacts for the invocation.
        model_artifacts_or_metadata: The model artifacts or metadata for
            the invocation.
        client_lazy_loaders: The client lazy loaders for the invocation.
        parameters: The parameters for the invocation.
        default_parameters: The default parameters for the invocation.
        upstream_steps: The upstream steps for the invocation.
        custom_id: Custom ID to use for the invocation.
        allow_id_suffix: Whether a suffix can be appended to the invocation
            ID.

    Raises:
        RuntimeError: If the method is called on an inactive pipeline.
        RuntimeError: If the invocation was called with an artifact from
            a different pipeline.

    Returns:
        The step invocation ID.
    """
    if Pipeline.ACTIVE_PIPELINE != self:
        raise RuntimeError(
            "A step invocation can only be added to an active pipeline."
        )

    for artifact in input_artifacts.values():
        if artifact.pipeline is not self:
            raise RuntimeError(
                "Got invalid input artifact for invocation of step "
                f"{step.name}: The input artifact was produced by a step "
                f"inside a different pipeline {artifact.pipeline.name}."
            )

    invocation_id = self._compute_invocation_id(
        step=step, custom_id=custom_id, allow_suffix=allow_id_suffix
    )
    invocation = StepInvocation(
        id=invocation_id,
        step=step,
        input_artifacts=input_artifacts,
        external_artifacts=external_artifacts,
        model_artifacts_or_metadata=model_artifacts_or_metadata,
        client_lazy_loaders=client_lazy_loaders,
        parameters=parameters,
        default_parameters=default_parameters,
        upstream_steps=upstream_steps,
        pipeline=self,
    )
    self._invocations[invocation_id] = invocation
    return invocation_id

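add_step_invocation is called for you whenever a step is invoked inside the pipeline entrypoint. The sketch below only illustrates the effect on invocation IDs; the exact suffix format is an implementation detail, so the printed IDs are indicative:

from zenml import pipeline, step


@step
def normalize() -> int:
    return 1


@pipeline
def twice() -> None:
    normalize()  # first invocation of the step
    normalize()  # second invocation gets a suffixed ID (allow_id_suffix=True)


twice.prepare()
print(list(twice.invocations))  # e.g. ['normalize', 'normalize_2']
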
build(settings=None, step_configurations=None, config_path=None)

Builds Docker images for the pipeline.

Parameters:

Name Type Description Default
settings Optional[Mapping[str, SettingsOrDict]]

Settings for the pipeline.

None
step_configurations Optional[Mapping[str, StepConfigurationUpdateOrDict]]

Configurations for steps of the pipeline.

None
config_path Optional[str]

Path to a yaml configuration file. This file will be parsed as a zenml.config.pipeline_configurations.PipelineRunConfiguration object. Options provided in this file will be overwritten by options provided in code using the other arguments of this method.

None

Returns:

Type Description
Optional[PipelineBuildResponse]

The build output.

Source code in src/zenml/pipelines/pipeline_definition.py
def build(
    self,
    settings: Optional[Mapping[str, "SettingsOrDict"]] = None,
    step_configurations: Optional[
        Mapping[str, "StepConfigurationUpdateOrDict"]
    ] = None,
    config_path: Optional[str] = None,
) -> Optional["PipelineBuildResponse"]:
    """Builds Docker images for the pipeline.

    Args:
        settings: Settings for the pipeline.
        step_configurations: Configurations for steps of the pipeline.
        config_path: Path to a yaml configuration file. This file will
            be parsed as a
            `zenml.config.pipeline_configurations.PipelineRunConfiguration`
            object. Options provided in this file will be overwritten by
            options provided in code using the other arguments of this
            method.

    Returns:
        The build output.
    """
    with track_handler(event=AnalyticsEvent.BUILD_PIPELINE):
        self._prepare_if_possible()

        compile_args = self._run_args.copy()
        compile_args.pop("unlisted", None)
        compile_args.pop("prevent_build_reuse", None)
        if config_path:
            compile_args["config_path"] = config_path
        if step_configurations:
            compile_args["step_configurations"] = step_configurations
        if settings:
            compile_args["settings"] = settings

        deployment, _, _ = self._compile(**compile_args)
        pipeline_id = self._register().id

        local_repo = code_repository_utils.find_active_code_repository()
        code_repository = build_utils.verify_local_repository_context(
            deployment=deployment, local_repo_context=local_repo
        )

        return build_utils.create_pipeline_build(
            deployment=deployment,
            pipeline_id=pipeline_id,
            code_repository=code_repository,
        )

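A sketch of how build is typically combined with with_options to reuse pre-built images for a later run. This is only relevant on stacks that actually build Docker images; on a purely local setup build returns None:

from zenml import pipeline, step


@step
def trainer() -> int:
    return 42


@pipeline
def training_pipeline() -> None:
    trainer()


build = training_pipeline.build()
if build is not None:
    # Reuse the registered build instead of rebuilding images for this run.
    training_pipeline.with_options(build=build.id)()
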
configure(enable_cache=None, enable_artifact_metadata=None, enable_artifact_visualization=None, enable_step_logs=None, settings=None, tags=None, extra=None, on_failure=None, on_success=None, model=None, parameters=None, merge=True, substitutions=None)

Configures the pipeline.

Configuration merging example:

* merge==True:
    pipeline.configure(extra={"key1": 1})
    pipeline.configure(extra={"key2": 2}, merge=True)
    pipeline.configuration.extra  # {"key1": 1, "key2": 2}
* merge==False:
    pipeline.configure(extra={"key1": 1})
    pipeline.configure(extra={"key2": 2}, merge=False)
    pipeline.configuration.extra  # {"key2": 2}

Parameters:

Name Type Description Default
enable_cache Optional[bool]

If caching should be enabled for this pipeline.

None
enable_artifact_metadata Optional[bool]

If artifact metadata should be enabled for this pipeline.

None
enable_artifact_visualization Optional[bool]

If artifact visualization should be enabled for this pipeline.

None
enable_step_logs Optional[bool]

If step logs should be enabled for this pipeline.

None
settings Optional[Mapping[str, SettingsOrDict]]

Settings for this pipeline.

None
tags Optional[List[str]]

Tags to apply to runs of this pipeline.

None
extra Optional[Dict[str, Any]]

Extra configurations for this pipeline.

None
on_failure Optional[HookSpecification]

Callback function invoked in the event of a step failure. Can be a function with a single argument of type BaseException, or a source path to such a function (e.g. module.my_function).

None
on_success Optional[HookSpecification]

Callback function invoked in the event of step success. Can be a function with no arguments, or a source path to such a function (e.g. module.my_function).

None
merge bool

If True, the given dictionary configurations, such as extra and settings, will be merged with existing configurations. If False, the given configurations will overwrite all existing ones. See the general description of this method for an example.

True
model Optional[Model]

Configuration of the model version in the Model Control Plane.

None
parameters Optional[Dict[str, Any]]

Input parameters for the pipeline.

None
substitutions Optional[Dict[str, str]]

Extra placeholders to use in the name templates.

None

Returns:

Type Description
Self

The pipeline instance that this method was called on.

Source code in src/zenml/pipelines/pipeline_definition.py
def configure(
    self,
    enable_cache: Optional[bool] = None,
    enable_artifact_metadata: Optional[bool] = None,
    enable_artifact_visualization: Optional[bool] = None,
    enable_step_logs: Optional[bool] = None,
    settings: Optional[Mapping[str, "SettingsOrDict"]] = None,
    tags: Optional[List[str]] = None,
    extra: Optional[Dict[str, Any]] = None,
    on_failure: Optional["HookSpecification"] = None,
    on_success: Optional["HookSpecification"] = None,
    model: Optional["Model"] = None,
    parameters: Optional[Dict[str, Any]] = None,
    merge: bool = True,
    substitutions: Optional[Dict[str, str]] = None,
) -> Self:
    """Configures the pipeline.

    Configuration merging example:
    * `merge==True`:
        pipeline.configure(extra={"key1": 1})
        pipeline.configure(extra={"key2": 2}, merge=True)
        pipeline.configuration.extra # {"key1": 1, "key2": 2}
    * `merge==False`:
        pipeline.configure(extra={"key1": 1})
        pipeline.configure(extra={"key2": 2}, merge=False)
        pipeline.configuration.extra # {"key2": 2}

    Args:
        enable_cache: If caching should be enabled for this pipeline.
        enable_artifact_metadata: If artifact metadata should be enabled for
            this pipeline.
        enable_artifact_visualization: If artifact visualization should be
            enabled for this pipeline.
        enable_step_logs: If step logs should be enabled for this pipeline.
        settings: settings for this pipeline.
        tags: Tags to apply to runs of this pipeline.
        extra: Extra configurations for this pipeline.
        on_failure: Callback function in event of failure of the step. Can
            be a function with a single argument of type `BaseException`, or
            a source path to such a function (e.g. `module.my_function`).
        on_success: Callback function in event of success of the step. Can
            be a function with no arguments, or a source path to such a
            function (e.g. `module.my_function`).
        merge: If `True`, will merge the given dictionary configurations
            like `extra` and `settings` with existing
            configurations. If `False` the given configurations will
            overwrite all existing ones. See the general description of this
            method for an example.
        model: configuration of the model version in the Model Control Plane.
        parameters: input parameters for the pipeline.
        substitutions: Extra placeholders to use in the name templates.

    Returns:
        The pipeline instance that this method was called on.
    """
    failure_hook_source = None
    if on_failure:
        # string of on_failure hook function to be used for this pipeline
        failure_hook_source = resolve_and_validate_hook(on_failure)

    success_hook_source = None
    if on_success:
        # string of on_success hook function to be used for this pipeline
        success_hook_source = resolve_and_validate_hook(on_success)

    if merge and tags and self._configuration.tags:
        # Merge tags explicitly here as the recursive update later only
        # merges dicts
        tags = self._configuration.tags + tags

    values = dict_utils.remove_none_values(
        {
            "enable_cache": enable_cache,
            "enable_artifact_metadata": enable_artifact_metadata,
            "enable_artifact_visualization": enable_artifact_visualization,
            "enable_step_logs": enable_step_logs,
            "settings": settings,
            "tags": tags,
            "extra": extra,
            "failure_hook_source": failure_hook_source,
            "success_hook_source": success_hook_source,
            "model": model,
            "parameters": parameters,
            "substitutions": substitutions,
        }
    )
    if not self.__suppress_warnings_flag__:
        to_be_reapplied = []
        for param_, value_ in values.items():
            if (
                param_ in PipelineRunConfiguration.model_fields
                and param_ in self._from_config_file
                and value_ != self._from_config_file[param_]
            ):
                to_be_reapplied.append(
                    (param_, self._from_config_file[param_], value_)
                )
        if to_be_reapplied:
            msg = ""
            reapply_during_run_warning = (
                "The value of parameter '{name}' has changed from "
                "'{file_value}' to '{new_value}' set in your configuration "
                "file.\n"
            )
            for name, file_value, new_value in to_be_reapplied:
                msg += reapply_during_run_warning.format(
                    name=name, file_value=file_value, new_value=new_value
                )
            msg += (
                "Configuration file value will be used during pipeline "
                "run, so you change will not be efficient. Consider "
                "updating your configuration file instead."
            )
            logger.warning(msg)

    config = PipelineConfigurationUpdate(**values)
    self._apply_configuration(config, merge=merge)
    return self

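The merge behavior from the docstring, as a runnable sketch:

from zenml import pipeline, step


@step
def trainer() -> int:
    return 42


@pipeline
def training_pipeline() -> None:
    trainer()


training_pipeline.configure(extra={"key1": 1})
training_pipeline.configure(extra={"key2": 2}, merge=True)
print(training_pipeline.configuration.extra)  # {'key1': 1, 'key2': 2}

training_pipeline.configure(extra={"key3": 3}, merge=False)
print(training_pipeline.configuration.extra)  # {'key3': 3}
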
copy()

Copies the pipeline.

Returns:

Type Description
Pipeline

The pipeline copy.

Source code in src/zenml/pipelines/pipeline_definition.py
def copy(self) -> "Pipeline":
    """Copies the pipeline.

    Returns:
        The pipeline copy.
    """
    return copy.deepcopy(self)

create_run_template(name, **kwargs)

Create a run template for the pipeline.

Parameters:

Name Type Description Default
name str

The name of the run template.

required
**kwargs Any

Keyword arguments for the client method to create a run template.

{}

Returns:

Type Description
RunTemplateResponse

The created run template.

Source code in src/zenml/pipelines/pipeline_definition.py
def create_run_template(
    self, name: str, **kwargs: Any
) -> RunTemplateResponse:
    """Create a run template for the pipeline.

    Args:
        name: The name of the run template.
        **kwargs: Keyword arguments for the client method to create a run
            template.

    Returns:
        The created run template.
    """
    self._prepare_if_possible()
    deployment = self._create_deployment(
        **self._run_args, skip_schedule_registration=True
    )

    return Client().create_run_template(
        name=name, deployment_id=deployment.id, **kwargs
    )

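A sketch of registering a run template for a pipeline. The template name is illustrative, and creating run templates generally requires a server-backed (remote) deployment:

from zenml import pipeline, step


@step
def trainer() -> int:
    return 42


@pipeline
def training_pipeline() -> None:
    trainer()


template = training_pipeline.create_run_template(name="nightly-training")
print(template.id)
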
log_pipeline_deployment_metadata(deployment_model) staticmethod

Displays logs based on the deployment model upon running a pipeline.

Parameters:

Name Type Description Default
deployment_model PipelineDeploymentResponse

The model for the pipeline deployment

required
Source code in src/zenml/pipelines/pipeline_definition.py
@staticmethod
def log_pipeline_deployment_metadata(
    deployment_model: PipelineDeploymentResponse,
) -> None:
    """Displays logs based on the deployment model upon running a pipeline.

    Args:
        deployment_model: The model for the pipeline deployment
    """
    try:
        # Log about the caching status
        if deployment_model.pipeline_configuration.enable_cache is False:
            logger.info(
                f"Caching is disabled by default for "
                f"`{deployment_model.pipeline_configuration.name}`."
            )

        # Log about the used builds
        if deployment_model.build:
            logger.info("Using a build:")
            logger.info(
                " Image(s): "
                f"{', '.join([i.image for i in deployment_model.build.images.values()])}"
            )

            # Log about version mismatches between local and build
            from zenml import __version__

            if deployment_model.build.zenml_version != __version__:
                logger.info(
                    f"ZenML version (different than the local version): "
                    f"{deployment_model.build.zenml_version}"
                )

            import platform

            if (
                deployment_model.build.python_version
                != platform.python_version()
            ):
                logger.info(
                    f"Python version (different than the local version): "
                    f"{deployment_model.build.python_version}"
                )

        # Log about the user, stack and components
        if deployment_model.user is not None:
            logger.info(f"Using user: `{deployment_model.user.name}`")

        if deployment_model.stack is not None:
            logger.info(f"Using stack: `{deployment_model.stack.name}`")

            for (
                component_type,
                component_models,
            ) in deployment_model.stack.components.items():
                logger.info(
                    f"  {component_type.value}: `{component_models[0].name}`"
                )
    except Exception as e:
        logger.debug(f"Logging pipeline deployment metadata failed: {e}")

prepare(*args, **kwargs)

Prepares the pipeline.

Parameters:

Name Type Description Default
*args Any

Pipeline entrypoint input arguments.

()
**kwargs Any

Pipeline entrypoint input keyword arguments.

{}

Raises:

Type Description
RuntimeError

If the pipeline has parameters configured differently in the configuration file and in code.

Source code in src/zenml/pipelines/pipeline_definition.py
    def prepare(self, *args: Any, **kwargs: Any) -> None:
        """Prepares the pipeline.

        Args:
            *args: Pipeline entrypoint input arguments.
            **kwargs: Pipeline entrypoint input keyword arguments.

        Raises:
            RuntimeError: If the pipeline has parameters configured differently in
                configuration file and code.
        """
        # Clear existing parameters and invocations
        self._parameters = {}
        self._invocations = {}

        conflicting_parameters = {}
        parameters_ = (self.configuration.parameters or {}).copy()
        if from_file_ := self._from_config_file.get("parameters", None):
            parameters_ = dict_utils.recursive_update(parameters_, from_file_)
        if parameters_:
            for k, v_runtime in kwargs.items():
                if k in parameters_:
                    v_config = parameters_[k]
                    if v_config != v_runtime:
                        conflicting_parameters[k] = (v_config, v_runtime)
            if conflicting_parameters:
                is_plural = "s" if len(conflicting_parameters) > 1 else ""
                msg = f"Configured parameter{is_plural} for the pipeline `{self.name}` conflict{'' if not is_plural else 's'} with parameter{is_plural} passed in runtime:\n"
                for key, values in conflicting_parameters.items():
                    msg += f"`{key}`: config=`{values[0]}` | runtime=`{values[1]}`\n"
                msg += """This happens, if you define values for pipeline parameters in configuration file and pass same parameters from the code. Example:
```
# config.yaml
    parameters:
        param_name: value1


# pipeline.py
@pipeline
def pipeline_(param_name: str):
    step_name()

if __name__=="__main__":
    pipeline_.with_options(config_path="config.yaml")(param_name="value2")
```
To avoid this, consider setting pipeline parameters in only one place (config or code).
"""
                raise RuntimeError(msg)
            for k, v_config in parameters_.items():
                if k not in kwargs:
                    kwargs[k] = v_config

        with self:
            # Enter the context manager, so we become the active pipeline. This
            # means that all steps that get called while the entrypoint function
            # is executed will be added as invocation to this pipeline instance.
            self._call_entrypoint(*args, **kwargs)

register()

Register the pipeline in the server.

Returns:

Type Description
PipelineResponse

The registered pipeline model.

Source code in src/zenml/pipelines/pipeline_definition.py
def register(self) -> "PipelineResponse":
    """Register the pipeline in the server.

    Returns:
        The registered pipeline model.
    """
    # Activating the built-in integrations to load all materializers
    from zenml.integrations.registry import integration_registry

    self._prepare_if_possible()
    integration_registry.activate_integrations()

    if self.configuration.model_dump(
        exclude_defaults=True, exclude={"name"}
    ):
        logger.warning(
            f"The pipeline `{self.name}` that you're registering has "
            "custom configurations applied to it. These will not be "
            "registered with the pipeline and won't be set when you build "
            "images or run the pipeline from the CLI. To provide these "
            "configurations, use the `--config` option of the `zenml "
            "pipeline build/run` commands."
        )

    return self._register()

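A sketch of registering a pipeline explicitly and fetching it back through the client. The registered pipeline name matches the entrypoint function name:

from zenml import pipeline, step
from zenml.client import Client


@step
def trainer() -> int:
    return 42


@pipeline
def training_pipeline() -> None:
    trainer()


response = training_pipeline.register()
print(response.id, response.name)

# The registered pipeline is also retrievable via the client.
print(Client().get_pipeline("training_pipeline").id)
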
resolve()

Resolves the pipeline.

Returns:

Type Description
Source

The pipeline source.

Source code in src/zenml/pipelines/pipeline_definition.py
def resolve(self) -> "Source":
    """Resolves the pipeline.

    Returns:
        The pipeline source.
    """
    return source_utils.resolve(self.entrypoint, skip_validation=True)

with_options(run_name=None, schedule=None, build=None, step_configurations=None, steps=None, config_path=None, unlisted=False, prevent_build_reuse=False, **kwargs)

Copies the pipeline and applies the given configurations.

Parameters:

Name Type Description Default
run_name Optional[str]

Name of the pipeline run.

None
schedule Optional[Schedule]

Optional schedule to use for the run.

None
build Union[str, UUID, PipelineBuildBase, None]

Optional build to use for the run.

None
step_configurations Optional[Mapping[str, StepConfigurationUpdateOrDict]]

Configurations for steps of the pipeline.

None
steps Optional[Mapping[str, StepConfigurationUpdateOrDict]]

Configurations for steps of the pipeline. This is equivalent to step_configurations, and will be ignored if step_configurations is set as well.

None
config_path Optional[str]

Path to a yaml configuration file. This file will be parsed as a zenml.config.pipeline_configurations.PipelineRunConfiguration object. Options provided in this file will be overwritten by options provided in code using the other arguments of this method.

None
unlisted bool

Whether the pipeline run should be unlisted (not assigned to any pipeline).

False
prevent_build_reuse bool

DEPRECATED: Use DockerSettings.prevent_build_reuse instead.

False
**kwargs Any

Pipeline configuration options. These will be passed to the pipeline.configure(...) method.

{}

Returns:

Type Description
Pipeline

The copied pipeline instance.

Source code in src/zenml/pipelines/pipeline_definition.py
def with_options(
    self,
    run_name: Optional[str] = None,
    schedule: Optional[Schedule] = None,
    build: Union[str, "UUID", "PipelineBuildBase", None] = None,
    step_configurations: Optional[
        Mapping[str, "StepConfigurationUpdateOrDict"]
    ] = None,
    steps: Optional[Mapping[str, "StepConfigurationUpdateOrDict"]] = None,
    config_path: Optional[str] = None,
    unlisted: bool = False,
    prevent_build_reuse: bool = False,
    **kwargs: Any,
) -> "Pipeline":
    """Copies the pipeline and applies the given configurations.

    Args:
        run_name: Name of the pipeline run.
        schedule: Optional schedule to use for the run.
        build: Optional build to use for the run.
        step_configurations: Configurations for steps of the pipeline.
        steps: Configurations for steps of the pipeline. This is equivalent
            to `step_configurations`, and will be ignored if
            `step_configurations` is set as well.
        config_path: Path to a yaml configuration file. This file will
            be parsed as a
            `zenml.config.pipeline_configurations.PipelineRunConfiguration`
            object. Options provided in this file will be overwritten by
            options provided in code using the other arguments of this
            method.
        unlisted: Whether the pipeline run should be unlisted (not assigned
            to any pipeline).
        prevent_build_reuse: DEPRECATED: Use
            `DockerSettings.prevent_build_reuse` instead.
        **kwargs: Pipeline configuration options. These will be passed
            to the `pipeline.configure(...)` method.

    Returns:
        The copied pipeline instance.
    """
    if steps and step_configurations:
        logger.warning(
            "Step configurations were passed using both the "
            "`step_configurations` and `steps` keywords, ignoring the "
            "values passed using the `steps` keyword."
        )

    pipeline_copy = self.copy()

    pipeline_copy._reconfigure_from_file_with_overrides(
        config_path=config_path, **kwargs
    )

    run_args = dict_utils.remove_none_values(
        {
            "run_name": run_name,
            "schedule": schedule,
            "build": build,
            "step_configurations": step_configurations or steps,
            "config_path": config_path,
            "unlisted": unlisted,
            "prevent_build_reuse": prevent_build_reuse,
        }
    )
    pipeline_copy._run_args.update(run_args)
    return pipeline_copy

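A sketch of applying overrides before a run. Values set in code are forwarded to pipeline.configure(...), while an optional YAML file is parsed as a PipelineRunConfiguration; the run name and file path below are placeholders:

from zenml import pipeline, step


@step
def trainer() -> int:
    return 42


@pipeline
def training_pipeline() -> None:
    trainer()


configured = training_pipeline.with_options(
    run_name="docs_example_run",
    enable_cache=False,               # forwarded to pipeline.configure(...)
    # config_path="run_config.yaml",  # optional: a PipelineRunConfiguration YAML file
)
configured()
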
write_run_configuration_template(path, stack=None)

Writes a run configuration yaml template.

Parameters:

Name Type Description Default
path str

The path where the template will be written.

required
stack Optional[Stack]

The stack for which the template should be generated. If not given, the active stack will be used.

None
Source code in src/zenml/pipelines/pipeline_definition.py
def write_run_configuration_template(
    self, path: str, stack: Optional["Stack"] = None
) -> None:
    """Writes a run configuration yaml template.

    Args:
        path: The path where the template will be written.
        stack: The stack for which the template should be generated. If
            not given, the active stack will be used.
    """
    from zenml.config.base_settings import ConfigurationLevel
    from zenml.config.step_configurations import (
        PartialArtifactConfiguration,
    )

    self._prepare_if_possible()

    stack = stack or Client().active_stack

    setting_classes = stack.setting_classes
    setting_classes.update(settings_utils.get_general_settings())

    pipeline_settings = {}
    step_settings = {}
    for key, setting_class in setting_classes.items():
        fields = pydantic_utils.TemplateGenerator(setting_class).run()
        if ConfigurationLevel.PIPELINE in setting_class.LEVEL:
            pipeline_settings[key] = fields
        if ConfigurationLevel.STEP in setting_class.LEVEL:
            step_settings[key] = fields

    steps = {}
    for step_name, invocation in self.invocations.items():
        step = invocation.step
        outputs = {
            name: PartialArtifactConfiguration()
            for name in step.entrypoint_definition.outputs
        }
        step_template = StepConfigurationUpdate(
            parameters={},
            settings=step_settings,
            outputs=outputs,
        )
        steps[step_name] = step_template

    run_config = PipelineRunConfiguration(
        settings=pipeline_settings, steps=steps
    )
    template = pydantic_utils.TemplateGenerator(run_config).run()
    yaml_string = yaml.dump(template)
    yaml_string = yaml_utils.comment_out_yaml(yaml_string)

    with open(path, "w") as f:
        f.write(yaml_string)

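A sketch of generating a fully commented-out YAML template for the active stack; the output path is a placeholder, and the resulting file can be trimmed down and passed back via config_path:

from zenml import pipeline, step


@step
def trainer() -> int:
    return 42


@pipeline
def training_pipeline() -> None:
    trainer()


training_pipeline.write_run_configuration_template(path="run_config_template.yaml")
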
PipelineBuildBase

Bases: BaseZenModel

Base model for pipeline builds.

Source code in src/zenml/models/v2/core/pipeline_build.py
class PipelineBuildBase(BaseZenModel):
    """Base model for pipeline builds."""

    images: Dict[str, BuildItem] = Field(
        default={}, title="The images of this build."
    )
    is_local: bool = Field(
        title="Whether the build images are stored in a container registry "
        "or locally.",
    )
    contains_code: bool = Field(
        title="Whether any image of the build contains user code.",
    )
    zenml_version: Optional[str] = Field(
        title="The version of ZenML used for this build.", default=None
    )
    python_version: Optional[str] = Field(
        title="The Python version used for this build.", default=None
    )

    # Helper methods
    @property
    def requires_code_download(self) -> bool:
        """Whether the build requires code download.

        Returns:
            Whether the build requires code download.
        """
        return any(
            item.requires_code_download for item in self.images.values()
        )

    @staticmethod
    def get_image_key(component_key: str, step: Optional[str] = None) -> str:
        """Get the image key.

        Args:
            component_key: The component key.
            step: The pipeline step for which the image was built.

        Returns:
            The image key.
        """
        if step:
            return f"{step}.{component_key}"
        else:
            return component_key

    def get_image(self, component_key: str, step: Optional[str] = None) -> str:
        """Get the image built for a specific key.

        Args:
            component_key: The key for which to get the image.
            step: The pipeline step for which to get the image. If no image
                exists for this step, will fall back to the pipeline image for
                the same key.

        Returns:
            The image name or digest.
        """
        return self._get_item(component_key=component_key, step=step).image

    def get_settings_checksum(
        self, component_key: str, step: Optional[str] = None
    ) -> Optional[str]:
        """Get the settings checksum for a specific key.

        Args:
            component_key: The key for which to get the checksum.
            step: The pipeline step for which to get the checksum. If no
                image exists for this step, will fall back to the pipeline image
                for the same key.

        Returns:
            The settings checksum.
        """
        return self._get_item(
            component_key=component_key, step=step
        ).settings_checksum

    def _get_item(
        self, component_key: str, step: Optional[str] = None
    ) -> "BuildItem":
        """Get the item for a specific key.

        Args:
            component_key: The key for which to get the item.
            step: The pipeline step for which to get the item. If no item
                exists for this step, will fall back to the item for
                the same key.

        Raises:
            KeyError: If no item exists for the given key.

        Returns:
            The build item.
        """
        if step:
            try:
                combined_key = self.get_image_key(
                    component_key=component_key, step=step
                )
                return self.images[combined_key]
            except KeyError:
                pass

        try:
            return self.images[component_key]
        except KeyError:
            raise KeyError(
                f"Unable to find image for key {component_key}. Available keys: "
                f"{set(self.images)}."
            )

requires_code_download property

Whether the build requires code download.

Returns:

Type Description
bool

Whether the build requires code download.

get_image(component_key, step=None)

Get the image built for a specific key.

Parameters:

Name Type Description Default
component_key str

The key for which to get the image.

required
step Optional[str]

The pipeline step for which to get the image. If no image exists for this step, will fall back to the pipeline image for the same key.

None

Returns:

Type Description
str

The image name or digest.

Source code in src/zenml/models/v2/core/pipeline_build.py
def get_image(self, component_key: str, step: Optional[str] = None) -> str:
    """Get the image built for a specific key.

    Args:
        component_key: The key for which to get the image.
        step: The pipeline step for which to get the image. If no image
            exists for this step, will fall back to the pipeline image for
            the same key.

    Returns:
        The image name or digest.
    """
    return self._get_item(component_key=component_key, step=step).image

get_image_key(component_key, step=None) staticmethod

Get the image key.

Parameters:

Name Type Description Default
component_key str

The component key.

required
step Optional[str]

The pipeline step for which the image was built.

None

Returns:

Type Description
str

The image key.

Source code in src/zenml/models/v2/core/pipeline_build.py
@staticmethod
def get_image_key(component_key: str, step: Optional[str] = None) -> str:
    """Get the image key.

    Args:
        component_key: The component key.
        step: The pipeline step for which the image was built.

    Returns:
        The image key.
    """
    if step:
        return f"{step}.{component_key}"
    else:
        return component_key

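Since get_image_key is a static method, the key scheme can be illustrated without constructing a build. The component key and step name below are examples, and the import path is assumed to be zenml.models:

from zenml.models import PipelineBuildBase

# Pipeline-level image key for a component.
print(PipelineBuildBase.get_image_key("orchestrator"))                  # orchestrator

# Step-specific image key: "<step>.<component_key>".
print(PipelineBuildBase.get_image_key("orchestrator", step="trainer"))  # trainer.orchestrator
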
get_settings_checksum(component_key, step=None)

Get the settings checksum for a specific key.

Parameters:

Name Type Description Default
component_key str

The key for which to get the checksum.

required
step Optional[str]

The pipeline step for which to get the checksum. If no image exists for this step, will fall back to the pipeline image for the same key.

None

Returns:

Type Description
Optional[str]

The settings checksum.

Source code in src/zenml/models/v2/core/pipeline_build.py
def get_settings_checksum(
    self, component_key: str, step: Optional[str] = None
) -> Optional[str]:
    """Get the settings checksum for a specific key.

    Args:
        component_key: The key for which to get the checksum.
        step: The pipeline step for which to get the checksum. If no
            image exists for this step, will fall back to the pipeline image
            for the same key.

    Returns:
        The settings checksum.
    """
    return self._get_item(
        component_key=component_key, step=step
    ).settings_checksum

PipelineBuildFilter

Bases: WorkspaceScopedFilter

Model to enable advanced filtering of all pipeline builds.

Source code in src/zenml/models/v2/core/pipeline_build.py
class PipelineBuildFilter(WorkspaceScopedFilter):
    """Model to enable advanced filtering of all pipeline builds."""

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *WorkspaceScopedFilter.FILTER_EXCLUDE_FIELDS,
        "container_registry_id",
    ]

    pipeline_id: Optional[Union[UUID, str]] = Field(
        description="Pipeline associated with the pipeline build.",
        default=None,
        union_mode="left_to_right",
    )
    stack_id: Optional[Union[UUID, str]] = Field(
        description="Stack associated with the pipeline build.",
        default=None,
        union_mode="left_to_right",
    )
    container_registry_id: Optional[Union[UUID, str]] = Field(
        description="Container registry associated with the pipeline build.",
        default=None,
        union_mode="left_to_right",
    )
    is_local: Optional[bool] = Field(
        description="Whether the build images are stored in a container "
        "registry or locally.",
        default=None,
    )
    contains_code: Optional[bool] = Field(
        description="Whether any image of the build contains user code.",
        default=None,
    )
    zenml_version: Optional[str] = Field(
        description="The version of ZenML used for this build.", default=None
    )
    python_version: Optional[str] = Field(
        description="The Python version used for this build.", default=None
    )
    checksum: Optional[str] = Field(
        description="The build checksum.", default=None
    )
    stack_checksum: Optional[str] = Field(
        description="The stack checksum.", default=None
    )

    def get_custom_filters(
        self,
        table: Type["AnySchema"],
    ) -> List["ColumnElement[bool]"]:
        """Get custom filters.

        Args:
            table: The query table.

        Returns:
            A list of custom filters.
        """
        custom_filters = super().get_custom_filters(table)

        from sqlmodel import and_

        from zenml.enums import StackComponentType
        from zenml.zen_stores.schemas import (
            PipelineBuildSchema,
            StackComponentSchema,
            StackCompositionSchema,
            StackSchema,
        )

        if self.container_registry_id:
            container_registry_filter = and_(
                PipelineBuildSchema.stack_id == StackSchema.id,
                StackSchema.id == StackCompositionSchema.stack_id,
                StackCompositionSchema.component_id == StackComponentSchema.id,
                StackComponentSchema.type
                == StackComponentType.CONTAINER_REGISTRY.value,
                StackComponentSchema.id == self.container_registry_id,
            )
            custom_filters.append(container_registry_filter)

        return custom_filters

get_custom_filters(table)

Get custom filters.

Parameters:

Name Type Description Default
table Type[AnySchema]

The query table.

required

Returns:

Type Description
List[ColumnElement[bool]]

A list of custom filters.

Source code in src/zenml/models/v2/core/pipeline_build.py
def get_custom_filters(
    self,
    table: Type["AnySchema"],
) -> List["ColumnElement[bool]"]:
    """Get custom filters.

    Args:
        table: The query table.

    Returns:
        A list of custom filters.
    """
    custom_filters = super().get_custom_filters(table)

    from sqlmodel import and_

    from zenml.enums import StackComponentType
    from zenml.zen_stores.schemas import (
        PipelineBuildSchema,
        StackComponentSchema,
        StackCompositionSchema,
        StackSchema,
    )

    if self.container_registry_id:
        container_registry_filter = and_(
            PipelineBuildSchema.stack_id == StackSchema.id,
            StackSchema.id == StackCompositionSchema.stack_id,
            StackCompositionSchema.component_id == StackComponentSchema.id,
            StackComponentSchema.type
            == StackComponentType.CONTAINER_REGISTRY.value,
            StackComponentSchema.id == self.container_registry_id,
        )
        custom_filters.append(container_registry_filter)

    return custom_filters

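A sketch of using these filter fields through the client. The keyword arguments are assumed to be forwarded to PipelineBuildFilter, and the sort expression follows the usual "<direction>:<column>" convention:

from zenml.client import Client

builds = Client().list_builds(
    is_local=False,          # only builds pushed to a container registry
    contains_code=True,      # only builds that include user code
    sort_by="desc:created",
)
for build in builds.items:
    print(build.id)
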
PipelineFilter

Bases: WorkspaceScopedFilter, TaggableFilter

Pipeline filter model.

Source code in src/zenml/models/v2/core/pipeline.py
class PipelineFilter(WorkspaceScopedFilter, TaggableFilter):
    """Pipeline filter model."""

    CUSTOM_SORTING_OPTIONS: ClassVar[List[str]] = [
        *WorkspaceScopedFilter.CUSTOM_SORTING_OPTIONS,
        *TaggableFilter.CUSTOM_SORTING_OPTIONS,
        SORT_PIPELINES_BY_LATEST_RUN_KEY,
    ]
    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *WorkspaceScopedFilter.FILTER_EXCLUDE_FIELDS,
        *TaggableFilter.FILTER_EXCLUDE_FIELDS,
        "latest_run_status",
    ]
    CLI_EXCLUDE_FIELDS = [
        *WorkspaceScopedFilter.CLI_EXCLUDE_FIELDS,
        *TaggableFilter.CLI_EXCLUDE_FIELDS,
    ]

    name: Optional[str] = Field(
        default=None,
        description="Name of the Pipeline",
    )
    latest_run_status: Optional[str] = Field(
        default=None,
        description="Filter by the status of the latest run of a pipeline. "
        "This will always be applied as an `AND` filter for now.",
    )

    def apply_filter(
        self, query: AnyQuery, table: Type["AnySchema"]
    ) -> AnyQuery:
        """Applies the filter to a query.

        Args:
            query: The query to which to apply the filter.
            table: The query table.

        Returns:
            The query with filter applied.
        """
        query = super().apply_filter(query, table)

        from sqlmodel import and_, col, func, select

        from zenml.zen_stores.schemas import PipelineRunSchema, PipelineSchema

        if self.latest_run_status:
            latest_pipeline_run_subquery = (
                select(
                    PipelineRunSchema.pipeline_id,
                    func.max(PipelineRunSchema.created).label("created"),
                )
                .where(col(PipelineRunSchema.pipeline_id).is_not(None))
                .group_by(col(PipelineRunSchema.pipeline_id))
                .subquery()
            )

            query = (
                query.join(
                    PipelineRunSchema,
                    PipelineSchema.id == PipelineRunSchema.pipeline_id,
                )
                .join(
                    latest_pipeline_run_subquery,
                    and_(
                        PipelineRunSchema.pipeline_id
                        == latest_pipeline_run_subquery.c.pipeline_id,
                        PipelineRunSchema.created
                        == latest_pipeline_run_subquery.c.created,
                    ),
                )
                .where(
                    self.generate_custom_query_conditions_for_column(
                        value=self.latest_run_status,
                        table=PipelineRunSchema,
                        column="status",
                    )
                )
            )

        return query

    def apply_sorting(
        self,
        query: AnyQuery,
        table: Type["AnySchema"],
    ) -> AnyQuery:
        """Apply sorting to the query.

        Args:
            query: The query to which to apply the sorting.
            table: The query table.

        Returns:
            The query with sorting applied.
        """
        from sqlmodel import asc, case, col, desc, func, select

        from zenml.enums import SorterOps
        from zenml.zen_stores.schemas import PipelineRunSchema, PipelineSchema

        sort_by, operand = self.sorting_params

        if sort_by == SORT_PIPELINES_BY_LATEST_RUN_KEY:
            # Subquery to find the latest run per pipeline
            latest_run_subquery = (
                select(
                    PipelineSchema.id,
                    case(
                        (
                            func.max(PipelineRunSchema.created).is_(None),
                            PipelineSchema.created,
                        ),
                        else_=func.max(PipelineRunSchema.created),
                    ).label("latest_run"),
                )
                .outerjoin(
                    PipelineRunSchema,
                    PipelineSchema.id == PipelineRunSchema.pipeline_id,  # type: ignore[arg-type]
                )
                .group_by(col(PipelineSchema.id))
                .subquery()
            )

            query = query.add_columns(
                latest_run_subquery.c.latest_run,
            ).where(PipelineSchema.id == latest_run_subquery.c.id)

            if operand == SorterOps.ASCENDING:
                query = query.order_by(
                    asc(latest_run_subquery.c.latest_run),
                    asc(PipelineSchema.id),
                )
            else:
                query = query.order_by(
                    desc(latest_run_subquery.c.latest_run),
                    desc(PipelineSchema.id),
                )
            return query
        else:
            return super().apply_sorting(query=query, table=table)

apply_filter(query, table)

Applies the filter to a query.

Parameters:

Name Type Description Default
query AnyQuery

The query to which to apply the filter.

required
table Type[AnySchema]

The query table.

required

Returns:

Type Description
AnyQuery

The query with filter applied.

Source code in src/zenml/models/v2/core/pipeline.py
def apply_filter(
    self, query: AnyQuery, table: Type["AnySchema"]
) -> AnyQuery:
    """Applies the filter to a query.

    Args:
        query: The query to which to apply the filter.
        table: The query table.

    Returns:
        The query with filter applied.
    """
    query = super().apply_filter(query, table)

    from sqlmodel import and_, col, func, select

    from zenml.zen_stores.schemas import PipelineRunSchema, PipelineSchema

    if self.latest_run_status:
        latest_pipeline_run_subquery = (
            select(
                PipelineRunSchema.pipeline_id,
                func.max(PipelineRunSchema.created).label("created"),
            )
            .where(col(PipelineRunSchema.pipeline_id).is_not(None))
            .group_by(col(PipelineRunSchema.pipeline_id))
            .subquery()
        )

        query = (
            query.join(
                PipelineRunSchema,
                PipelineSchema.id == PipelineRunSchema.pipeline_id,
            )
            .join(
                latest_pipeline_run_subquery,
                and_(
                    PipelineRunSchema.pipeline_id
                    == latest_pipeline_run_subquery.c.pipeline_id,
                    PipelineRunSchema.created
                    == latest_pipeline_run_subquery.c.created,
                ),
            )
            .where(
                self.generate_custom_query_conditions_for_column(
                    value=self.latest_run_status,
                    table=PipelineRunSchema,
                    column="status",
                )
            )
        )

    return query

apply_sorting(query, table)

Apply sorting to the query.

Parameters:

Name Type Description Default
query AnyQuery

The query to which to apply the sorting.

required
table Type[AnySchema]

The query table.

required

Returns:

Type Description
AnyQuery

The query with sorting applied.

Source code in src/zenml/models/v2/core/pipeline.py
def apply_sorting(
    self,
    query: AnyQuery,
    table: Type["AnySchema"],
) -> AnyQuery:
    """Apply sorting to the query.

    Args:
        query: The query to which to apply the sorting.
        table: The query table.

    Returns:
        The query with sorting applied.
    """
    from sqlmodel import asc, case, col, desc, func, select

    from zenml.enums import SorterOps
    from zenml.zen_stores.schemas import PipelineRunSchema, PipelineSchema

    sort_by, operand = self.sorting_params

    if sort_by == SORT_PIPELINES_BY_LATEST_RUN_KEY:
        # Subquery to find the latest run per pipeline
        latest_run_subquery = (
            select(
                PipelineSchema.id,
                case(
                    (
                        func.max(PipelineRunSchema.created).is_(None),
                        PipelineSchema.created,
                    ),
                    else_=func.max(PipelineRunSchema.created),
                ).label("latest_run"),
            )
            .outerjoin(
                PipelineRunSchema,
                PipelineSchema.id == PipelineRunSchema.pipeline_id,  # type: ignore[arg-type]
            )
            .group_by(col(PipelineSchema.id))
            .subquery()
        )

        query = query.add_columns(
            latest_run_subquery.c.latest_run,
        ).where(PipelineSchema.id == latest_run_subquery.c.id)

        if operand == SorterOps.ASCENDING:
            query = query.order_by(
                asc(latest_run_subquery.c.latest_run),
                asc(PipelineSchema.id),
            )
        else:
            query = query.order_by(
                desc(latest_run_subquery.c.latest_run),
                desc(PipelineSchema.id),
            )
        return query
    else:
        return super().apply_sorting(query=query, table=table)

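A sketch of the latest_run_status filter through the client. The keyword arguments are assumed to map onto the PipelineFilter fields shown above:

from zenml.client import Client

failed_pipelines = Client().list_pipelines(
    latest_run_status="failed",  # status of each pipeline's most recent run
    sort_by="desc:created",
)
for pipeline_model in failed_pipelines.items:
    print(pipeline_model.name)
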
PipelineRunFilter

Bases: WorkspaceScopedFilter, TaggableFilter

Model to enable advanced filtering of all pipeline runs.

Source code in src/zenml/models/v2/core/pipeline_run.py
class PipelineRunFilter(WorkspaceScopedFilter, TaggableFilter):
    """Model to enable advanced filtering of all Workspaces."""

    CUSTOM_SORTING_OPTIONS: ClassVar[List[str]] = [
        *WorkspaceScopedFilter.CUSTOM_SORTING_OPTIONS,
        *TaggableFilter.CUSTOM_SORTING_OPTIONS,
        "tag",
        "stack",
        "pipeline",
        "model",
        "model_version",
    ]
    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *WorkspaceScopedFilter.FILTER_EXCLUDE_FIELDS,
        *TaggableFilter.FILTER_EXCLUDE_FIELDS,
        "unlisted",
        "code_repository_id",
        "build_id",
        "schedule_id",
        "stack_id",
        "template_id",
        "pipeline",
        "stack",
        "code_repository",
        "model",
        "stack_component",
        "pipeline_name",
        "templatable",
        "run_metadata",
    ]
    CLI_EXCLUDE_FIELDS = [
        *WorkspaceScopedFilter.CLI_EXCLUDE_FIELDS,
        *TaggableFilter.CLI_EXCLUDE_FIELDS,
    ]

    name: Optional[str] = Field(
        default=None,
        description="Name of the Pipeline Run",
    )
    orchestrator_run_id: Optional[str] = Field(
        default=None,
        description="Name of the Pipeline Run within the orchestrator",
    )
    pipeline_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Pipeline associated with the Pipeline Run",
        union_mode="left_to_right",
    )
    stack_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Stack used for the Pipeline Run",
        union_mode="left_to_right",
    )
    schedule_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Schedule that triggered the Pipeline Run",
        union_mode="left_to_right",
    )
    build_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Build used for the Pipeline Run",
        union_mode="left_to_right",
    )
    deployment_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Deployment used for the Pipeline Run",
        union_mode="left_to_right",
    )
    code_repository_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Code repository used for the Pipeline Run",
        union_mode="left_to_right",
    )
    template_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Template used for the pipeline run.",
        union_mode="left_to_right",
    )
    model_version_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Model version associated with the pipeline run.",
        union_mode="left_to_right",
    )
    status: Optional[str] = Field(
        default=None,
        description="Name of the Pipeline Run",
    )
    start_time: Optional[Union[datetime, str]] = Field(
        default=None,
        description="Start time for this run",
        union_mode="left_to_right",
    )
    end_time: Optional[Union[datetime, str]] = Field(
        default=None,
        description="End time for this run",
        union_mode="left_to_right",
    )
    unlisted: Optional[bool] = None
    run_metadata: Optional[Dict[str, Any]] = Field(
        default=None,
        description="The run_metadata to filter the pipeline runs by.",
    )
    # TODO: Remove once frontend is ready for it. This is replaced by the more
    #   generic `pipeline` filter below.
    pipeline_name: Optional[str] = Field(
        default=None,
        description="Name of the pipeline associated with the run",
    )
    pipeline: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of the pipeline associated with the run.",
    )
    stack: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of the stack associated with the run.",
    )
    code_repository: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of the code repository associated with the run.",
    )
    model: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of the model associated with the run.",
    )
    stack_component: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Name/ID of the stack component associated with the run.",
    )
    templatable: Optional[bool] = Field(
        default=None, description="Whether the run is templatable."
    )
    model_config = ConfigDict(protected_namespaces=())

    def get_custom_filters(
        self,
        table: Type["AnySchema"],
    ) -> List["ColumnElement[bool]"]:
        """Get custom filters.

        Args:
            table: The query table.

        Returns:
            A list of custom filters.
        """
        custom_filters = super().get_custom_filters(table)

        from sqlmodel import and_, col, or_

        from zenml.zen_stores.schemas import (
            CodeReferenceSchema,
            CodeRepositorySchema,
            ModelSchema,
            ModelVersionSchema,
            PipelineBuildSchema,
            PipelineDeploymentSchema,
            PipelineRunSchema,
            PipelineSchema,
            RunMetadataResourceSchema,
            RunMetadataSchema,
            ScheduleSchema,
            StackComponentSchema,
            StackCompositionSchema,
            StackSchema,
        )

        if self.unlisted is not None:
            if self.unlisted is True:
                unlisted_filter = PipelineRunSchema.pipeline_id.is_(None)  # type: ignore[union-attr]
            else:
                unlisted_filter = PipelineRunSchema.pipeline_id.is_not(None)  # type: ignore[union-attr]
            custom_filters.append(unlisted_filter)

        if self.code_repository_id:
            code_repo_filter = and_(
                PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.code_reference_id
                == CodeReferenceSchema.id,
                CodeReferenceSchema.code_repository_id
                == self.code_repository_id,
            )
            custom_filters.append(code_repo_filter)

        if self.stack_id:
            stack_filter = and_(
                PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.stack_id == StackSchema.id,
                StackSchema.id == self.stack_id,
            )
            custom_filters.append(stack_filter)

        if self.schedule_id:
            schedule_filter = and_(
                PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.schedule_id == ScheduleSchema.id,
                ScheduleSchema.id == self.schedule_id,
            )
            custom_filters.append(schedule_filter)

        if self.build_id:
            pipeline_build_filter = and_(
                PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.build_id == PipelineBuildSchema.id,
                PipelineBuildSchema.id == self.build_id,
            )
            custom_filters.append(pipeline_build_filter)

        if self.template_id:
            run_template_filter = and_(
                PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.template_id == self.template_id,
            )
            custom_filters.append(run_template_filter)

        if self.pipeline:
            pipeline_filter = and_(
                PipelineRunSchema.pipeline_id == PipelineSchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.pipeline, table=PipelineSchema
                ),
            )
            custom_filters.append(pipeline_filter)

        if self.stack:
            stack_filter = and_(
                PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.stack_id == StackSchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.stack,
                    table=StackSchema,
                ),
            )
            custom_filters.append(stack_filter)

        if self.code_repository:
            code_repo_filter = and_(
                PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.code_reference_id
                == CodeReferenceSchema.id,
                CodeReferenceSchema.code_repository_id
                == CodeRepositorySchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.code_repository,
                    table=CodeRepositorySchema,
                ),
            )
            custom_filters.append(code_repo_filter)

        if self.model:
            model_filter = and_(
                PipelineRunSchema.model_version_id == ModelVersionSchema.id,
                ModelVersionSchema.model_id == ModelSchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.model, table=ModelSchema
                ),
            )
            custom_filters.append(model_filter)

        if self.stack_component:
            component_filter = and_(
                PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.stack_id == StackSchema.id,
                StackSchema.id == StackCompositionSchema.stack_id,
                StackCompositionSchema.component_id == StackComponentSchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.stack_component,
                    table=StackComponentSchema,
                ),
            )
            custom_filters.append(component_filter)

        if self.pipeline_name:
            pipeline_name_filter = and_(
                PipelineRunSchema.pipeline_id == PipelineSchema.id,
                self.generate_custom_query_conditions_for_column(
                    value=self.pipeline_name,
                    table=PipelineSchema,
                    column="name",
                ),
            )
            custom_filters.append(pipeline_name_filter)

        if self.templatable is not None:
            if self.templatable is True:
                templatable_filter = and_(
                    # The following condition is not perfect as it does not
                    # consider stacks with custom flavor components or local
                    # components, but the best we can do currently with our
                    # table columns.
                    PipelineRunSchema.deployment_id
                    == PipelineDeploymentSchema.id,
                    PipelineDeploymentSchema.build_id
                    == PipelineBuildSchema.id,
                    col(PipelineBuildSchema.is_local).is_(False),
                    col(PipelineBuildSchema.stack_id).is_not(None),
                )
            else:
                templatable_filter = or_(
                    col(PipelineRunSchema.deployment_id).is_(None),
                    and_(
                        PipelineRunSchema.deployment_id
                        == PipelineDeploymentSchema.id,
                        col(PipelineDeploymentSchema.build_id).is_(None),
                    ),
                    and_(
                        PipelineRunSchema.deployment_id
                        == PipelineDeploymentSchema.id,
                        PipelineDeploymentSchema.build_id
                        == PipelineBuildSchema.id,
                        or_(
                            col(PipelineBuildSchema.is_local).is_(True),
                            col(PipelineBuildSchema.stack_id).is_(None),
                        ),
                    ),
                )

            custom_filters.append(templatable_filter)
        if self.run_metadata is not None:
            from zenml.enums import MetadataResourceTypes

            for key, value in self.run_metadata.items():
                additional_filter = and_(
                    RunMetadataResourceSchema.resource_id
                    == PipelineRunSchema.id,
                    RunMetadataResourceSchema.resource_type
                    == MetadataResourceTypes.PIPELINE_RUN.value,
                    RunMetadataResourceSchema.run_metadata_id
                    == RunMetadataSchema.id,
                    self.generate_custom_query_conditions_for_column(
                        value=key,
                        table=RunMetadataSchema,
                        column="key",
                    ),
                    self.generate_custom_query_conditions_for_column(
                        value=value,
                        table=RunMetadataSchema,
                        column="value",
                        json_encode_value=True,
                    ),
                )
                custom_filters.append(additional_filter)

        return custom_filters

    def apply_sorting(
        self,
        query: AnyQuery,
        table: Type["AnySchema"],
    ) -> AnyQuery:
        """Apply sorting to the query.

        Args:
            query: The query to which to apply the sorting.
            table: The query table.

        Returns:
            The query with sorting applied.
        """
        from sqlmodel import asc, desc

        from zenml.enums import SorterOps
        from zenml.zen_stores.schemas import (
            ModelSchema,
            ModelVersionSchema,
            PipelineDeploymentSchema,
            PipelineRunSchema,
            PipelineSchema,
            StackSchema,
        )

        sort_by, operand = self.sorting_params

        if sort_by == "pipeline":
            query = query.outerjoin(
                PipelineSchema,
                PipelineRunSchema.pipeline_id == PipelineSchema.id,
            )
            column = PipelineSchema.name
        elif sort_by == "stack":
            query = query.outerjoin(
                PipelineDeploymentSchema,
                PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
            ).outerjoin(
                StackSchema,
                PipelineDeploymentSchema.stack_id == StackSchema.id,
            )
            column = StackSchema.name
        elif sort_by == "model":
            query = query.outerjoin(
                ModelVersionSchema,
                PipelineRunSchema.model_version_id == ModelVersionSchema.id,
            ).outerjoin(
                ModelSchema,
                ModelVersionSchema.model_id == ModelSchema.id,
            )
            column = ModelSchema.name
        elif sort_by == "model_version":
            query = query.outerjoin(
                ModelVersionSchema,
                PipelineRunSchema.model_version_id == ModelVersionSchema.id,
            )
            column = ModelVersionSchema.name
        else:
            return super().apply_sorting(query=query, table=table)

        query = query.add_columns(column)

        if operand == SorterOps.ASCENDING:
            query = query.order_by(asc(column))
        else:
            query = query.order_by(desc(column))

        return query
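
A minimal usage sketch, assuming PipelineRunFilter is re-exported from zenml.models and that sort_by is inherited from the base filter class; the names and values below are placeholders:

from zenml.models import PipelineRunFilter

# Filter runs of a given pipeline on a given stack, newest first.
run_filter = PipelineRunFilter(
    pipeline="training_pipeline",  # name or ID, resolved in get_custom_filters()
    stack="local_stack",           # name or ID of the stack used for the run
    status="completed",
    unlisted=False,                # only runs that belong to a pipeline
    sort_by="desc:created",
)

# The filter model is typically consumed by the client/store layer, e.g.
# (assumed signature): Client().zen_store.list_runs(runs_filter_model=run_filter)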

apply_sorting(query, table)

Apply sorting to the query.

Parameters:

Name Type Description Default
query AnyQuery

The query to which to apply the sorting.

required
table Type[AnySchema]

The query table.

required

Returns:

Type Description
AnyQuery

The query with sorting applied.

Source code in src/zenml/models/v2/core/pipeline_run.py
def apply_sorting(
    self,
    query: AnyQuery,
    table: Type["AnySchema"],
) -> AnyQuery:
    """Apply sorting to the query.

    Args:
        query: The query to which to apply the sorting.
        table: The query table.

    Returns:
        The query with sorting applied.
    """
    from sqlmodel import asc, desc

    from zenml.enums import SorterOps
    from zenml.zen_stores.schemas import (
        ModelSchema,
        ModelVersionSchema,
        PipelineDeploymentSchema,
        PipelineRunSchema,
        PipelineSchema,
        StackSchema,
    )

    sort_by, operand = self.sorting_params

    if sort_by == "pipeline":
        query = query.outerjoin(
            PipelineSchema,
            PipelineRunSchema.pipeline_id == PipelineSchema.id,
        )
        column = PipelineSchema.name
    elif sort_by == "stack":
        query = query.outerjoin(
            PipelineDeploymentSchema,
            PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
        ).outerjoin(
            StackSchema,
            PipelineDeploymentSchema.stack_id == StackSchema.id,
        )
        column = StackSchema.name
    elif sort_by == "model":
        query = query.outerjoin(
            ModelVersionSchema,
            PipelineRunSchema.model_version_id == ModelVersionSchema.id,
        ).outerjoin(
            ModelSchema,
            ModelVersionSchema.model_id == ModelSchema.id,
        )
        column = ModelSchema.name
    elif sort_by == "model_version":
        query = query.outerjoin(
            ModelVersionSchema,
            PipelineRunSchema.model_version_id == ModelVersionSchema.id,
        )
        column = ModelVersionSchema.name
    else:
        return super().apply_sorting(query=query, table=table)

    query = query.add_columns(column)

    if operand == SorterOps.ASCENDING:
        query = query.order_by(asc(column))
    else:
        query = query.order_by(desc(column))

    return query
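
For example, the custom sort keys registered in CUSTOM_SORTING_OPTIONS can be requested through the usual sort_by syntax (the "asc:"/"desc:" prefix convention is assumed to come from the base filter class):

from zenml.models import PipelineRunFilter

by_pipeline = PipelineRunFilter(sort_by="desc:pipeline")           # joined pipeline name
by_model_version = PipelineRunFilter(sort_by="asc:model_version")  # joined model version name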

get_custom_filters(table)

Get custom filters.

Parameters:

Name Type Description Default
table Type[AnySchema]

The query table.

required

Returns:

Type Description
List[ColumnElement[bool]]

A list of custom filters.

Source code in src/zenml/models/v2/core/pipeline_run.py
def get_custom_filters(
    self,
    table: Type["AnySchema"],
) -> List["ColumnElement[bool]"]:
    """Get custom filters.

    Args:
        table: The query table.

    Returns:
        A list of custom filters.
    """
    custom_filters = super().get_custom_filters(table)

    from sqlmodel import and_, col, or_

    from zenml.zen_stores.schemas import (
        CodeReferenceSchema,
        CodeRepositorySchema,
        ModelSchema,
        ModelVersionSchema,
        PipelineBuildSchema,
        PipelineDeploymentSchema,
        PipelineRunSchema,
        PipelineSchema,
        RunMetadataResourceSchema,
        RunMetadataSchema,
        ScheduleSchema,
        StackComponentSchema,
        StackCompositionSchema,
        StackSchema,
    )

    if self.unlisted is not None:
        if self.unlisted is True:
            unlisted_filter = PipelineRunSchema.pipeline_id.is_(None)  # type: ignore[union-attr]
        else:
            unlisted_filter = PipelineRunSchema.pipeline_id.is_not(None)  # type: ignore[union-attr]
        custom_filters.append(unlisted_filter)

    if self.code_repository_id:
        code_repo_filter = and_(
            PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
            PipelineDeploymentSchema.code_reference_id
            == CodeReferenceSchema.id,
            CodeReferenceSchema.code_repository_id
            == self.code_repository_id,
        )
        custom_filters.append(code_repo_filter)

    if self.stack_id:
        stack_filter = and_(
            PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
            PipelineDeploymentSchema.stack_id == StackSchema.id,
            StackSchema.id == self.stack_id,
        )
        custom_filters.append(stack_filter)

    if self.schedule_id:
        schedule_filter = and_(
            PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
            PipelineDeploymentSchema.schedule_id == ScheduleSchema.id,
            ScheduleSchema.id == self.schedule_id,
        )
        custom_filters.append(schedule_filter)

    if self.build_id:
        pipeline_build_filter = and_(
            PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
            PipelineDeploymentSchema.build_id == PipelineBuildSchema.id,
            PipelineBuildSchema.id == self.build_id,
        )
        custom_filters.append(pipeline_build_filter)

    if self.template_id:
        run_template_filter = and_(
            PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
            PipelineDeploymentSchema.template_id == self.template_id,
        )
        custom_filters.append(run_template_filter)

    if self.pipeline:
        pipeline_filter = and_(
            PipelineRunSchema.pipeline_id == PipelineSchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.pipeline, table=PipelineSchema
            ),
        )
        custom_filters.append(pipeline_filter)

    if self.stack:
        stack_filter = and_(
            PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
            PipelineDeploymentSchema.stack_id == StackSchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.stack,
                table=StackSchema,
            ),
        )
        custom_filters.append(stack_filter)

    if self.code_repository:
        code_repo_filter = and_(
            PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
            PipelineDeploymentSchema.code_reference_id
            == CodeReferenceSchema.id,
            CodeReferenceSchema.code_repository_id
            == CodeRepositorySchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.code_repository,
                table=CodeRepositorySchema,
            ),
        )
        custom_filters.append(code_repo_filter)

    if self.model:
        model_filter = and_(
            PipelineRunSchema.model_version_id == ModelVersionSchema.id,
            ModelVersionSchema.model_id == ModelSchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.model, table=ModelSchema
            ),
        )
        custom_filters.append(model_filter)

    if self.stack_component:
        component_filter = and_(
            PipelineRunSchema.deployment_id == PipelineDeploymentSchema.id,
            PipelineDeploymentSchema.stack_id == StackSchema.id,
            StackSchema.id == StackCompositionSchema.stack_id,
            StackCompositionSchema.component_id == StackComponentSchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.stack_component,
                table=StackComponentSchema,
            ),
        )
        custom_filters.append(component_filter)

    if self.pipeline_name:
        pipeline_name_filter = and_(
            PipelineRunSchema.pipeline_id == PipelineSchema.id,
            self.generate_custom_query_conditions_for_column(
                value=self.pipeline_name,
                table=PipelineSchema,
                column="name",
            ),
        )
        custom_filters.append(pipeline_name_filter)

    if self.templatable is not None:
        if self.templatable is True:
            templatable_filter = and_(
                # The following condition is not perfect as it does not
                # consider stacks with custom flavor components or local
                # components, but the best we can do currently with our
                # table columns.
                PipelineRunSchema.deployment_id
                == PipelineDeploymentSchema.id,
                PipelineDeploymentSchema.build_id
                == PipelineBuildSchema.id,
                col(PipelineBuildSchema.is_local).is_(False),
                col(PipelineBuildSchema.stack_id).is_not(None),
            )
        else:
            templatable_filter = or_(
                col(PipelineRunSchema.deployment_id).is_(None),
                and_(
                    PipelineRunSchema.deployment_id
                    == PipelineDeploymentSchema.id,
                    col(PipelineDeploymentSchema.build_id).is_(None),
                ),
                and_(
                    PipelineRunSchema.deployment_id
                    == PipelineDeploymentSchema.id,
                    PipelineDeploymentSchema.build_id
                    == PipelineBuildSchema.id,
                    or_(
                        col(PipelineBuildSchema.is_local).is_(True),
                        col(PipelineBuildSchema.stack_id).is_(None),
                    ),
                ),
            )

        custom_filters.append(templatable_filter)
    if self.run_metadata is not None:
        from zenml.enums import MetadataResourceTypes

        for key, value in self.run_metadata.items():
            additional_filter = and_(
                RunMetadataResourceSchema.resource_id
                == PipelineRunSchema.id,
                RunMetadataResourceSchema.resource_type
                == MetadataResourceTypes.PIPELINE_RUN.value,
                RunMetadataResourceSchema.run_metadata_id
                == RunMetadataSchema.id,
                self.generate_custom_query_conditions_for_column(
                    value=key,
                    table=RunMetadataSchema,
                    column="key",
                ),
                self.generate_custom_query_conditions_for_column(
                    value=value,
                    table=RunMetadataSchema,
                    column="value",
                    json_encode_value=True,
                ),
            )
            custom_filters.append(additional_filter)

    return custom_filters
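
As a sketch of the run_metadata and templatable handling above ("accuracy" is a hypothetical metadata key, and the "gte:" prefix is assumed to follow the generic filter-operator syntax used by generate_custom_query_conditions_for_column):

from zenml.models import PipelineRunFilter

metadata_filter = PipelineRunFilter(run_metadata={"accuracy": "gte:0.9"})
templatable_runs = PipelineRunFilter(templatable=True)  # non-local build with a stack attached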

ScheduleFilter

Bases: WorkspaceScopedFilter

Model to enable advanced filtering of all schedules.

Source code in src/zenml/models/v2/core/schedule.py
class ScheduleFilter(WorkspaceScopedFilter):
    """Model to enable advanced filtering of all Users."""

    pipeline_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Pipeline that the schedule is attached to.",
        union_mode="left_to_right",
    )
    orchestrator_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Orchestrator that the schedule is attached to.",
        union_mode="left_to_right",
    )
    active: Optional[bool] = Field(
        default=None,
        description="If the schedule is active",
    )
    cron_expression: Optional[str] = Field(
        default=None,
        description="The cron expression, describing the schedule",
    )
    start_time: Optional[Union[datetime, str]] = Field(
        default=None, description="Start time", union_mode="left_to_right"
    )
    end_time: Optional[Union[datetime, str]] = Field(
        default=None, description="End time", union_mode="left_to_right"
    )
    interval_second: Optional[float] = Field(
        default=None,
        description="The repetition interval in seconds",
    )
    catchup: Optional[bool] = Field(
        default=None,
        description="Whether or not the schedule is set to catchup past missed "
        "events",
    )
    name: Optional[str] = Field(
        default=None,
        description="Name of the schedule",
    )
    run_once_start_time: Optional[Union[datetime, str]] = Field(
        default=None,
        description="The time at which the schedule should run once",
        union_mode="left_to_right",
    )
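
A minimal sketch of a schedule query (the pipeline ID is a placeholder, and the "contains:" prefix is assumed from the generic string-filter syntax):

from uuid import uuid4

from zenml.models import ScheduleFilter

schedule_filter = ScheduleFilter(
    pipeline_id=uuid4(),        # placeholder UUID for illustration only
    active=True,
    name="contains:nightly",
)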

SecretFilter

Bases: WorkspaceScopedFilter

Model to enable advanced filtering of all Secrets.

Source code in src/zenml/models/v2/core/secret.py
class SecretFilter(WorkspaceScopedFilter):
    """Model to enable advanced filtering of all Secrets."""

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *WorkspaceScopedFilter.FILTER_EXCLUDE_FIELDS,
        "values",
    ]

    name: Optional[str] = Field(
        default=None,
        description="Name of the secret",
    )
    scope: Optional[Union[SecretScope, str]] = Field(
        default=None,
        description="Scope in which to filter secrets",
        union_mode="left_to_right",
    )

    @staticmethod
    def _get_filtering_value(value: Optional[Any]) -> str:
        """Convert the value to a string that can be used for lexicographical filtering and sorting.

        Args:
            value: The value to convert.

        Returns:
            The value converted to string format that can be used for
            lexicographical sorting and filtering.
        """
        if value is None:
            return ""
        str_value = str(value)
        if isinstance(value, datetime):
            str_value = value.strftime("%Y-%m-%d %H:%M:%S")
        return str_value

    def secret_matches(self, secret: SecretResponse) -> bool:
        """Checks if a secret matches the filter criteria.

        Args:
            secret: The secret to check.

        Returns:
            True if the secret matches the filter criteria, False otherwise.
        """
        for filter in self.list_of_filters:
            column_value: Optional[Any] = None
            if filter.column == "workspace_id":
                column_value = secret.workspace.id
            elif filter.column == "user_id":
                column_value = secret.user.id if secret.user else None
            else:
                column_value = getattr(secret, filter.column)

            # Convert the values to strings for lexicographical comparison.
            str_column_value = self._get_filtering_value(column_value)
            str_filter_value = self._get_filtering_value(filter.value)

            # Compare the lexicographical values according to the operation.
            if filter.operation == GenericFilterOps.EQUALS:
                result = str_column_value == str_filter_value
            elif filter.operation == GenericFilterOps.CONTAINS:
                result = str_filter_value in str_column_value
            elif filter.operation == GenericFilterOps.STARTSWITH:
                result = str_column_value.startswith(str_filter_value)
            elif filter.operation == GenericFilterOps.ENDSWITH:
                result = str_column_value.endswith(str_filter_value)
            elif filter.operation == GenericFilterOps.GT:
                result = str_column_value > str_filter_value
            elif filter.operation == GenericFilterOps.GTE:
                result = str_column_value >= str_filter_value
            elif filter.operation == GenericFilterOps.LT:
                result = str_column_value < str_filter_value
            elif filter.operation == GenericFilterOps.LTE:
                result = str_column_value <= str_filter_value

            # Exit early if the result is False for AND, and True for OR
            if self.logical_operator == LogicalOperators.AND:
                if not result:
                    return False
            else:
                if result:
                    return True

        # If we get here, all filters have been checked and the result is
        # True for AND, and False for OR
        if self.logical_operator == LogicalOperators.AND:
            return True
        else:
            return False

    def sort_secrets(
        self, secrets: List[SecretResponse]
    ) -> List[SecretResponse]:
        """Sorts a list of secrets according to the filter criteria.

        Args:
            secrets: The list of secrets to sort.

        Returns:
            The sorted list of secrets.
        """
        column, sort_op = self.sorting_params
        sorted_secrets = sorted(
            secrets,
            key=lambda secret: self._get_filtering_value(
                getattr(secret, column)
            ),
            reverse=sort_op == SorterOps.DESCENDING,
        )

        return sorted_secrets
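
Because some secrets store backends cannot filter or sort server-side, this model also offers the client-side helpers secret_matches() and sort_secrets(). A minimal sketch, assuming `secrets` holds SecretResponse objects already fetched from the store and that the "startswith:" prefix maps to GenericFilterOps.STARTSWITH:

from zenml.models import SecretFilter

secrets = []  # assume: List[SecretResponse] previously fetched from the secrets store

secret_filter = SecretFilter(name="startswith:prod_", sort_by="desc:created")
matching = [s for s in secrets if secret_filter.secret_matches(s)]
ordered = secret_filter.sort_secrets(matching)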

secret_matches(secret)

Checks if a secret matches the filter criteria.

Parameters:

Name Type Description Default
secret SecretResponse

The secret to check.

required

Returns:

Type Description
bool

True if the secret matches the filter criteria, False otherwise.

Source code in src/zenml/models/v2/core/secret.py
def secret_matches(self, secret: SecretResponse) -> bool:
    """Checks if a secret matches the filter criteria.

    Args:
        secret: The secret to check.

    Returns:
        True if the secret matches the filter criteria, False otherwise.
    """
    for filter in self.list_of_filters:
        column_value: Optional[Any] = None
        if filter.column == "workspace_id":
            column_value = secret.workspace.id
        elif filter.column == "user_id":
            column_value = secret.user.id if secret.user else None
        else:
            column_value = getattr(secret, filter.column)

        # Convert the values to strings for lexicographical comparison.
        str_column_value = self._get_filtering_value(column_value)
        str_filter_value = self._get_filtering_value(filter.value)

        # Compare the lexicographical values according to the operation.
        if filter.operation == GenericFilterOps.EQUALS:
            result = str_column_value == str_filter_value
        elif filter.operation == GenericFilterOps.CONTAINS:
            result = str_filter_value in str_column_value
        elif filter.operation == GenericFilterOps.STARTSWITH:
            result = str_column_value.startswith(str_filter_value)
        elif filter.operation == GenericFilterOps.ENDSWITH:
            result = str_column_value.endswith(str_filter_value)
        elif filter.operation == GenericFilterOps.GT:
            result = str_column_value > str_filter_value
        elif filter.operation == GenericFilterOps.GTE:
            result = str_column_value >= str_filter_value
        elif filter.operation == GenericFilterOps.LT:
            result = str_column_value < str_filter_value
        elif filter.operation == GenericFilterOps.LTE:
            result = str_column_value <= str_filter_value

        # Exit early if the result is False for AND, and True for OR
        if self.logical_operator == LogicalOperators.AND:
            if not result:
                return False
        else:
            if result:
                return True

    # If we get here, all filters have been checked and the result is
    # True for AND, and False for OR
    if self.logical_operator == LogicalOperators.AND:
        return True
    else:
        return False

sort_secrets(secrets)

Sorts a list of secrets according to the filter criteria.

Parameters:

Name Type Description Default
secrets List[SecretResponse]

The list of secrets to sort.

required

Returns:

Type Description
List[SecretResponse]

The sorted list of secrets.

Source code in src/zenml/models/v2/core/secret.py
def sort_secrets(
    self, secrets: List[SecretResponse]
) -> List[SecretResponse]:
    """Sorts a list of secrets according to the filter criteria.

    Args:
        secrets: The list of secrets to sort.

    Returns:
        The sorted list of secrets.
    """
    column, sort_op = self.sorting_params
    sorted_secrets = sorted(
        secrets,
        key=lambda secret: self._get_filtering_value(
            getattr(secret, column)
        ),
        reverse=sort_op == SorterOps.DESCENDING,
    )

    return sorted_secrets

SecretResponse

Bases: WorkspaceScopedResponse[SecretResponseBody, SecretResponseMetadata, SecretResponseResources]

Response model for secrets.

Source code in src/zenml/models/v2/core/secret.py
class SecretResponse(
    WorkspaceScopedResponse[
        SecretResponseBody, SecretResponseMetadata, SecretResponseResources
    ]
):
    """Response model for secrets."""

    ANALYTICS_FIELDS: ClassVar[List[str]] = ["scope"]

    name: str = Field(
        title="The name of the secret.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    def get_hydrated_version(self) -> "SecretResponse":
        """Get the hydrated version of this workspace.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_secret(self.id)

    # Body and metadata properties

    @property
    def scope(self) -> SecretScope:
        """The `scope` property.

        Returns:
            the value of the property.
        """
        return self.get_body().scope

    @property
    def values(self) -> Dict[str, Optional[SecretStr]]:
        """The `values` property.

        Returns:
            the value of the property.
        """
        return self.get_body().values

    # Helper methods
    @property
    def secret_values(self) -> Dict[str, str]:
        """A dictionary with all un-obfuscated values stored in this secret.

        The values are returned as strings, not SecretStr. If a value is
        None, it is not included in the returned dictionary. This is to enable
        the use of None values in the update model to indicate that a secret
        value should be deleted.

        Returns:
            A dictionary containing the secret's values.
        """
        return {
            k: v.get_secret_value()
            for k, v in self.values.items()
            if v is not None
        }

    @property
    def has_missing_values(self) -> bool:
        """Returns True if the secret has missing values (i.e. None).

        Values can be missing from a secret for example if the user retrieves a
        secret but does not have the permission to view the secret values.

        Returns:
            True if the secret has any values set to None.
        """
        return any(v is None for v in self.values.values())

    def add_secret(self, key: str, value: str) -> None:
        """Adds a secret value to the secret.

        Args:
            key: The key of the secret value.
            value: The secret value.
        """
        self.get_body().values[key] = SecretStr(value)

    def remove_secret(self, key: str) -> None:
        """Removes a secret value from the secret.

        Args:
            key: The key of the secret value.
        """
        del self.get_body().values[key]

    def remove_secrets(self) -> None:
        """Removes all secret values from the secret but keep the keys."""
        self.get_body().values = {k: None for k in self.values.keys()}

    def set_secrets(self, values: Dict[str, str]) -> None:
        """Sets the secret values of the secret.

        Args:
            values: The secret values to set.
        """
        self.get_body().values = {k: SecretStr(v) for k, v in values.items()}
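
A minimal sketch of the helper methods, assuming `secret` is a hydrated SecretResponse whose values are readable (the key names are placeholders):

plain_values = secret.secret_values       # Dict[str, str]; None values are dropped
if secret.has_missing_values:
    print("Some values were withheld, e.g. due to missing read permissions.")

secret.add_secret("api_token", "abc123")  # store a new value under a key
secret.remove_secret("old_key")           # raises KeyError if the key is absent
secret.remove_secrets()                   # keep the keys, blank out all values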

has_missing_values property

Returns True if the secret has missing values (i.e. None).

Values can be missing from a secret for example if the user retrieves a secret but does not have the permission to view the secret values.

Returns:

Type Description
bool

True if the secret has any values set to None.

scope property

The scope property.

Returns:

Type Description
SecretScope

the value of the property.

secret_values property

A dictionary with all un-obfuscated values stored in this secret.

The values are returned as strings, not SecretStr. If a value is None, it is not included in the returned dictionary. This is to enable the use of None values in the update model to indicate that a secret value should be deleted.

Returns:

Type Description
Dict[str, str]

A dictionary containing the secret's values.

values property

The values property.

Returns:

Type Description
Dict[str, Optional[SecretStr]]

the value of the property.

add_secret(key, value)

Adds a secret value to the secret.

Parameters:

Name Type Description Default
key str

The key of the secret value.

required
value str

The secret value.

required
Source code in src/zenml/models/v2/core/secret.py
def add_secret(self, key: str, value: str) -> None:
    """Adds a secret value to the secret.

    Args:
        key: The key of the secret value.
        value: The secret value.
    """
    self.get_body().values[key] = SecretStr(value)

get_hydrated_version()

Get the hydrated version of this secret.

Returns:

Type Description
SecretResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/secret.py
def get_hydrated_version(self) -> "SecretResponse":
    """Get the hydrated version of this workspace.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_secret(self.id)

remove_secret(key)

Removes a secret value from the secret.

Parameters:

Name Type Description Default
key str

The key of the secret value.

required
Source code in src/zenml/models/v2/core/secret.py
def remove_secret(self, key: str) -> None:
    """Removes a secret value from the secret.

    Args:
        key: The key of the secret value.
    """
    del self.get_body().values[key]

remove_secrets()

Removes all secret values from the secret but keeps the keys.

Source code in src/zenml/models/v2/core/secret.py
def remove_secrets(self) -> None:
    """Removes all secret values from the secret but keep the keys."""
    self.get_body().values = {k: None for k in self.values.keys()}

set_secrets(values)

Sets the secret values of the secret.

Parameters:

Name Type Description Default
values Dict[str, str]

The secret values to set.

required
Source code in src/zenml/models/v2/core/secret.py
def set_secrets(self, values: Dict[str, str]) -> None:
    """Sets the secret values of the secret.

    Args:
        values: The secret values to set.
    """
    self.get_body().values = {k: SecretStr(v) for k, v in values.items()}

SecretScope

Bases: StrEnum

Enum for the scope of a secret.

Source code in src/zenml/enums.py
class SecretScope(StrEnum):
    """Enum for the scope of a secret."""

    WORKSPACE = "workspace"
    USER = "user"

ServerCredentials

Bases: BaseModel

Cached Server Credentials.

Source code in src/zenml/login/credentials.py
class ServerCredentials(BaseModel):
    """Cached Server Credentials."""

    url: str
    api_key: Optional[str] = None
    api_token: Optional[APIToken] = None
    username: Optional[str] = None
    password: Optional[str] = None

    # Extra server attributes
    deployment_type: Optional[ServerDeploymentType] = None
    server_id: Optional[UUID] = None
    server_name: Optional[str] = None
    status: Optional[str] = None
    version: Optional[str] = None

    # Pro server attributes
    organization_name: Optional[str] = None
    organization_id: Optional[UUID] = None
    tenant_name: Optional[str] = None
    tenant_id: Optional[UUID] = None
    pro_api_url: Optional[str] = None
    pro_dashboard_url: Optional[str] = None

    @property
    def id(self) -> str:
        """Get the server identifier.

        Returns:
            The server identifier.
        """
        if self.server_id:
            return str(self.server_id)
        return self.url

    @property
    def type(self) -> ServerType:
        """Get the server type.

        Returns:
            The server type.
        """
        if self.deployment_type == ServerDeploymentType.CLOUD:
            return ServerType.PRO
        if self.url == ZENML_PRO_API_URL:
            return ServerType.PRO_API
        if self.url == self.pro_api_url:
            return ServerType.PRO_API
        if self.organization_id or self.tenant_id:
            return ServerType.PRO
        if urlparse(self.url).hostname in [
            "localhost",
            "127.0.0.1",
            "host.docker.internal",
        ]:
            return ServerType.LOCAL
        return ServerType.REMOTE

    def update_server_info(
        self, server_info: Union[ServerModel, TenantRead]
    ) -> None:
        """Update with server information received from the server itself or from a ZenML Pro tenant descriptor.

        Args:
            server_info: The server information to update with.
        """
        if isinstance(server_info, ServerModel):
            # The server ID doesn't change during the lifetime of the server
            self.server_id = self.server_id or server_info.id
            # All other attributes can change during the lifetime of the server
            self.deployment_type = server_info.deployment_type
            server_name = (
                server_info.pro_tenant_name
                or server_info.metadata.get("tenant_name")
                or server_info.name
            )
            if server_name:
                self.server_name = server_name
            if server_info.pro_organization_id:
                self.organization_id = server_info.pro_organization_id
            if server_info.pro_tenant_id:
                self.server_id = server_info.pro_tenant_id
            if server_info.pro_organization_name:
                self.organization_name = server_info.pro_organization_name
            if server_info.pro_tenant_name:
                self.tenant_name = server_info.pro_tenant_name
            if server_info.pro_api_url:
                self.pro_api_url = server_info.pro_api_url
            if server_info.pro_dashboard_url:
                self.pro_dashboard_url = server_info.pro_dashboard_url
            self.version = server_info.version or self.version
            # The server information was retrieved from the server itself, so we
            # can assume that the server is available
            self.status = "available"
        else:
            self.deployment_type = ServerDeploymentType.CLOUD
            self.server_id = server_info.id
            self.server_name = server_info.name
            self.organization_name = server_info.organization_name
            self.organization_id = server_info.organization_id
            self.tenant_name = server_info.name
            self.tenant_id = server_info.id
            self.status = server_info.status
            self.version = server_info.version

    @property
    def is_available(self) -> bool:
        """Check if the server is available (running and authenticated).

        Returns:
            True if the server is available, False otherwise.
        """
        if self.status not in [TenantStatus.AVAILABLE, ServiceState.ACTIVE]:
            return False
        if (
            self.api_key
            or self.api_token
            or self.username
            and self.password is not None
            or self.type in [ServerType.PRO, ServerType.LOCAL]
        ):
            return True
        if self.api_token and not self.api_token.expired:
            return True
        return False

    @property
    def auth_status(self) -> str:
        """Get the authentication status.

        Returns:
            The authentication status.
        """
        if self.api_key:
            return "API key"
        if self.username and self.password is not None:
            return "password"
        if not self.api_token:
            if self.type == ServerType.LOCAL:
                return "no authentication required"
            return "N/A"
        expires_at = self.api_token.expires_at_with_leeway
        if not expires_at:
            return "never expires"
        if expires_at < utc_now(tz_aware=expires_at):
            return "expired at " + self.expires_at

        return f"valid until {self.expires_at} (in {self.expires_in})"

    @property
    def expires_at(self) -> str:
        """Get the expiration time of the token as a string.

        Returns:
            The expiration time of the token as a string.
        """
        if not self.api_token:
            return "N/A"
        expires_at = self.api_token.expires_at_with_leeway
        if not expires_at:
            return "never"

        # Convert the date in the local timezone
        local_expires_at = to_local_tz(expires_at)
        return local_expires_at.strftime("%Y-%m-%d %H:%M:%S %Z")

    @property
    def expires_in(self) -> str:
        """Get the time remaining until the token expires.

        Returns:
            The time remaining until the token expires.
        """
        if not self.api_token:
            return "N/A"
        expires_at = self.api_token.expires_at_with_leeway
        if not expires_at:
            return "never"

        # Get the time remaining until the token expires
        expires_in = expires_at - utc_now(tz_aware=expires_at)
        return get_human_readable_time(expires_in.total_seconds())

    @property
    def dashboard_url(self) -> str:
        """Get the URL to the ZenML dashboard for this server.

        Returns:
            The URL to the ZenML dashboard for this server.
        """
        if self.organization_id and self.server_id:
            return (
                (self.pro_dashboard_url or ZENML_PRO_URL)
                + f"/organizations/{str(self.organization_id)}/tenants/{str(self.server_id)}"
            )

        return self.url

    @property
    def dashboard_organization_url(self) -> str:
        """Get the URL to the ZenML Pro dashboard for this tenant's organization.

        Returns:
            The URL to the ZenML Pro dashboard for this tenant's organization.
        """
        if self.organization_id:
            return (
                self.pro_dashboard_url or ZENML_PRO_URL
            ) + f"/organizations/{str(self.organization_id)}"
        return ""

    @property
    def dashboard_hyperlink(self) -> str:
        """Get the hyperlink to the ZenML dashboard for this tenant.

        Returns:
            The hyperlink to the ZenML dashboard for this tenant.
        """
        return f"[link={self.dashboard_url}]{self.dashboard_url}[/link]"

    @property
    def api_hyperlink(self) -> str:
        """Get the hyperlink to the ZenML OpenAPI dashboard for this tenant.

        Returns:
            The hyperlink to the ZenML OpenAPI dashboard for this tenant.
        """
        api_url = self.url + "/docs"
        return f"[link={api_url}]{self.url}[/link]"

    @property
    def server_name_hyperlink(self) -> str:
        """Get the hyperlink to the ZenML dashboard for this server using its name.

        Returns:
            The hyperlink to the ZenML dashboard for this server using its name.
        """
        if self.server_name is None:
            return "N/A"
        return f"[link={self.dashboard_url}]{self.server_name}[/link]"

    @property
    def server_id_hyperlink(self) -> str:
        """Get the hyperlink to the ZenML dashboard for this server using its ID.

        Returns:
            The hyperlink to the ZenML dashboard for this server using its ID.
        """
        if self.server_id is None:
            return "N/A"
        return f"[link={self.dashboard_url}]{str(self.server_id)}[/link]"

    @property
    def organization_hyperlink(self) -> str:
        """Get the hyperlink to the ZenML Pro dashboard for this server's organization.

        Returns:
            The hyperlink to the ZenML Pro dashboard for this server's
            organization.
        """
        if self.organization_name:
            return self.organization_name_hyperlink
        if self.organization_id:
            return self.organization_id_hyperlink
        return "N/A"

    @property
    def organization_name_hyperlink(self) -> str:
        """Get the hyperlink to the ZenML Pro dashboard for this server's organization using its name.

        Returns:
            The hyperlink to the ZenML Pro dashboard for this server's
            organization using its name.
        """
        if self.organization_name is None:
            return "N/A"
        return f"[link={self.dashboard_organization_url}]{self.organization_name}[/link]"

    @property
    def organization_id_hyperlink(self) -> str:
        """Get the hyperlink to the ZenML Pro dashboard for this tenant's organization using its ID.

        Returns:
            The hyperlink to the ZenML Pro dashboard for this tenant's
            organization using its ID.
        """
        if self.organization_id is None:
            return "N/A"
        return f"[link={self.dashboard_organization_url}]{self.organization_id}[/link]"

api_hyperlink property

Get the hyperlink to the ZenML OpenAPI dashboard for this tenant.

Returns:

Type Description
str

The hyperlink to the ZenML OpenAPI dashboard for this tenant.

auth_status property

Get the authentication status.

Returns:

Type Description
str

The authentication status.

dashboard_hyperlink property

Get the hyperlink to the ZenML dashboard for this tenant.

Returns:

Type Description
str

The hyperlink to the ZenML dashboard for this tenant.

dashboard_organization_url property

Get the URL to the ZenML Pro dashboard for this tenant's organization.

Returns:

Type Description
str

The URL to the ZenML Pro dashboard for this tenant's organization.

dashboard_url property

Get the URL to the ZenML dashboard for this server.

Returns:

Type Description
str

The URL to the ZenML dashboard for this server.

expires_at property

Get the expiration time of the token as a string.

Returns:

Type Description
str

The expiration time of the token as a string.

expires_in property

Get the time remaining until the token expires.

Returns:

Type Description
str

The time remaining until the token expires.

id property

Get the server identifier.

Returns:

Type Description
str

The server identifier.

is_available property

Check if the server is available (running and authenticated).

Returns:

Type Description
bool

True if the server is available, False otherwise.

organization_hyperlink property

Get the hyperlink to the ZenML Pro dashboard for this server's organization.

Returns:

Type Description
str

The hyperlink to the ZenML Pro dashboard for this server's organization.

organization_id_hyperlink property

Get the hyperlink to the ZenML Pro dashboard for this tenant's organization using its ID.

Returns:

Type Description
str

The hyperlink to the ZenML Pro dashboard for this tenant's organization using its ID.

organization_name_hyperlink property

Get the hyperlink to the ZenML Pro dashboard for this server's organization using its name.

Returns:

Type Description
str

The hyperlink to the ZenML Pro dashboard for this server's organization using its name.

server_id_hyperlink property

Get the hyperlink to the ZenML dashboard for this server using its ID.

Returns:

Type Description
str

The hyperlink to the ZenML dashboard for this server using its ID.

Get the hyperlink to the ZenML dashboard for this server using its name.

Returns:

Type Description
str

The hyperlink to the ZenML dashboard for this server using its name.

type property

Get the server type.

Returns:

Type Description
ServerType

The server type.

update_server_info(server_info)

Update with server information received from the server itself or from a ZenML Pro tenant descriptor.

Parameters:

Name Type Description Default
server_info Union[ServerModel, TenantRead]

The server information to update with.

required
Source code in src/zenml/login/credentials.py
def update_server_info(
    self, server_info: Union[ServerModel, TenantRead]
) -> None:
    """Update with server information received from the server itself or from a ZenML Pro tenant descriptor.

    Args:
        server_info: The server information to update with.
    """
    if isinstance(server_info, ServerModel):
        # The server ID doesn't change during the lifetime of the server
        self.server_id = self.server_id or server_info.id
        # All other attributes can change during the lifetime of the server
        self.deployment_type = server_info.deployment_type
        server_name = (
            server_info.pro_tenant_name
            or server_info.metadata.get("tenant_name")
            or server_info.name
        )
        if server_name:
            self.server_name = server_name
        if server_info.pro_organization_id:
            self.organization_id = server_info.pro_organization_id
        if server_info.pro_tenant_id:
            self.server_id = server_info.pro_tenant_id
        if server_info.pro_organization_name:
            self.organization_name = server_info.pro_organization_name
        if server_info.pro_tenant_name:
            self.tenant_name = server_info.pro_tenant_name
        if server_info.pro_api_url:
            self.pro_api_url = server_info.pro_api_url
        if server_info.pro_dashboard_url:
            self.pro_dashboard_url = server_info.pro_dashboard_url
        self.version = server_info.version or self.version
        # The server information was retrieved from the server itself, so we
        # can assume that the server is available
        self.status = "available"
    else:
        self.deployment_type = ServerDeploymentType.CLOUD
        self.server_id = server_info.id
        self.server_name = server_info.name
        self.organization_name = server_info.organization_name
        self.organization_id = server_info.organization_id
        self.tenant_name = server_info.name
        self.tenant_id = server_info.id
        self.status = server_info.status
        self.version = server_info.version

ServerProviderType

Bases: StrEnum

ZenML server providers.

Source code in src/zenml/enums.py
class ServerProviderType(StrEnum):
    """ZenML server providers."""

    DAEMON = "daemon"
    DOCKER = "docker"

ServerType

Bases: StrEnum

The type of server.

Source code in src/zenml/login/credentials.py
class ServerType(StrEnum):
    """The type of server."""

    PRO_API = "PRO_API"
    PRO = "PRO"
    REMOTE = "REMOTE"
    LOCAL = "LOCAL"

ServiceAccountFilter

Bases: BaseFilter

Model to enable advanced filtering of service accounts.

Source code in src/zenml/models/v2/core/service_account.py
class ServiceAccountFilter(BaseFilter):
    """Model to enable advanced filtering of service accounts."""

    name: Optional[str] = Field(
        default=None,
        description="Name of the user",
    )
    description: Optional[str] = Field(
        default=None,
        title="Filter by the service account description.",
    )
    active: Optional[Union[bool, str]] = Field(
        default=None,
        description="Whether the user is active",
        union_mode="left_to_right",
    )

    def apply_filter(
        self,
        query: AnyQuery,
        table: Type["AnySchema"],
    ) -> AnyQuery:
        """Override to filter out user accounts from the query.

        Args:
            query: The query to which to apply the filter.
            table: The query table.

        Returns:
            The query with filter applied.
        """
        query = super().apply_filter(query=query, table=table)
        query = query.where(
            getattr(table, "is_service_account") == True  # noqa: E712
        )

        return query

apply_filter(query, table)

Override to filter out user accounts from the query.

Parameters:

Name Type Description Default
query AnyQuery

The query to which to apply the filter.

required
table Type[AnySchema]

The query table.

required

Returns:

Type Description
AnyQuery

The query with filter applied.

Source code in src/zenml/models/v2/core/service_account.py
def apply_filter(
    self,
    query: AnyQuery,
    table: Type["AnySchema"],
) -> AnyQuery:
    """Override to filter out user accounts from the query.

    Args:
        query: The query to which to apply the filter.
        table: The query table.

    Returns:
        The query with filter applied.
    """
    query = super().apply_filter(query=query, table=table)
    query = query.where(
        getattr(table, "is_service_account") == True  # noqa: E712
    )

    return query
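
As a quick, hypothetical illustration (assuming the filter model is re-exported from zenml.models), building such a filter looks like this; apply_filter() then additionally restricts the query to service accounts:

from zenml.models import ServiceAccountFilter  # assumed re-export location

# Matches active service accounts named "ci-bot"; rows where
# is_service_account is False (regular users) are always excluded.
sa_filter = ServiceAccountFilter(name="ci-bot", active=True)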

ServiceConnectorFilter

Bases: WorkspaceScopedFilter

Model to enable advanced filtering of service connectors.

Source code in src/zenml/models/v2/core/service_connector.py
class ServiceConnectorFilter(WorkspaceScopedFilter):
    """Model to enable advanced filtering of service connectors."""

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *WorkspaceScopedFilter.FILTER_EXCLUDE_FIELDS,
        "scope_type",
        "resource_type",
        "labels_str",
        "labels",
    ]
    CLI_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *WorkspaceScopedFilter.CLI_EXCLUDE_FIELDS,
        "scope_type",
        "labels_str",
        "labels",
    ]
    scope_type: Optional[str] = Field(
        default=None,
        description="The type to scope this query to.",
    )
    name: Optional[str] = Field(
        default=None,
        description="The name to filter by",
    )
    connector_type: Optional[str] = Field(
        default=None,
        description="The type of service connector to filter by",
    )
    auth_method: Optional[str] = Field(
        default=None,
        title="Filter by the authentication method configured for the "
        "connector",
    )
    resource_type: Optional[str] = Field(
        default=None,
        title="Filter by the type of resource that the connector can be used "
        "to access",
    )
    resource_id: Optional[str] = Field(
        default=None,
        title="Filter by the ID of the resource instance that the connector "
        "is configured to access",
    )
    labels_str: Optional[str] = Field(
        default=None,
        title="Filter by one or more labels. This field can be either a JSON "
        "formatted dictionary of label names and values, where the values are "
        'optional and can be set to None (e.g. `{"label1":"value1", "label2": '
        "null}` ), or a comma-separated list of label names and values (e.g "
        "`label1=value1,label2=`. If a label name is specified without a "
        "value, the filter will match all service connectors that have that "
        "label present, regardless of value.",
    )
    secret_id: Optional[Union[UUID, str]] = Field(
        default=None,
        title="Filter by the ID of the secret that contains the service "
        "connector's credentials",
        union_mode="left_to_right",
    )

    # Use this internally to configure and access the labels as a dictionary
    labels: Optional[Dict[str, Optional[str]]] = Field(
        default=None,
        title="The labels to filter by, as a dictionary",
        exclude=True,
    )

    @model_validator(mode="after")
    def validate_labels(self) -> "ServiceConnectorFilter":
        """Parse the labels string into a label dictionary and vice-versa.

        Returns:
            The validated values.
        """
        if self.labels_str is not None:
            try:
                self.labels = json.loads(self.labels_str)
            except json.JSONDecodeError:
                # Interpret as comma-separated values instead
                self.labels = {
                    label.split("=", 1)[0]: label.split("=", 1)[1]
                    if "=" in label
                    else None
                    for label in self.labels_str.split(",")
                }
        elif self.labels is not None:
            self.labels_str = json.dumps(self.labels)

        return self

validate_labels()

Parse the labels string into a label dictionary and vice-versa.

Returns:

Type Description
ServiceConnectorFilter

The validated values.

Source code in src/zenml/models/v2/core/service_connector.py
@model_validator(mode="after")
def validate_labels(self) -> "ServiceConnectorFilter":
    """Parse the labels string into a label dictionary and vice-versa.

    Returns:
        The validated values.
    """
    if self.labels_str is not None:
        try:
            self.labels = json.loads(self.labels_str)
        except json.JSONDecodeError:
            # Interpret as comma-separated values instead
            self.labels = {
                label.split("=", 1)[0]: label.split("=", 1)[1]
                if "=" in label
                else None
                for label in self.labels_str.split(",")
            }
    elif self.labels is not None:
        self.labels_str = json.dumps(self.labels)

    return self
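
A short, hypothetical sketch of the two accepted labels_str formats and the resulting labels dictionary (assuming the filter model is re-exported from zenml.models):

from zenml.models import ServiceConnectorFilter  # assumed re-export location

# JSON form: a null value matches any value for that label.
json_filter = ServiceConnectorFilter(labels_str='{"env": "prod", "team": null}')
assert json_filter.labels == {"env": "prod", "team": None}

# Comma-separated form: a label name without "=value" matches any value.
csv_filter = ServiceConnectorFilter(labels_str="env=prod,team")
assert csv_filter.labels == {"env": "prod", "team": None}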

ServiceConnectorInfo

Bases: BaseModel

Information about the service connector when creating a full stack.

Source code in src/zenml/models/v2/misc/info_models.py
class ServiceConnectorInfo(BaseModel):
    """Information about the service connector when creating a full stack."""

    type: str
    auth_method: str
    configuration: Dict[str, Any] = {}
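
A minimal, hypothetical example of such an info object (the type, authentication method, and configuration values below are placeholders; the import path follows the source location shown above):

from zenml.models.v2.misc.info_models import ServiceConnectorInfo  # assumed path

info = ServiceConnectorInfo(
    type="aws",
    auth_method="secret-key",
    configuration={"region": "eu-central-1"},
)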

ServiceConnectorRequest

Bases: WorkspaceScopedRequest

Request model for service connectors.

Source code in src/zenml/models/v2/core/service_connector.py
class ServiceConnectorRequest(WorkspaceScopedRequest):
    """Request model for service connectors."""

    name: str = Field(
        title="The service connector name.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    connector_type: Union[str, "ServiceConnectorTypeModel"] = Field(
        title="The type of service connector.",
        union_mode="left_to_right",
    )
    description: str = Field(
        default="",
        title="The service connector instance description.",
    )
    auth_method: str = Field(
        title="The authentication method that the connector instance uses to "
        "access the resources.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    resource_types: List[str] = Field(
        default_factory=list,
        title="The type(s) of resource that the connector instance can be used "
        "to gain access to.",
    )
    resource_id: Optional[str] = Field(
        default=None,
        title="Uniquely identifies a specific resource instance that the "
        "connector instance can be used to access. If omitted, the connector "
        "instance can be used to access any and all resource instances that "
        "the authentication method and resource type(s) allow.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    supports_instances: bool = Field(
        default=False,
        title="Indicates whether the connector instance can be used to access "
        "multiple instances of the configured resource type.",
    )
    expires_at: Optional[datetime] = Field(
        default=None,
        title="Time when the authentication credentials configured for the "
        "connector expire. If omitted, the credentials do not expire.",
    )
    expires_skew_tolerance: Optional[int] = Field(
        default=None,
        title="The number of seconds of tolerance to apply when checking "
        "whether the authentication credentials configured for the connector "
        "have expired. If omitted, no tolerance is applied.",
    )
    expiration_seconds: Optional[int] = Field(
        default=None,
        title="The duration, in seconds, that the temporary credentials "
        "generated by this connector should remain valid. Only applicable for "
        "connectors and authentication methods that involve generating "
        "temporary credentials from the ones configured in the connector.",
    )
    configuration: Dict[str, Any] = Field(
        default_factory=dict,
        title="The service connector configuration, not including secrets.",
    )
    secrets: Dict[str, Optional[PlainSerializedSecretStr]] = Field(
        default_factory=dict,
        title="The service connector secrets.",
    )
    labels: Dict[str, str] = Field(
        default_factory=dict,
        title="Service connector labels.",
    )

    # Analytics
    ANALYTICS_FIELDS: ClassVar[List[str]] = [
        "connector_type",
        "auth_method",
        "resource_types",
    ]

    def get_analytics_metadata(self) -> Dict[str, Any]:
        """Format the resource types in the analytics metadata.

        Returns:
            Dict of analytics metadata.
        """
        metadata = super().get_analytics_metadata()
        if len(self.resource_types) == 1:
            metadata["resource_types"] = self.resource_types[0]
        else:
            metadata["resource_types"] = ", ".join(self.resource_types)
        metadata["connector_type"] = self.type
        return metadata

    # Helper methods
    @property
    def type(self) -> str:
        """Get the connector type.

        Returns:
            The connector type.
        """
        if isinstance(self.connector_type, str):
            return self.connector_type
        return self.connector_type.connector_type

    @property
    def emojified_connector_type(self) -> str:
        """Get the emojified connector type.

        Returns:
            The emojified connector type.
        """
        if not isinstance(self.connector_type, str):
            return self.connector_type.emojified_connector_type

        return self.connector_type

    @property
    def emojified_resource_types(self) -> List[str]:
        """Get the emojified connector type.

        Returns:
            The emojified connector type.
        """
        if not isinstance(self.connector_type, str):
            return [
                self.connector_type.resource_type_dict[
                    resource_type
                ].emojified_resource_type
                for resource_type in self.resource_types
            ]

        return self.resource_types

    def validate_and_configure_resources(
        self,
        connector_type: "ServiceConnectorTypeModel",
        resource_types: Optional[Union[str, List[str]]] = None,
        resource_id: Optional[str] = None,
        configuration: Optional[Dict[str, Any]] = None,
        secrets: Optional[Dict[str, Optional[SecretStr]]] = None,
    ) -> None:
        """Validate and configure the resources that the connector can be used to access.

        Args:
            connector_type: The connector type specification used to validate
                the connector configuration.
            resource_types: The type(s) of resource that the connector instance
                can be used to access. If omitted, a multi-type connector is
                configured.
            resource_id: Uniquely identifies a specific resource instance that
                the connector instance can be used to access.
            configuration: The connector configuration.
            secrets: The connector secrets.
        """
        _validate_and_configure_resources(
            connector=self,
            connector_type=connector_type,
            resource_types=resource_types,
            resource_id=resource_id,
            configuration=configuration,
            secrets=secrets,
        )

emojified_connector_type property

Get the emojified connector type.

Returns:

Type Description
str

The emojified connector type.

emojified_resource_types property

Get the emojified connector type.

Returns:

Type Description
List[str]

The emojified connector type.

type property

Get the connector type.

Returns:

Type Description
str

The connector type.

get_analytics_metadata()

Format the resource types in the analytics metadata.

Returns:

Type Description
Dict[str, Any]

Dict of analytics metadata.

Source code in src/zenml/models/v2/core/service_connector.py
def get_analytics_metadata(self) -> Dict[str, Any]:
    """Format the resource types in the analytics metadata.

    Returns:
        Dict of analytics metadata.
    """
    metadata = super().get_analytics_metadata()
    if len(self.resource_types) == 1:
        metadata["resource_types"] = self.resource_types[0]
    else:
        metadata["resource_types"] = ", ".join(self.resource_types)
    metadata["connector_type"] = self.type
    return metadata

validate_and_configure_resources(connector_type, resource_types=None, resource_id=None, configuration=None, secrets=None)

Validate and configure the resources that the connector can be used to access.

Parameters:

Name Type Description Default
connector_type ServiceConnectorTypeModel

The connector type specification used to validate the connector configuration.

required
resource_types Optional[Union[str, List[str]]]

The type(s) of resource that the connector instance can be used to access. If omitted, a multi-type connector is configured.

None
resource_id Optional[str]

Uniquely identifies a specific resource instance that the connector instance can be used to access.

None
configuration Optional[Dict[str, Any]]

The connector configuration.

None
secrets Optional[Dict[str, Optional[SecretStr]]]

The connector secrets.

None
Source code in src/zenml/models/v2/core/service_connector.py
def validate_and_configure_resources(
    self,
    connector_type: "ServiceConnectorTypeModel",
    resource_types: Optional[Union[str, List[str]]] = None,
    resource_id: Optional[str] = None,
    configuration: Optional[Dict[str, Any]] = None,
    secrets: Optional[Dict[str, Optional[SecretStr]]] = None,
) -> None:
    """Validate and configure the resources that the connector can be used to access.

    Args:
        connector_type: The connector type specification used to validate
            the connector configuration.
        resource_types: The type(s) of resource that the connector instance
            can be used to access. If omitted, a multi-type connector is
            configured.
        resource_id: Uniquely identifies a specific resource instance that
            the connector instance can be used to access.
        configuration: The connector configuration.
        secrets: The connector secrets.
    """
    _validate_and_configure_resources(
        connector=self,
        connector_type=connector_type,
        resource_types=resource_types,
        resource_id=resource_id,
        configuration=configuration,
        secrets=secrets,
    )
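
A hypothetical sketch of a request for an AWS-style connector (the values are placeholders; fields inherited from WorkspaceScopedRequest, such as workspace or user references, may also be required depending on the ZenML version):

from zenml.models import ServiceConnectorRequest  # assumed re-export location

request = ServiceConnectorRequest(
    name="my-aws-connector",
    connector_type="aws",
    auth_method="secret-key",
    resource_types=["s3-bucket"],
    configuration={"region": "eu-central-1"},
)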

ServiceConnectorResourcesInfo

Bases: BaseModel

Information about the service connector resources needed for CLI and UI.

Source code in src/zenml/models/v2/misc/info_models.py
class ServiceConnectorResourcesInfo(BaseModel):
    """Information about the service connector resources needed for CLI and UI."""

    connector_type: str

    components_resources_info: Dict[StackComponentType, List[ResourcesInfo]]

ServiceConnectorResourcesModel

Bases: BaseModel

Service connector resources list.

Lists the resource types and resource instances that a service connector can provide access to.

Source code in src/zenml/models/v2/misc/service_connector_type.py
class ServiceConnectorResourcesModel(BaseModel):
    """Service connector resources list.

    Lists the resource types and resource instances that a service connector
    can provide access to.
    """

    id: Optional[UUID] = Field(
        default=None,
        title="The ID of the service connector instance providing this "
        "resource.",
    )

    name: Optional[str] = Field(
        default=None,
        title="The name of the service connector instance providing this "
        "resource.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    connector_type: Union[str, "ServiceConnectorTypeModel"] = Field(
        title="The type of service connector.", union_mode="left_to_right"
    )

    resources: List[ServiceConnectorTypedResourcesModel] = Field(
        default_factory=list,
        title="The list of resources that the service connector instance can "
        "give access to. Contains one entry for every resource type "
        "that the connector is configured for.",
    )

    error: Optional[str] = Field(
        default=None,
        title="A global error message describing why the service connector "
        "instance could not authenticate to the remote service.",
    )

    @property
    def resources_dict(self) -> Dict[str, ServiceConnectorTypedResourcesModel]:
        """Get the resources as a dictionary indexed by resource type.

        Returns:
            The resources as a dictionary indexed by resource type.
        """
        return {
            resource.resource_type: resource for resource in self.resources
        }

    @property
    def resource_types(self) -> List[str]:
        """Get the resource types.

        Returns:
            The resource types.
        """
        return [resource.resource_type for resource in self.resources]

    def set_error(
        self, error: str, resource_type: Optional[str] = None
    ) -> None:
        """Set a global error message or an error for a single resource type.

        Args:
            error: The error message.
            resource_type: The resource type to set the error message for. If
                omitted, or if there is only one resource type involved, the
                error message is (also) set globally.

        Raises:
            KeyError: If the resource type is not found in the resources list.
        """
        if resource_type:
            resource = self.resources_dict.get(resource_type)
            if not resource:
                raise KeyError(
                    f"resource type '{resource_type}' not found in "
                    "service connector resources list"
                )
            resource.error = error
            resource.resource_ids = None
            if len(self.resources) == 1:
                # If there is only one resource type involved, set the global
                # error message as well.
                self.error = error
        else:
            self.error = error
            for resource in self.resources:
                resource.error = error
                resource.resource_ids = None

    def set_resource_ids(
        self, resource_type: str, resource_ids: List[str]
    ) -> None:
        """Set the resource IDs for a resource type.

        Args:
            resource_type: The resource type to set the resource IDs for.
            resource_ids: The resource IDs to set.

        Raises:
            KeyError: If the resource type is not found in the resources list.
        """
        resource = self.resources_dict.get(resource_type)
        if not resource:
            raise KeyError(
                f"resource type '{resource_type}' not found in "
                "service connector resources list"
            )
        resource.resource_ids = resource_ids
        resource.error = None

    @property
    def type(self) -> str:
        """Get the connector type.

        Returns:
            The connector type.
        """
        if isinstance(self.connector_type, str):
            return self.connector_type
        return self.connector_type.connector_type

    @property
    def emojified_connector_type(self) -> str:
        """Get the emojified connector type.

        Returns:
            The emojified connector type.
        """
        if not isinstance(self.connector_type, str):
            return self.connector_type.emojified_connector_type

        return self.connector_type

    def get_emojified_resource_types(
        self, resource_type: Optional[str] = None
    ) -> List[str]:
        """Get the emojified resource type.

        Args:
            resource_type: The resource type to get the emojified resource type
                for. If omitted, the emojified resource type for all resource
                types is returned.


        Returns:
            The list of emojified resource types.
        """
        if not isinstance(self.connector_type, str):
            if resource_type:
                return [
                    self.connector_type.resource_type_dict[
                        resource_type
                    ].emojified_resource_type
                ]
            return [
                self.connector_type.resource_type_dict[
                    resource_type
                ].emojified_resource_type
                for resource_type in self.resources_dict.keys()
            ]
        if resource_type:
            return [resource_type]
        return list(self.resources_dict.keys())

    def get_default_resource_id(self) -> Optional[str]:
        """Get the default resource ID, if included in the resource list.

        The default resource ID is a resource ID supplied by the connector
        implementation only for resource types that do not support multiple
        instances.

        Returns:
            The default resource ID, or None if no resource ID is set.
        """
        if len(self.resources) != 1:
            # multi-type connectors do not have a default resource ID
            return None

        if isinstance(self.connector_type, str):
            # can't determine default resource ID for unknown connector types
            return None

        resource_type_spec = self.connector_type.resource_type_dict[
            self.resources[0].resource_type
        ]
        if resource_type_spec.supports_instances:
            # resource types that support multiple instances do not have a
            # default resource ID
            return None

        resource_ids = self.resources[0].resource_ids

        if not resource_ids or len(resource_ids) != 1:
            return None

        return resource_ids[0]

    @classmethod
    def from_connector_model(
        cls,
        connector_model: "ServiceConnectorResponse",
        resource_type: Optional[str] = None,
    ) -> "ServiceConnectorResourcesModel":
        """Initialize a resource model from a connector model.

        Args:
            connector_model: The connector model.
            resource_type: The resource type to set on the resource model. If
                omitted, the resource type is set according to the connector
                model.

        Returns:
            A resource list model instance.
        """
        resources = cls(
            id=connector_model.id,
            name=connector_model.name,
            connector_type=connector_model.type,
        )

        resource_types = resource_type or connector_model.resource_types
        for resource_type in resource_types:
            resources.resources.append(
                ServiceConnectorTypedResourcesModel(
                    resource_type=resource_type,
                    resource_ids=[connector_model.resource_id]
                    if connector_model.resource_id
                    else None,
                )
            )

        return resources

emojified_connector_type property

Get the emojified connector type.

Returns:

Type Description
str

The emojified connector type.

resource_types property

Get the resource types.

Returns:

Type Description
List[str]

The resource types.

resources_dict property

Get the resources as a dictionary indexed by resource type.

Returns:

Type Description
Dict[str, ServiceConnectorTypedResourcesModel]

The resources as a dictionary indexed by resource type.

type property

Get the connector type.

Returns:

Type Description
str

The connector type.

from_connector_model(connector_model, resource_type=None) classmethod

Initialize a resource model from a connector model.

Parameters:

Name Type Description Default
connector_model ServiceConnectorResponse

The connector model.

required
resource_type Optional[str]

The resource type to set on the resource model. If omitted, the resource type is set according to the connector model.

None

Returns:

Type Description
ServiceConnectorResourcesModel

A resource list model instance.

Source code in src/zenml/models/v2/misc/service_connector_type.py
@classmethod
def from_connector_model(
    cls,
    connector_model: "ServiceConnectorResponse",
    resource_type: Optional[str] = None,
) -> "ServiceConnectorResourcesModel":
    """Initialize a resource model from a connector model.

    Args:
        connector_model: The connector model.
        resource_type: The resource type to set on the resource model. If
            omitted, the resource type is set according to the connector
            model.

    Returns:
        A resource list model instance.
    """
    resources = cls(
        id=connector_model.id,
        name=connector_model.name,
        connector_type=connector_model.type,
    )

    resource_types = resource_type or connector_model.resource_types
    for resource_type in resource_types:
        resources.resources.append(
            ServiceConnectorTypedResourcesModel(
                resource_type=resource_type,
                resource_ids=[connector_model.resource_id]
                if connector_model.resource_id
                else None,
            )
        )

    return resources

get_default_resource_id()

Get the default resource ID, if included in the resource list.

The default resource ID is a resource ID supplied by the connector implementation only for resource types that do not support multiple instances.

Returns:

Type Description
Optional[str]

The default resource ID, or None if no resource ID is set.

Source code in src/zenml/models/v2/misc/service_connector_type.py
def get_default_resource_id(self) -> Optional[str]:
    """Get the default resource ID, if included in the resource list.

    The default resource ID is a resource ID supplied by the connector
    implementation only for resource types that do not support multiple
    instances.

    Returns:
        The default resource ID, or None if no resource ID is set.
    """
    if len(self.resources) != 1:
        # multi-type connectors do not have a default resource ID
        return None

    if isinstance(self.connector_type, str):
        # can't determine default resource ID for unknown connector types
        return None

    resource_type_spec = self.connector_type.resource_type_dict[
        self.resources[0].resource_type
    ]
    if resource_type_spec.supports_instances:
        # resource types that support multiple instances do not have a
        # default resource ID
        return None

    resource_ids = self.resources[0].resource_ids

    if not resource_ids or len(resource_ids) != 1:
        return None

    return resource_ids[0]

get_emojified_resource_types(resource_type=None)

Get the emojified resource type.

Parameters:

Name Type Description Default
resource_type Optional[str]

The resource type to get the emojified resource type for. If omitted, the emojified resource type for all resource types is returned.

None

Returns:

Type Description
List[str]

The list of emojified resource types.

Source code in src/zenml/models/v2/misc/service_connector_type.py
def get_emojified_resource_types(
    self, resource_type: Optional[str] = None
) -> List[str]:
    """Get the emojified resource type.

    Args:
        resource_type: The resource type to get the emojified resource type
            for. If omitted, the emojified resource type for all resource
            types is returned.


    Returns:
        The list of emojified resource types.
    """
    if not isinstance(self.connector_type, str):
        if resource_type:
            return [
                self.connector_type.resource_type_dict[
                    resource_type
                ].emojified_resource_type
            ]
        return [
            self.connector_type.resource_type_dict[
                resource_type
            ].emojified_resource_type
            for resource_type in self.resources_dict.keys()
        ]
    if resource_type:
        return [resource_type]
    return list(self.resources_dict.keys())

set_error(error, resource_type=None)

Set a global error message or an error for a single resource type.

Parameters:

Name Type Description Default
error str

The error message.

required
resource_type Optional[str]

The resource type to set the error message for. If omitted, or if there is only one resource type involved, the error message is (also) set globally.

None

Raises:

Type Description
KeyError

If the resource type is not found in the resources list.

Source code in src/zenml/models/v2/misc/service_connector_type.py
def set_error(
    self, error: str, resource_type: Optional[str] = None
) -> None:
    """Set a global error message or an error for a single resource type.

    Args:
        error: The error message.
        resource_type: The resource type to set the error message for. If
            omitted, or if there is only one resource type involved, the
            error message is (also) set globally.

    Raises:
        KeyError: If the resource type is not found in the resources list.
    """
    if resource_type:
        resource = self.resources_dict.get(resource_type)
        if not resource:
            raise KeyError(
                f"resource type '{resource_type}' not found in "
                "service connector resources list"
            )
        resource.error = error
        resource.resource_ids = None
        if len(self.resources) == 1:
            # If there is only one resource type involved, set the global
            # error message as well.
            self.error = error
    else:
        self.error = error
        for resource in self.resources:
            resource.error = error
            resource.resource_ids = None

set_resource_ids(resource_type, resource_ids)

Set the resource IDs for a resource type.

Parameters:

Name Type Description Default
resource_type str

The resource type to set the resource IDs for.

required
resource_ids List[str]

The resource IDs to set.

required

Raises:

Type Description
KeyError

If the resource type is not found in the resources list.

Source code in src/zenml/models/v2/misc/service_connector_type.py
def set_resource_ids(
    self, resource_type: str, resource_ids: List[str]
) -> None:
    """Set the resource IDs for a resource type.

    Args:
        resource_type: The resource type to set the resource IDs for.
        resource_ids: The resource IDs to set.

    Raises:
        KeyError: If the resource type is not found in the resources list.
    """
    resource = self.resources_dict.get(resource_type)
    if not resource:
        raise KeyError(
            f"resource type '{resource_type}' not found in "
            "service connector resources list"
        )
    resource.resource_ids = resource_ids
    resource.error = None
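
A short, hypothetical sketch of populating and inspecting such a resources list (import locations and resource values are assumptions):

from zenml.models import (  # assumed re-export locations
    ServiceConnectorResourcesModel,
    ServiceConnectorTypedResourcesModel,
)

resources = ServiceConnectorResourcesModel(
    connector_type="aws",
    resources=[
        ServiceConnectorTypedResourcesModel(
            resource_type="s3-bucket", resource_ids=None
        )
    ],
)
resources.set_resource_ids("s3-bucket", ["s3://my-bucket"])
assert resources.resource_types == ["s3-bucket"]
# With a plain string connector type, no default resource ID can be determined.
assert resources.get_default_resource_id() is None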

ServiceConnectorResponse

Bases: WorkspaceScopedResponse[ServiceConnectorResponseBody, ServiceConnectorResponseMetadata, ServiceConnectorResponseResources]

Response model for service connectors.

Source code in src/zenml/models/v2/core/service_connector.py
class ServiceConnectorResponse(
    WorkspaceScopedResponse[
        ServiceConnectorResponseBody,
        ServiceConnectorResponseMetadata,
        ServiceConnectorResponseResources,
    ]
):
    """Response model for service connectors."""

    # Disable the warning for updating responses, because we update the
    # service connector type in place
    _warn_on_response_updates: bool = False

    name: str = Field(
        title="The service connector name.",
        max_length=STR_FIELD_MAX_LENGTH,
    )

    def get_analytics_metadata(self) -> Dict[str, Any]:
        """Add the service connector labels to analytics metadata.

        Returns:
            Dict of analytics metadata.
        """
        metadata = super().get_analytics_metadata()

        metadata.update(
            {
                label[6:]: value
                for label, value in self.labels.items()
                if label.startswith("zenml:")
            }
        )
        return metadata

    def get_hydrated_version(self) -> "ServiceConnectorResponse":
        """Get the hydrated version of this service connector.

        Returns:
            an instance of the same entity with the metadata field attached.
        """
        from zenml.client import Client

        return Client().zen_store.get_service_connector(self.id)

    # Helper methods
    @property
    def type(self) -> str:
        """Get the connector type.

        Returns:
            The connector type.
        """
        if isinstance(self.connector_type, str):
            return self.connector_type
        return self.connector_type.connector_type

    @property
    def emojified_connector_type(self) -> str:
        """Get the emojified connector type.

        Returns:
            The emojified connector type.
        """
        if not isinstance(self.connector_type, str):
            return self.connector_type.emojified_connector_type

        return self.connector_type

    @property
    def emojified_resource_types(self) -> List[str]:
        """Get the emojified connector type.

        Returns:
            The emojified connector type.
        """
        if not isinstance(self.connector_type, str):
            return [
                self.connector_type.resource_type_dict[
                    resource_type
                ].emojified_resource_type
                for resource_type in self.resource_types
            ]

        return self.resource_types

    @property
    def is_multi_type(self) -> bool:
        """Checks if the connector is multi-type.

        A multi-type connector can be used to access multiple types of
        resources.

        Returns:
            True if the connector is multi-type, False otherwise.
        """
        return len(self.resource_types) > 1

    @property
    def is_multi_instance(self) -> bool:
        """Checks if the connector is multi-instance.

        A multi-instance connector is configured to access multiple instances
        of the configured resource type.

        Returns:
            True if the connector is multi-instance, False otherwise.
        """
        return (
            not self.is_multi_type
            and self.supports_instances
            and not self.resource_id
        )

    @property
    def is_single_instance(self) -> bool:
        """Checks if the connector is single-instance.

        A single-instance connector is configured to access only a single
        instance of the configured resource type or does not support multiple
        resource instances.

        Returns:
            True if the connector is single-instance, False otherwise.
        """
        return not self.is_multi_type and not self.is_multi_instance

    @property
    def full_configuration(self) -> Dict[str, str]:
        """Get the full connector configuration, including secrets.

        Returns:
            The full connector configuration, including secrets.
        """
        config = self.configuration.copy()
        config.update(
            {k: v.get_secret_value() for k, v in self.secrets.items() if v}
        )
        return config

    def set_connector_type(
        self, value: Union[str, "ServiceConnectorTypeModel"]
    ) -> None:
        """Auxiliary method to set the connector type.

        Args:
            value: the new value for the connector type.
        """
        self.get_body().connector_type = value

    def validate_and_configure_resources(
        self,
        connector_type: "ServiceConnectorTypeModel",
        resource_types: Optional[Union[str, List[str]]] = None,
        resource_id: Optional[str] = None,
        configuration: Optional[Dict[str, Any]] = None,
        secrets: Optional[Dict[str, Optional[SecretStr]]] = None,
    ) -> None:
        """Validate and configure the resources that the connector can be used to access.

        Args:
            connector_type: The connector type specification used to validate
                the connector configuration.
            resource_types: The type(s) of resource that the connector instance
                can be used to access. If omitted, a multi-type connector is
                configured.
            resource_id: Uniquely identifies a specific resource instance that
                the connector instance can be used to access.
            configuration: The connector configuration.
            secrets: The connector secrets.
        """
        _validate_and_configure_resources(
            connector=self,
            connector_type=connector_type,
            resource_types=resource_types,
            resource_id=resource_id,
            configuration=configuration,
            secrets=secrets,
        )

    # Body and metadata properties
    @property
    def description(self) -> str:
        """The `description` property.

        Returns:
            the value of the property.
        """
        return self.get_body().description

    @property
    def connector_type(self) -> Union[str, "ServiceConnectorTypeModel"]:
        """The `connector_type` property.

        Returns:
            the value of the property.
        """
        return self.get_body().connector_type

    @property
    def auth_method(self) -> str:
        """The `auth_method` property.

        Returns:
            the value of the property.
        """
        return self.get_body().auth_method

    @property
    def resource_types(self) -> List[str]:
        """The `resource_types` property.

        Returns:
            the value of the property.
        """
        return self.get_body().resource_types

    @property
    def resource_id(self) -> Optional[str]:
        """The `resource_id` property.

        Returns:
            the value of the property.
        """
        return self.get_body().resource_id

    @property
    def supports_instances(self) -> bool:
        """The `supports_instances` property.

        Returns:
            the value of the property.
        """
        return self.get_body().supports_instances

    @property
    def expires_at(self) -> Optional[datetime]:
        """The `expires_at` property.

        Returns:
            the value of the property.
        """
        return self.get_body().expires_at

    @property
    def expires_skew_tolerance(self) -> Optional[int]:
        """The `expires_skew_tolerance` property.

        Returns:
            the value of the property.
        """
        return self.get_body().expires_skew_tolerance

    @property
    def configuration(self) -> Dict[str, Any]:
        """The `configuration` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().configuration

    @property
    def secret_id(self) -> Optional[UUID]:
        """The `secret_id` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().secret_id

    @property
    def expiration_seconds(self) -> Optional[int]:
        """The `expiration_seconds` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().expiration_seconds

    @property
    def secrets(self) -> Dict[str, Optional[SecretStr]]:
        """The `secrets` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().secrets

    @property
    def labels(self) -> Dict[str, str]:
        """The `labels` property.

        Returns:
            the value of the property.
        """
        return self.get_metadata().labels

auth_method property

The auth_method property.

Returns:

Type Description
str

the value of the property.

configuration property

The configuration property.

Returns:

Type Description
Dict[str, Any]

the value of the property.

connector_type property

The connector_type property.

Returns:

Type Description
Union[str, ServiceConnectorTypeModel]

the value of the property.

description property

The description property.

Returns:

Type Description
str

the value of the property.

emojified_connector_type property

Get the emojified connector type.

Returns:

Type Description
str

The emojified connector type.

emojified_resource_types property

Get the emojified connector type.

Returns:

Type Description
List[str]

The emojified connector type.

expiration_seconds property

The expiration_seconds property.

Returns:

Type Description
Optional[int]

the value of the property.

expires_at property

The expires_at property.

Returns:

Type Description
Optional[datetime]

the value of the property.

expires_skew_tolerance property

The expires_skew_tolerance property.

Returns:

Type Description
Optional[int]

the value of the property.

full_configuration property

Get the full connector configuration, including secrets.

Returns:

Type Description
Dict[str, str]

The full connector configuration, including secrets.

is_multi_instance property

Checks if the connector is multi-instance.

A multi-instance connector is configured to access multiple instances of the configured resource type.

Returns:

Type Description
bool

True if the connector is multi-instance, False otherwise.

is_multi_type property

Checks if the connector is multi-type.

A multi-type connector can be used to access multiple types of resources.

Returns:

Type Description
bool

True if the connector is multi-type, False otherwise.

is_single_instance property

Checks if the connector is single-instance.

A single-instance connector is configured to access only a single instance of the configured resource type or does not support multiple resource instances.

Returns:

Type Description
bool

True if the connector is single-instance, False otherwise.

labels property

The labels property.

Returns:

Type Description
Dict[str, str]

the value of the property.

resource_id property

The resource_id property.

Returns:

Type Description
Optional[str]

the value of the property.

resource_types property

The resource_types property.

Returns:

Type Description
List[str]

the value of the property.

secret_id property

The secret_id property.

Returns:

Type Description
Optional[UUID]

the value of the property.

secrets property

The secrets property.

Returns:

Type Description
Dict[str, Optional[SecretStr]]

the value of the property.

supports_instances property

The supports_instances property.

Returns:

Type Description
bool

the value of the property.

type property

Get the connector type.

Returns:

Type Description
str

The connector type.

get_analytics_metadata()

Add the service connector labels to analytics metadata.

Returns:

Type Description
Dict[str, Any]

Dict of analytics metadata.

Source code in src/zenml/models/v2/core/service_connector.py
def get_analytics_metadata(self) -> Dict[str, Any]:
    """Add the service connector labels to analytics metadata.

    Returns:
        Dict of analytics metadata.
    """
    metadata = super().get_analytics_metadata()

    metadata.update(
        {
            label[6:]: value
            for label, value in self.labels.items()
            if label.startswith("zenml:")
        }
    )
    return metadata

get_hydrated_version()

Get the hydrated version of this service connector.

Returns:

Type Description
ServiceConnectorResponse

an instance of the same entity with the metadata field attached.

Source code in src/zenml/models/v2/core/service_connector.py
def get_hydrated_version(self) -> "ServiceConnectorResponse":
    """Get the hydrated version of this service connector.

    Returns:
        an instance of the same entity with the metadata field attached.
    """
    from zenml.client import Client

    return Client().zen_store.get_service_connector(self.id)

set_connector_type(value)

Auxiliary method to set the connector type.

Parameters:

Name Type Description Default
value Union[str, ServiceConnectorTypeModel]

the new value for the connector type.

required
Source code in src/zenml/models/v2/core/service_connector.py
def set_connector_type(
    self, value: Union[str, "ServiceConnectorTypeModel"]
) -> None:
    """Auxiliary method to set the connector type.

    Args:
        value: the new value for the connector type.
    """
    self.get_body().connector_type = value

validate_and_configure_resources(connector_type, resource_types=None, resource_id=None, configuration=None, secrets=None)

Validate and configure the resources that the connector can be used to access.

Parameters:

Name Type Description Default
connector_type ServiceConnectorTypeModel

The connector type specification used to validate the connector configuration.

required
resource_types Optional[Union[str, List[str]]]

The type(s) of resource that the connector instance can be used to access. If omitted, a multi-type connector is configured.

None
resource_id Optional[str]

Uniquely identifies a specific resource instance that the connector instance can be used to access.

None
configuration Optional[Dict[str, Any]]

The connector configuration.

None
secrets Optional[Dict[str, Optional[SecretStr]]]

The connector secrets.

None
Source code in src/zenml/models/v2/core/service_connector.py
def validate_and_configure_resources(
    self,
    connector_type: "ServiceConnectorTypeModel",
    resource_types: Optional[Union[str, List[str]]] = None,
    resource_id: Optional[str] = None,
    configuration: Optional[Dict[str, Any]] = None,
    secrets: Optional[Dict[str, Optional[SecretStr]]] = None,
) -> None:
    """Validate and configure the resources that the connector can be used to access.

    Args:
        connector_type: The connector type specification used to validate
            the connector configuration.
        resource_types: The type(s) of resource that the connector instance
            can be used to access. If omitted, a multi-type connector is
            configured.
        resource_id: Uniquely identifies a specific resource instance that
            the connector instance can be used to access.
        configuration: The connector configuration.
        secrets: The connector secrets.
    """
    _validate_and_configure_resources(
        connector=self,
        connector_type=connector_type,
        resource_types=resource_types,
        resource_id=resource_id,
        configuration=configuration,
        secrets=secrets,
    )
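
For example, a hypothetical sketch of loading a connector response through the client and reading its merged configuration (the connector name is a placeholder and the client helper is assumed):

from zenml.client import Client

connector = Client().get_service_connector("my-aws-connector")  # assumed helper
print(connector.type, connector.auth_method, connector.resource_types)
# full_configuration merges the non-secret configuration with the secret
# values, so treat the result as sensitive.
config = connector.full_configuration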

ServiceState

Bases: StrEnum

Possible states for the service and service endpoint.

Source code in src/zenml/services/service_status.py
class ServiceState(StrEnum):
    """Possible states for the service and service endpoint."""

    INACTIVE = "inactive"
    ACTIVE = "active"
    PENDING_STARTUP = "pending_startup"
    PENDING_SHUTDOWN = "pending_shutdown"
    ERROR = "error"
    SCALED_TO_ZERO = "scaled_to_zero"

Source

Bases: BaseModel

Source specification.

A source specifies a module name as well as an optional attribute of that module. These values can be used to import the module and get the value of the attribute inside the module.

Example

The source Source(module="zenml.config.source", attribute="Source") references the class that this docstring is describing. This class is defined in the zenml.config.source module and the name of the attribute is the class name Source.
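
A small runnable sketch of the same round trip through from_import_path and import_path:

from zenml.config.source import Source

src = Source.from_import_path("zenml.config.source.Source")
assert src.module == "zenml.config.source"
assert src.attribute == "Source"
assert src.import_path == "zenml.config.source.Source"
assert src.is_internal  # the top-level package is "zenml"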

Attributes:

Name Type Description
module str

The module name.

attribute Optional[str]

Optional name of the attribute inside the module.

type SourceType

The type of the source.

Source code in src/zenml/config/source.py
class Source(BaseModel):
    """Source specification.

    A source specifies a module name as well as an optional attribute of that
    module. These values can be used to import the module and get the value
    of the attribute inside the module.

    Example:
        The source `Source(module="zenml.config.source", attribute="Source")`
        references the class that this docstring is describing. This class is
        defined in the `zenml.config.source` module and the name of the
        attribute is the class name `Source`.

    Attributes:
        module: The module name.
        attribute: Optional name of the attribute inside the module.
        type: The type of the source.
    """

    module: str
    attribute: Optional[str] = None
    type: SourceType

    @classmethod
    def from_import_path(
        cls, import_path: str, is_module_path: bool = False
    ) -> "Source":
        """Creates a source from an import path.

        Args:
            import_path: The import path.
            is_module_path: If the import path points to a module or not.

        Raises:
            ValueError: If the import path is empty.

        Returns:
            The source.
        """
        if not import_path:
            raise ValueError(
                "Invalid empty import path. The import path needs to refer "
                "to a Python module and an optional attribute of that module."
            )

        # Remove internal version pins for backwards compatibility
        if "@" in import_path:
            import_path = import_path.split("@", 1)[0]

        if is_module_path or "." not in import_path:
            module = import_path
            attribute = None
        else:
            module, attribute = import_path.rsplit(".", maxsplit=1)

        return Source(
            module=module, attribute=attribute, type=SourceType.UNKNOWN
        )

    @property
    def import_path(self) -> str:
        """The import path of the source.

        Returns:
            The import path of the source.
        """
        if self.attribute:
            return f"{self.module}.{self.attribute}"
        else:
            return self.module

    @property
    def is_internal(self) -> bool:
        """If the source is internal (=from the zenml package).

        Returns:
            True if the source is internal, False otherwise
        """
        if self.type not in {SourceType.UNKNOWN, SourceType.INTERNAL}:
            return False

        return self.module.split(".", maxsplit=1)[0] == "zenml"

    @property
    def is_module_source(self) -> bool:
        """If the source is a module source.

        Returns:
            If the source is a module source.
        """
        return self.attribute is None

    model_config = ConfigDict(extra="allow")

    def model_dump(self, **kwargs: Any) -> Dict[str, Any]:
        """Dump the source as a dictionary.

        Args:
            **kwargs: Additional keyword arguments.

        Returns:
            The source as a dictionary.
        """
        return super().model_dump(serialize_as_any=True, **kwargs)

    def model_dump_json(self, **kwargs: Any) -> str:
        """Dump the source as a JSON string.

        Args:
            **kwargs: Additional keyword arguments.

        Returns:
            The source as a JSON string.
        """
        return super().model_dump_json(serialize_as_any=True, **kwargs)
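
As a rough usage sketch based only on the methods and properties shown above, a Source can be round-tripped from a dotted import path:

    from zenml.config.source import Source

    # The last path segment is treated as the attribute unless
    # is_module_path=True is passed.
    src = Source.from_import_path("zenml.config.source.Source")

    assert src.module == "zenml.config.source"
    assert src.attribute == "Source"
    assert src.import_path == "zenml.config.source.Source"
    assert src.is_internal           # the module lives in the `zenml` package
    assert not src.is_module_source  # an attribute is set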

import_path property

The import path of the source.

Returns:

Type Description
str

The import path of the source.

is_internal property

If the source is internal (=from the zenml package).

Returns:

Type Description
bool

True if the source is internal, False otherwise

is_module_source property

If the source is a module source.

Returns:

Type Description
bool

If the source is a module source.

from_import_path(import_path, is_module_path=False) classmethod

Creates a source from an import path.

Parameters:

Name Type Description Default
import_path str

The import path.

required
is_module_path bool

If the import path points to a module or not.

False

Raises:

Type Description
ValueError

If the import path is empty.

Returns:

Type Description
Source

The source.

Source code in src/zenml/config/source.py
@classmethod
def from_import_path(
    cls, import_path: str, is_module_path: bool = False
) -> "Source":
    """Creates a source from an import path.

    Args:
        import_path: The import path.
        is_module_path: If the import path points to a module or not.

    Raises:
        ValueError: If the import path is empty.

    Returns:
        The source.
    """
    if not import_path:
        raise ValueError(
            "Invalid empty import path. The import path needs to refer "
            "to a Python module and an optional attribute of that module."
        )

    # Remove internal version pins for backwards compatibility
    if "@" in import_path:
        import_path = import_path.split("@", 1)[0]

    if is_module_path or "." not in import_path:
        module = import_path
        attribute = None
    else:
        module, attribute = import_path.rsplit(".", maxsplit=1)

    return Source(
        module=module, attribute=attribute, type=SourceType.UNKNOWN
    )

model_dump(**kwargs)

Dump the source as a dictionary.

Parameters:

Name Type Description Default
**kwargs Any

Additional keyword arguments.

{}

Returns:

Type Description
Dict[str, Any]

The source as a dictionary.

Source code in src/zenml/config/source.py
def model_dump(self, **kwargs: Any) -> Dict[str, Any]:
    """Dump the source as a dictionary.

    Args:
        **kwargs: Additional keyword arguments.

    Returns:
        The source as a dictionary.
    """
    return super().model_dump(serialize_as_any=True, **kwargs)

model_dump_json(**kwargs)

Dump the source as a JSON string.

Parameters:

Name Type Description Default
**kwargs Any

Additional keyword arguments.

{}

Returns:

Type Description
str

The source as a JSON string.

Source code in src/zenml/config/source.py
def model_dump_json(self, **kwargs: Any) -> str:
    """Dump the source as a JSON string.

    Args:
        **kwargs: Additional keyword arguments.

    Returns:
        The source as a JSON string.
    """
    return super().model_dump_json(serialize_as_any=True, **kwargs)

StackComponentType

Bases: StrEnum

All possible types a StackComponent can have.

Source code in src/zenml/enums.py
class StackComponentType(StrEnum):
    """All possible types a `StackComponent` can have."""

    ALERTER = "alerter"
    ANNOTATOR = "annotator"
    ARTIFACT_STORE = "artifact_store"
    CONTAINER_REGISTRY = "container_registry"
    DATA_VALIDATOR = "data_validator"
    EXPERIMENT_TRACKER = "experiment_tracker"
    FEATURE_STORE = "feature_store"
    IMAGE_BUILDER = "image_builder"
    MODEL_DEPLOYER = "model_deployer"
    ORCHESTRATOR = "orchestrator"
    STEP_OPERATOR = "step_operator"
    MODEL_REGISTRY = "model_registry"

    @property
    def plural(self) -> str:
        """Returns the plural of the enum value.

        Returns:
            The plural of the enum value.
        """
        if self == StackComponentType.CONTAINER_REGISTRY:
            return "container_registries"
        elif self == StackComponentType.MODEL_REGISTRY:
            return "model_registries"

        return f"{self.value}s"

plural property

Returns the plural of the enum value.

Returns:

Type Description
str

The plural of the enum value.
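
For instance, assuming the enum is imported from zenml.enums as shown above:

    from zenml.enums import StackComponentType

    assert StackComponentType.ARTIFACT_STORE.plural == "artifact_stores"
    assert StackComponentType.CONTAINER_REGISTRY.plural == "container_registries"
    assert StackComponentType.MODEL_REGISTRY.plural == "model_registries"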

StackDeploymentProvider

Bases: StrEnum

All possible stack deployment providers.

Source code in src/zenml/enums.py
class StackDeploymentProvider(StrEnum):
    """All possible stack deployment providers."""

    AWS = "aws"
    GCP = "gcp"
    AZURE = "azure"

StackFilter

Bases: WorkspaceScopedFilter

Model to enable advanced filtering of all StackModels.

The Stack Model needs additional scoping. As such, the _scope_user field can be set to the user that is doing the filtering. The generate_filter() method of the base class is overridden to include the scoping.

Source code in src/zenml/models/v2/core/stack.py
class StackFilter(WorkspaceScopedFilter):
    """Model to enable advanced filtering of all StackModels.

    The Stack Model needs additional scoping. As such, the `_scope_user` field
    can be set to the user that is doing the filtering. The
    `generate_filter()` method of the base class is overridden to include the
    scoping.
    """

    FILTER_EXCLUDE_FIELDS: ClassVar[List[str]] = [
        *WorkspaceScopedFilter.FILTER_EXCLUDE_FIELDS,
        "component_id",
        "component",
    ]

    name: Optional[str] = Field(
        default=None,
        description="Name of the stack",
    )
    description: Optional[str] = Field(
        default=None, description="Description of the stack"
    )
    component_id: Optional[Union[UUID, str]] = Field(
        default=None,
        description="Component in the stack",
        union_mode="left_to_right",
    )
    component: Optional[Union[UUID, str]] = Field(
        default=None, description="Name/ID of a component in the stack."
    )

    def get_custom_filters(
        self, table: Type["AnySchema"]
    ) -> List["ColumnElement[bool]"]:
        """Get custom filters.

        Args:
            table: The query table.

        Returns:
            A list of custom filters.
        """
        custom_filters = super().get_custom_filters(table)

        from zenml.zen_stores.schemas import (
            StackComponentSchema,
            StackCompositionSchema,
            StackSchema,
        )

        if self.component_id:
            component_id_filter = and_(
                StackCompositionSchema.stack_id == StackSchema.id,
                StackCompositionSchema.component_id == self.component_id,
            )
            custom_filters.append(component_id_filter)

        if self.component:
            component_filter = and_(
                StackCompositionSchema.stack_id == StackSchema.id,
                StackCompositionSchema.component_id == StackComponentSchema.id,
                self.generate_name_or_id_query_conditions(
                    value=self.component,
                    table=StackComponentSchema,
                ),
            )
            custom_filters.append(component_filter)

        return custom_filters
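
A minimal construction sketch, assuming the model is imported straight from the module shown above; the stack and component names are placeholders:

    from zenml.models.v2.core.stack import StackFilter

    # Find stacks named "production" that contain a component named
    # "s3_artifact_store" (matched by name or ID via the custom filters).
    stack_filter = StackFilter(name="production", component="s3_artifact_store")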

get_custom_filters(table)

Get custom filters.

Parameters:

Name Type Description Default
table Type[AnySchema]

The query table.

required

Returns:

Type Description
List[ColumnElement[bool]]

A list of custom filters.

Source code in src/zenml/models/v2/core/stack.py
def get_custom_filters(
    self, table: Type["AnySchema"]
) -> List["ColumnElement[bool]"]:
    """Get custom filters.

    Args:
        table: The query table.

    Returns:
        A list of custom filters.
    """
    custom_filters = super().get_custom_filters(table)

    from zenml.zen_stores.schemas import (
        StackComponentSchema,
        StackCompositionSchema,
        StackSchema,
    )

    if self.component_id:
        component_id_filter = and_(
            StackCompositionSchema.stack_id == StackSchema.id,
            StackCompositionSchema.component_id == self.component_id,
        )
        custom_filters.append(component_id_filter)

    if self.component:
        component_filter = and_(
            StackCompositionSchema.stack_id == StackSchema.id,
            StackCompositionSchema.component_id == StackComponentSchema.id,
            self.generate_name_or_id_query_conditions(
                value=self.component,
                table=StackComponentSchema,
            ),
        )
        custom_filters.append(component_filter)

    return custom_filters

StackRequest

Bases: BaseRequest

Request model for a stack.

Source code in src/zenml/models/v2/core/stack.py
class StackRequest(BaseRequest):
    """Request model for a stack."""

    user: Optional[UUID] = None
    workspace: Optional[UUID] = None

    name: str = Field(
        title="The name of the stack.", max_length=STR_FIELD_MAX_LENGTH
    )
    description: str = Field(
        default="",
        title="The description of the stack",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    stack_spec_path: Optional[str] = Field(
        default=None,
        title="The path to the stack spec used for mlstacks deployments.",
    )
    components: Dict[StackComponentType, List[Union[UUID, ComponentInfo]]] = (
        Field(
            title="The mapping for the components of the full stack registration.",
            description="The mapping from component types to either UUIDs of "
            "existing components or request information for brand new "
            "components.",
        )
    )
    labels: Optional[Dict[str, Any]] = Field(
        default=None,
        title="The stack labels.",
    )
    service_connectors: List[Union[UUID, ServiceConnectorInfo]] = Field(
        default=[],
        title="The service connectors dictionary for the full stack "
        "registration.",
        description="The UUID of an already existing service connector or "
        "request information to create a service connector from "
        "scratch.",
    )

    @property
    def is_valid(self) -> bool:
        """Check if the stack is valid.

        Returns:
            True if the stack is valid, False otherwise.
        """
        if not self.components:
            return False
        return (
            StackComponentType.ARTIFACT_STORE in self.components
            and StackComponentType.ORCHESTRATOR in self.components
        )

    @model_validator(mode="after")
    def _validate_indexes_in_components(self) -> "StackRequest":
        for components in self.components.values():
            for component in components:
                if isinstance(component, ComponentInfo):
                    if component.service_connector_index is not None:
                        if (
                            component.service_connector_index < 0
                            or component.service_connector_index
                            >= len(self.service_connectors)
                        ):
                            raise ValueError(
                                f"Service connector index "
                                f"{component.service_connector_index} "
                                "is out of range. Please provide a valid index "
                                "referring to the position in the list of service "
                                "connectors."
                            )
        return self

is_valid property

Check if the stack is valid.

Returns:

Type Description
bool

True if the stack is valid, False otherwise.
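
A minimal sketch of the validity check, assuming existing components are referenced by UUID (the UUIDs below are random placeholders):

    from uuid import uuid4

    from zenml.enums import StackComponentType
    from zenml.models.v2.core.stack import StackRequest

    request = StackRequest(
        name="prod-stack",
        components={
            StackComponentType.ORCHESTRATOR: [uuid4()],
            StackComponentType.ARTIFACT_STORE: [uuid4()],
        },
    )
    # Valid: both an artifact store and an orchestrator are present.
    assert request.is_valid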

StoreType

Bases: StrEnum

Zen Store Backend Types.

Source code in src/zenml/enums.py
class StoreType(StrEnum):
    """Zen Store Backend Types."""

    SQL = "sql"
    REST = "rest"

TagFilter

Bases: BaseFilter

Model to enable advanced filtering of all tags.

Source code in src/zenml/models/v2/core/tag.py
class TagFilter(BaseFilter):
    """Model to enable advanced filtering of all tags."""

    name: Optional[str] = Field(
        description="The unique title of the tag.", default=None
    )
    color: Optional[ColorVariants] = Field(
        description="The color variant assigned to the tag.", default=None
    )

TagRequest

Bases: BaseRequest

Request model for tags.

Source code in src/zenml/models/v2/core/tag.py
class TagRequest(BaseRequest):
    """Request model for tags."""

    name: str = Field(
        description="The unique title of the tag.",
        max_length=STR_FIELD_MAX_LENGTH,
    )
    color: ColorVariants = Field(
        description="The color variant assigned to the tag.",
        default_factory=lambda: random.choice(list(ColorVariants)),
    )
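
For example, omitting the color assigns a random ColorVariants member via the default factory:

    from zenml.models.v2.core.tag import TagRequest

    tag = TagRequest(name="nightly")
    print(tag.color)  # a randomly chosen ColorVariants member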

TagUpdate

Bases: BaseModel

Update model for tags.

Source code in src/zenml/models/v2/core/tag.py
class TagUpdate(BaseModel):
    """Update model for tags."""

    name: Optional[str] = None
    color: Optional[ColorVariants] = None

UserFilter

Bases: BaseFilter

Model to enable advanced filtering of all Users.

Source code in src/zenml/models/v2/core/user.py
class UserFilter(BaseFilter):
    """Model to enable advanced filtering of all Users."""

    name: Optional[str] = Field(
        default=None,
        description="Name of the user",
    )
    full_name: Optional[str] = Field(
        default=None,
        description="Full Name of the user",
    )
    email: Optional[str] = Field(
        default=None,
        description="Email of the user",
    )
    active: Optional[Union[bool, str]] = Field(
        default=None,
        description="Whether the user is active",
        union_mode="left_to_right",
    )
    email_opted_in: Optional[Union[bool, str]] = Field(
        default=None,
        description="Whether the user has opted in to emails",
        union_mode="left_to_right",
    )
    external_user_id: Optional[Union[UUID, str]] = Field(
        default=None,
        title="The external user ID associated with the account.",
        union_mode="left_to_right",
    )

    def apply_filter(
        self,
        query: AnyQuery,
        table: Type["AnySchema"],
    ) -> AnyQuery:
        """Override to filter out service accounts from the query.

        Args:
            query: The query to which to apply the filter.
            table: The query table.

        Returns:
            The query with filter applied.
        """
        query = super().apply_filter(query=query, table=table)
        query = query.where(
            getattr(table, "is_service_account") != True  # noqa: E712
        )

        return query
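
A construction sketch; note that apply_filter() always excludes service accounts on top of whatever is set here:

    from zenml.models.v2.core.user import UserFilter

    user_filter = UserFilter(active=True, email_opted_in=True)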

apply_filter(query, table)

Override to filter out service accounts from the query.

Parameters:

Name Type Description Default
query AnyQuery

The query to which to apply the filter.

required
table Type[AnySchema]

The query table.

required

Returns:

Type Description
AnyQuery

The query with filter applied.

Source code in src/zenml/models/v2/core/user.py
def apply_filter(
    self,
    query: AnyQuery,
    table: Type["AnySchema"],
) -> AnyQuery:
    """Override to filter out service accounts from the query.

    Args:
        query: The query to which to apply the filter.
        table: The query table.

    Returns:
        The query with filter applied.
    """
    query = super().apply_filter(query=query, table=table)
    query = query.where(
        getattr(table, "is_service_account") != True  # noqa: E712
    )

    return query

WorkspaceFilter

Bases: BaseFilter

Model to enable advanced filtering of all Workspaces.

Source code in src/zenml/models/v2/core/workspace.py
class WorkspaceFilter(BaseFilter):
    """Model to enable advanced filtering of all Workspaces."""

    name: Optional[str] = Field(
        default=None,
        description="Name of the workspace",
    )

ZenKeyError

Bases: KeyError

Specialized key error which allows error messages with line breaks.

Source code in src/zenml/exceptions.py
class ZenKeyError(KeyError):
    """Specialized key error which allows error messages with line breaks."""

    def __init__(self, message: str) -> None:
        """Initialization.

        Args:
            message: The error message.
        """
        self.message = message

    def __str__(self) -> str:
        """String function.

        Returns:
            the error message
        """
        return self.message
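
A small sketch of why this exists: unlike a plain KeyError, the rendered message keeps its line breaks:

    from zenml.exceptions import ZenKeyError

    try:
        raise ZenKeyError(
            "No stack found with that name.\n"
            "Closest matches: 'default', 'production'."
        )
    except ZenKeyError as err:
        print(err)  # prints both lines; a plain KeyError would quote the message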

__init__(message)

Initialization.

Parameters:

Name Type Description Default
message str

The error message.

required
Source code in src/zenml/exceptions.py
def __init__(self, message: str) -> None:
    """Initialization.

    Args:
        message: The error message.
    """
    self.message = message

__str__()

String function.

Returns:

Type Description
str

the error message

Source code in src/zenml/exceptions.py
def __str__(self) -> str:
    """String function.

    Returns:
        the error message
    """
    return self.message

ZenMLProjectTemplateLocation

Bases: BaseModel

A ZenML project template location.

Source code in src/zenml/cli/base.py
class ZenMLProjectTemplateLocation(BaseModel):
    """A ZenML project template location."""

    github_url: str
    github_tag: str

    @property
    def copier_github_url(self) -> str:
        """Get the GitHub URL for the copier.

        Returns:
            A GitHub URL in copier format.
        """
        return f"gh:{self.github_url}"

copier_github_url property

Get the GitHub URL for the copier.

Returns:

Type Description
str

A GitHub URL in copier format.
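
For illustration (the repository slug and tag below are placeholders, not guaranteed to exist):

    from zenml.cli.base import ZenMLProjectTemplateLocation

    template = ZenMLProjectTemplateLocation(
        github_url="zenml-io/template-starter", github_tag="2024.01.01"
    )
    assert template.copier_github_url == "gh:zenml-io/template-starter"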

track_handler

Bases: object

Context handler to enable tracking the success status of an event.

Source code in src/zenml/analytics/utils.py
class track_handler(object):
    """Context handler to enable tracking the success status of an event."""

    def __init__(
        self,
        event: AnalyticsEvent,
        metadata: Optional[Dict[str, Any]] = None,
    ):
        """Initialization of the context manager.

        Args:
            event: The type of the analytics event
            metadata: The metadata of the event.
        """
        self.event: AnalyticsEvent = event
        self.metadata: Dict[str, Any] = metadata or {}

    def __enter__(self) -> "track_handler":
        """Enter function of the event handler.

        Returns:
            the handler instance.
        """
        return self

    def __exit__(
        self,
        type_: Optional[Any],
        value: Optional[Any],
        traceback: Optional[Any],
    ) -> Any:
        """Exit function of the event handler.

        Checks whether there was a traceback and updates the metadata
        accordingly. Following the check, it calls the function to track the
        event.

        Args:
            type_: The class of the exception
            value: The instance of the exception
            traceback: The traceback of the exception

        """
        if traceback is not None:
            self.metadata.update({"event_success": False})
        else:
            self.metadata.update({"event_success": True})

        if type_ is not None:
            self.metadata.update({"event_error_type": type_.__name__})

        track(self.event, self.metadata)
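
A usage sketch; the AnalyticsEvent import path and member name below are assumptions and may differ in your ZenML version:

    from zenml.analytics.enums import AnalyticsEvent  # assumed module path
    from zenml.analytics.utils import track_handler

    # Assumed event member; substitute any real AnalyticsEvent value.
    with track_handler(AnalyticsEvent.OPT_IN_ANALYTICS) as handler:
        handler.metadata["source"] = "docs-example"
    # On exit, event_success (and event_error_type on failure) is added to
    # the metadata and the event is tracked.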

__enter__()

Enter function of the event handler.

Returns:

Type Description
track_handler

the handler instance.

Source code in src/zenml/analytics/utils.py
def __enter__(self) -> "track_handler":
    """Enter function of the event handler.

    Returns:
        the handler instance.
    """
    return self

__exit__(type_, value, traceback)

Exit function of the event handler.

Checks whether there was a traceback and updates the metadata accordingly. Following the check, it calls the function to track the event.

Parameters:

Name Type Description Default
type_ Optional[Any]

The class of the exception

required
value Optional[Any]

The instance of the exception

required
traceback Optional[Any]

The traceback of the exception

required
Source code in src/zenml/analytics/utils.py
def __exit__(
    self,
    type_: Optional[Any],
    value: Optional[Any],
    traceback: Optional[Any],
) -> Any:
    """Exit function of the event handler.

    Checks whether there was a traceback and updates the metadata
    accordingly. Following the check, it calls the function to track the
    event.

    Args:
        type_: The class of the exception
        value: The instance of the exception
        traceback: The traceback of the exception

    """
    if traceback is not None:
        self.metadata.update({"event_success": False})
    else:
        self.metadata.update({"event_success": True})

    if type_ is not None:
        self.metadata.update({"event_error_type": type_.__name__})

    track(self.event, self.metadata)

__init__(event, metadata=None)

Initialization of the context manager.

Parameters:

Name Type Description Default
event AnalyticsEvent

The type of the analytics event

required
metadata Optional[Dict[str, Any]]

The metadata of the event.

None
Source code in src/zenml/analytics/utils.py
def __init__(
    self,
    event: AnalyticsEvent,
    metadata: Optional[Dict[str, Any]] = None,
):
    """Initialization of the context manager.

    Args:
        event: The type of the analytics event
        metadata: The metadata of the event.
    """
    self.event: AnalyticsEvent = event
    self.metadata: Dict[str, Any] = metadata or {}

analytics()

Analytics for opt-in and opt-out.

Source code in src/zenml/cli/config.py
@cli.group(cls=TagGroup, tag=CliCategories.MANAGEMENT_TOOLS)
def analytics() -> None:
    """Analytics for opt-in and opt-out."""

api_key(ctx, service_account_name_or_id)

List and manage the API keys associated with a service account.

Parameters:

Name Type Description Default
ctx Context

The click context.

required
service_account_name_or_id str

The name or ID of the service account.

required
Source code in src/zenml/cli/service_accounts.py
@service_account.group(
    cls=TagGroup,
    help="Commands for interacting with API keys.",
)
@click.pass_context
@click.argument("service_account_name_or_id", type=str, required=True)
def api_key(
    ctx: click.Context,
    service_account_name_or_id: str,
) -> None:
    """List and manage the API keys associated with a service account.

    Args:
        ctx: The click context.
        service_account_name_or_id: The name or ID of the service account.
    """
    ctx.obj = service_account_name_or_id

artifact()

Commands for interacting with artifacts.

Source code in src/zenml/cli/artifact.py
@cli.group(cls=TagGroup, tag=CliCategories.MANAGEMENT_TOOLS)
def artifact() -> None:
    """Commands for interacting with artifacts."""

authorized_device()

Interact with authorized devices.

Source code in src/zenml/cli/authorized_device.py
@cli.group(cls=TagGroup, tag=CliCategories.MANAGEMENT_TOOLS)
def authorized_device() -> None:
    """Interact with authorized devices."""

backup_database(strategy=None, location=None, overwrite=False)

Backup the ZenML database.

Parameters:

Name Type Description Default
strategy Optional[str]

Custom backup strategy to use. Defaults to whatever is configured in the store config.

None
location Optional[str]

Custom location to store the backup. Defaults to whatever is configured in the store config. Depending on the strategy, this can be a local path or a database name.

None
overwrite bool

Whether to overwrite the existing backup.

False
Source code in src/zenml/cli/base.py
@cli.command("backup-database", help="Create a database backup.", hidden=True)
@click.option(
    "--strategy",
    "-s",
    help="Custom backup strategy to use. Defaults to whatever is configured "
    "in the store config.",
    type=click.Choice(choices=DatabaseBackupStrategy.values()),
    required=False,
    default=None,
)
@click.option(
    "--location",
    default=None,
    help="Custom location to store the backup. Defaults to whatever is "
    "configured in the store config. Depending on the strategy, this can be "
    "a local path or a database name.",
    type=str,
)
@click.option(
    "--overwrite",
    "-o",
    is_flag=True,
    default=False,
    help="Overwrite the existing backup.",
    type=bool,
)
def backup_database(
    strategy: Optional[str] = None,
    location: Optional[str] = None,
    overwrite: bool = False,
) -> None:
    """Backup the ZenML database.

    Args:
        strategy: Custom backup strategy to use. Defaults to whatever is
            configured in the store config.
        location: Custom location to store the backup. Defaults to whatever is
            configured in the store config. Depending on the strategy, this can
            be a local path or a database name.
        overwrite: Whether to overwrite the existing backup.
    """
    from zenml.zen_stores.base_zen_store import BaseZenStore
    from zenml.zen_stores.sql_zen_store import SqlZenStore

    store_config = GlobalConfiguration().store_configuration
    if store_config.type == StoreType.SQL:
        store = BaseZenStore.create_store(
            store_config, skip_default_registrations=True, skip_migrations=True
        )
        assert isinstance(store, SqlZenStore)
        msg, location = store.backup_database(
            strategy=DatabaseBackupStrategy(strategy) if strategy else None,
            location=location,
            overwrite=overwrite,
        )
        cli_utils.declare(f"Database was backed up to {msg}.")
    else:
        cli_utils.warning(
            "Cannot backup database while connected to a ZenML server."
        )
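
For example, with a local SQL store (the location below is just an illustration of a file-based backup target):

   zenml backup-database --location /tmp/zenml-backup --overwrite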

backup_secrets(ignore_errors=True, delete_secrets=False)

Backup all secrets to the backup secrets store.

Parameters:

Name Type Description Default
ignore_errors bool

Whether to ignore individual errors when backing up secrets and continue with the backup operation until all secrets have been backed up.

True
delete_secrets bool

Whether to delete the secrets that have been successfully backed up from the primary secrets store. Setting this flag effectively moves all secrets from the primary secrets store to the backup secrets store.

False
Source code in src/zenml/cli/secret.py
@secret.command(
    "backup", help="Backup all secrets to the backup secrets store."
)
@click.option(
    "--ignore-errors",
    "-i",
    type=click.BOOL,
    default=True,
    help="Whether to ignore individual errors when backing up secrets and "
    "continue with the backup operation until all secrets have been backed up.",
)
@click.option(
    "--delete-secrets",
    "-d",
    is_flag=True,
    default=False,
    help="Whether to delete the secrets that have been successfully backed up "
    "from the primary secrets store. Setting this flag effectively moves all "
    "secrets from the primary secrets store to the backup secrets store.",
)
def backup_secrets(
    ignore_errors: bool = True, delete_secrets: bool = False
) -> None:
    """Backup all secrets to the backup secrets store.

    Args:
        ignore_errors: Whether to ignore individual errors when backing up
            secrets and continue with the backup operation until all secrets
            have been backed up.
        delete_secrets: Whether to delete the secrets that have been
            successfully backed up from the primary secrets store. Setting
            this flag effectively moves all secrets from the primary secrets
            store to the backup secrets store.
    """
    client = Client()

    with console.status("Backing up secrets..."):
        try:
            client.backup_secrets(
                ignore_errors=ignore_errors, delete_secrets=delete_secrets
            )
            declare("Secrets successfully backed up.")
        except NotImplementedError as e:
            error(f"Could not backup secrets: {str(e)}")

build_pipeline(source, config_path=None, stack_name_or_id=None, output_path=None)

Build Docker images for a pipeline.

Parameters:

Name Type Description Default
source str

Importable source resolving to a pipeline instance.

required
config_path Optional[str]

Path to pipeline configuration file.

None
stack_name_or_id Optional[str]

Name or ID of the stack for which the images should be built.

None
output_path Optional[str]

Optional file path to write the output to.

None
Source code in src/zenml/cli/pipeline.py
@pipeline.command(
    "build",
    help="Build Docker images for a pipeline. The SOURCE argument needs to be "
    " an importable source path resolving to a ZenML pipeline instance, e.g. "
    "`my_module.my_pipeline_instance`.",
)
@click.argument("source")
@click.option(
    "--config",
    "-c",
    "config_path",
    type=click.Path(exists=True, dir_okay=False),
    required=False,
    help="Path to configuration file for the build.",
)
@click.option(
    "--stack",
    "-s",
    "stack_name_or_id",
    type=str,
    required=False,
    help="Name or ID of the stack to use for the build.",
)
@click.option(
    "--output",
    "-o",
    "output_path",
    type=click.Path(exists=False, dir_okay=False),
    required=False,
    help="Output path for the build information.",
)
def build_pipeline(
    source: str,
    config_path: Optional[str] = None,
    stack_name_or_id: Optional[str] = None,
    output_path: Optional[str] = None,
) -> None:
    """Build Docker images for a pipeline.

    Args:
        source: Importable source resolving to a pipeline instance.
        config_path: Path to pipeline configuration file.
        stack_name_or_id: Name or ID of the stack for which the images should
            be built.
        output_path: Optional file path to write the output to.
    """
    if not Client().root:
        cli_utils.warning(
            "You're running the `zenml pipeline build` command without a "
            "ZenML repository. Your current working directory will be used "
            "as the source root relative to which the registered step classes "
            "will be resolved. To silence this warning, run `zenml init` at "
            "your source code root."
        )

    with cli_utils.temporary_active_stack(stack_name_or_id=stack_name_or_id):
        pipeline_instance = _import_pipeline(source=source)

        pipeline_instance = pipeline_instance.with_options(
            config_path=config_path
        )
        build = pipeline_instance.build()

    if build:
        cli_utils.declare(f"Created pipeline build `{build.id}`.")

        if output_path:
            cli_utils.declare(
                f"Writing pipeline build output to `{output_path}`."
            )
            write_yaml(output_path, build.to_yaml())
    else:
        cli_utils.declare("No docker builds required.")

builds()

Commands for pipeline builds.

Source code in src/zenml/cli/pipeline.py
@pipeline.group()
def builds() -> None:
    """Commands for pipeline builds."""

change_user_password(password=None, old_password=None)

Change the password of the current user.

Parameters:

Name Type Description Default
password Optional[str]

The new password for the current user.

None
old_password Optional[str]

The old password for the current user.

None
Source code in src/zenml/cli/user_management.py
@user.command(
    "change-password",
    help="Change the password for the current user account.",
)
@click.option(
    "--password",
    help=(
        "The new user password. If omitted, a prompt will be shown to enter "
        "the password."
    ),
    required=False,
    type=str,
)
@click.option(
    "--old-password",
    help=(
        "The old user password. If omitted, a prompt will be shown to enter "
        "the old password."
    ),
    required=False,
    type=str,
)
def change_user_password(
    password: Optional[str] = None, old_password: Optional[str] = None
) -> None:
    """Change the password of the current user.

    Args:
        password: The new password for the current user.
        old_password: The old password for the current user.
    """
    active_user = Client().active_user

    if old_password is not None or password is not None:
        cli_utils.warning(
            "Supplying password values in the command line is not safe. "
            "Please consider using the prompt option."
        )

    if old_password is None:
        old_password = click.prompt(
            f"Current password for user {active_user.name}",
            hide_input=True,
        )
    if password is None:
        password = click.prompt(
            f"New password for user {active_user.name}",
            hide_input=True,
        )
        password_again = click.prompt(
            f"Please re-enter the new password for user {active_user.name}",
            hide_input=True,
        )
        if password != password_again:
            cli_utils.error("Passwords do not match.")

    try:
        Client().update_user(
            name_id_or_prefix=active_user.id,
            old_password=old_password,
            updated_password=password,
        )
    except (KeyError, IllegalOperationError, AuthorizationException) as err:
        cli_utils.error(str(err))

    cli_utils.declare(
        f"Successfully updated password for active user '{active_user.name}'."
    )
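
The safest invocation omits both options, so the old and new passwords are collected via hidden prompts:

   zenml user change-password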

clean(yes=False, local=False)

Delete all ZenML metadata, artifacts and stacks.

This is a destructive operation, primarily intended for use in development.

Parameters:

Name Type Description Default
yes bool

If you don't want a confirmation prompt.

False
local bool

If you want to delete local files associated with the active stack.

False
Source code in src/zenml/cli/base.py
@cli.command(
    "clean",
    hidden=True,
    help="Delete all ZenML metadata, artifacts and stacks.",
)
@click.option(
    "--yes",
    "-y",
    is_flag=True,
    default=False,
    help="Don't ask for confirmation.",
)
@click.option(
    "--local",
    "-l",
    is_flag=True,
    default=False,
    help="Delete local files relating to the active stack.",
)
def clean(yes: bool = False, local: bool = False) -> None:
    """Delete all ZenML metadata, artifacts and stacks.

    This is a destructive operation, primarily intended for use in development.

    Args:
        yes: If you don't want a confirmation prompt.
        local: If you want to delete local files associated with the active
            stack.
    """
    if local:
        curr_version = version.parse(zenml_version)

        global_version = GlobalConfiguration().version
        if global_version is not None:
            config_version = version.parse(global_version)

            if config_version > curr_version:
                error(
                    "Due to this version mismatch, ZenML can not detect and "
                    "shut down any running dashboards or clean any resources "
                    "related to the active stack."
                )
        _delete_local_files(force_delete=yes)
        return

    confirm = None
    if not yes:
        confirm = confirmation(
            "DANGER: This will completely delete all artifacts, metadata and "
            "stacks \never created during the use of ZenML. Pipelines and "
            "stack components running non-\nlocally will still exist. Please "
            "delete those manually. \n\nAre you sure you want to proceed?"
        )

    if yes or confirm:
        server = get_local_server()

        if server:
            from zenml.zen_server.deploy.deployer import LocalServerDeployer

            deployer = LocalServerDeployer()
            deployer.remove_server()
            cli_utils.declare("The local ZenML dashboard has been shut down.")

        # delete the .zen folder
        local_zen_repo_config = Path.cwd() / REPOSITORY_DIRECTORY_NAME
        if fileio.exists(str(local_zen_repo_config)):
            fileio.rmtree(str(local_zen_repo_config))
            declare(
                f"Deleted local ZenML config from {local_zen_repo_config}."
            )

        # delete the zen store and all other files and directories used by ZenML
        # to persist information locally (e.g. artifacts)
        global_zen_config = Path(get_global_config_directory())
        if fileio.exists(str(global_zen_config)):
            gc = GlobalConfiguration()
            for dir_name in fileio.listdir(str(global_zen_config)):
                if fileio.isdir(str(global_zen_config / str(dir_name))):
                    warning(
                        f"Deleting '{str(dir_name)}' directory from global "
                        f"config."
                    )
            fileio.rmtree(str(global_zen_config))
            declare(f"Deleted global ZenML config from {global_zen_config}.")
            GlobalConfiguration._reset_instance()
            fresh_gc = GlobalConfiguration(
                user_id=gc.user_id,
                analytics_opt_in=gc.analytics_opt_in,
                version=zenml_version,
            )
            fresh_gc.set_default_store()
            declare(f"Reinitialized ZenML global config at {Path.cwd()}.")

    else:
        declare("Aborting clean.")

code_repository()

Interact with code repositories.

Source code in src/zenml/cli/code_repository.py
@cli.group(cls=TagGroup, tag=CliCategories.MANAGEMENT_TOOLS)
def code_repository() -> None:
    """Interact with code repositories."""

confirmation(text, *args, **kwargs)

Echo a confirmation string on the CLI.

Parameters:

Name Type Description Default
text str

Input text string.

required
*args Any

Args to be passed to click.confirm().

()
**kwargs Any

Kwargs to be passed to click.confirm().

{}

Returns:

Type Description
bool

Boolean based on user response.

Source code in src/zenml/cli/utils.py
def confirmation(text: str, *args: Any, **kwargs: Any) -> bool:
    """Echo a confirmation string on the CLI.

    Args:
        text: Input text string.
        *args: Args to be passed to click.confirm().
        **kwargs: Kwargs to be passed to click.confirm().

    Returns:
        Boolean based on user response.
    """
    return Confirm.ask(text, console=console)
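
A minimal sketch, assuming the helper is imported from the module shown above:

    from zenml.cli.utils import confirmation

    if confirmation("Delete the 'dev' stack?"):
        ...  # proceed only on a positive answer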

connect(url=None, username=None, password=None, api_key=None, no_verify_ssl=False, ssl_ca_cert=None)

Connect to a remote ZenML server.

Parameters:

Name Type Description Default
url Optional[str]

The URL where the ZenML server is reachable.

None
username Optional[str]

The username that is used to authenticate with the ZenML server.

None
password Optional[str]

The password that is used to authenticate with the ZenML server.

None
api_key Optional[str]

The API key that is used to authenticate with the ZenML server.

None
no_verify_ssl bool

Whether to verify the server's TLS certificate.

False
ssl_ca_cert Optional[str]

A path to a CA bundle to use to verify the server's TLS certificate or the CA bundle value itself.

None
Source code in src/zenml/cli/server.py
@cli.command(
    "connect",
    help=(
        """Connect to a remote ZenML server.

    DEPRECATED: Please use `zenml login` instead.

    Examples:

      * to re-login to the current ZenML server or connect to a ZenML Pro server:

        zenml connect

      * to log in to a particular ZenML server:

        zenml connect --url=http://zenml.example.com:8080
    """
    ),
)
@click.option(
    "--url",
    "-u",
    help="The URL where the ZenML server is running.",
    required=False,
    type=str,
)
@click.option(
    "--username",
    help="(Deprecated) The username that is used to authenticate with a ZenML "
    "server. If omitted, the web login will be used.",
    required=False,
    type=str,
)
@click.option(
    "--password",
    help="(Deprecated) The password that is used to authenticate with a ZenML "
    "server. If omitted, a prompt will be shown to enter the password.",
    required=False,
    type=str,
)
@click.option(
    "--api-key",
    help="Use an API key to authenticate with a ZenML server. If "
    "omitted, the web login will be used.",
    required=False,
    type=str,
)
@click.option(
    "--no-verify-ssl",
    is_flag=True,
    help="Whether to verify the server's TLS certificate",
    default=False,
)
@click.option(
    "--ssl-ca-cert",
    help="A path to a CA bundle file to use to verify the server's TLS "
    "certificate or the CA bundle value itself",
    required=False,
    type=str,
)
def connect(
    url: Optional[str] = None,
    username: Optional[str] = None,
    password: Optional[str] = None,
    api_key: Optional[str] = None,
    no_verify_ssl: bool = False,
    ssl_ca_cert: Optional[str] = None,
) -> None:
    """Connect to a remote ZenML server.

    Args:
        url: The URL where the ZenML server is reachable.
        username: The username that is used to authenticate with the ZenML
            server.
        password: The password that is used to authenticate with the ZenML
            server.
        api_key: The API key that is used to authenticate with the ZenML
            server.
        no_verify_ssl: Whether to verify the server's TLS certificate.
        ssl_ca_cert: A path to a CA bundle to use to verify the server's TLS
            certificate or the CA bundle value itself.
    """
    cli_utils.warning(
        "The `zenml connect` command is deprecated and will be removed in a "
        "future release. Please use the `zenml login` command instead. "
    )

    if password is not None or username is not None:
        cli_utils.warning(
            "Connecting to a ZenML server using a username and password is "
            "insecure because the password is locally stored on your "
            "filesystem and is no longer supported. The web login workflow will "
            "be used instead. An alternative for non-interactive environments "
            "is to create and use a service account API key (see "
            "https://docs.zenml.io/how-to/connecting-to-zenml/connect-with-a-service-account "
            "for more information)."
        )

    # Calling the `zenml login` command
    cli_utils.declare("Calling `zenml login`...")
    login.callback(  # type: ignore[misc]
        server=url,
        api_key=api_key,
        no_verify_ssl=no_verify_ssl,
        ssl_ca_cert=ssl_ca_cert,
    )

connect_stack(stack_name_or_id=None, connector=None, interactive=False, no_verify=False)

Connect a service-connector to all components of a stack.

Parameters:

Name Type Description Default
stack_name_or_id Optional[str]

Name of the stack to connect.

None
connector Optional[str]

The name, ID or prefix of the connector to use.

None
interactive bool

Configure a service connector resource interactively.

False
no_verify bool

Skip verification of the connector resource.

False
Source code in src/zenml/cli/stack.py
@stack.command(
    "connect",
    help="Connect a service-connector to a stack's components. "
    "Note that this only connects the service-connector to the current "
    "components of the stack and not to the stack itself, which means that "
    "you need to rerun the command after adding new components to the stack.",
)
@click.argument("stack_name_or_id", type=str, required=False)
@click.option(
    "--connector",
    "-c",
    "connector",
    help="The name, ID or prefix of the connector to use.",
    required=False,
    type=str,
)
@click.option(
    "--interactive",
    "-i",
    "interactive",
    is_flag=True,
    default=False,
    help="Configure a service connector resource interactively.",
    type=click.BOOL,
)
@click.option(
    "--no-verify",
    "no_verify",
    is_flag=True,
    default=False,
    help="Skip verification of the connector resource.",
    type=click.BOOL,
)
def connect_stack(
    stack_name_or_id: Optional[str] = None,
    connector: Optional[str] = None,
    interactive: bool = False,
    no_verify: bool = False,
) -> None:
    """Connect a service-connector to all components of a stack.

    Args:
        stack_name_or_id: Name of the stack to connect.
        connector: The name, ID or prefix of the connector to use.
        interactive: Configure a service connector resource interactively.
        no_verify: Skip verification of the connector resource.
    """
    from zenml.cli.stack_components import (
        connect_stack_component_with_service_connector,
    )

    client = Client()
    stack_to_connect = client.get_stack(name_id_or_prefix=stack_name_or_id)
    for component in stack_to_connect.components.values():
        connect_stack_component_with_service_connector(
            component_type=component[0].type,
            name_id_or_prefix=component[0].name,
            connector=connector,
            interactive=interactive,
            no_verify=no_verify,
        )
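
Illustrative invocations (the stack and connector names are placeholders); remember to rerun the command after adding new components to the stack:

   zenml stack connect my_stack --connector my_aws_connector
   zenml stack connect my_stack --interactive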

connect_stack_component_with_service_connector(component_type, name_id_or_prefix=None, connector=None, resource_id=None, interactive=False, no_verify=False)

Connect the stack component to a resource through a service connector.

Parameters:

Name Type Description Default
component_type StackComponentType

Type of the component to generate the command for.

required
name_id_or_prefix Optional[str]

The name of the stack component to connect.

None
connector Optional[str]

The name, ID or prefix of the connector to use.

None
resource_id Optional[str]

The resource ID to connect to. Only required for multi-instance connectors that are not already configured with a particular resource ID.

None
interactive bool

Configure a service connector resource interactively.

False
no_verify bool

Do not verify whether the resource is accessible.

False
Source code in src/zenml/cli/stack_components.py
def connect_stack_component_with_service_connector(
    component_type: StackComponentType,
    name_id_or_prefix: Optional[str] = None,
    connector: Optional[str] = None,
    resource_id: Optional[str] = None,
    interactive: bool = False,
    no_verify: bool = False,
) -> None:
    """Connect the stack component to a resource through a service connector.

    Args:
        component_type: Type of the component to generate the command for.
        name_id_or_prefix: The name of the stack component to connect.
        connector: The name, ID or prefix of the connector to use.
        resource_id: The resource ID to connect to. Only
            required for multi-instance connectors that are not already
            configured with a particular resource ID.
        interactive: Configure a service connector resource interactively.
        no_verify: Do not verify whether the resource is accessible.
    """
    display_name = _component_display_name(component_type)

    if not connector and not interactive:
        cli_utils.error(
            "Please provide either a connector ID or set the interactive flag."
        )

    if connector and interactive:
        cli_utils.error(
            "Please provide either a connector ID or set the interactive "
            "flag, not both."
        )

    client = Client()

    try:
        component_model = client.get_stack_component(
            name_id_or_prefix=name_id_or_prefix,
            component_type=component_type,
        )
    except KeyError as err:
        cli_utils.error(str(err))

    requirements = component_model.flavor.connector_requirements

    if not requirements:
        cli_utils.error(
            f"The '{component_model.name}' {display_name} implementation "
            "does not support using a service connector to connect to "
            "resources."
        )

    resource_type = requirements.resource_type
    if requirements.resource_id_attr is not None:
        # Check if an attribute is set in the component configuration
        resource_id = component_model.configuration.get(
            requirements.resource_id_attr
        )

    if interactive:
        # Fetch the list of connectors that have resources compatible with
        # the stack component's flavor's resource requirements
        with console.status(
            "Finding all resources matching the stack component "
            "requirements (this could take a while)...\n"
        ):
            resource_list = client.list_service_connector_resources(
                connector_type=requirements.connector_type,
                resource_type=resource_type,
                resource_id=resource_id,
            )

        resource_list = [
            resource
            for resource in resource_list
            if resource.resources[0].resource_ids
        ]

        error_resource_list = [
            resource
            for resource in resource_list
            if not resource.resources[0].resource_ids
        ]

        if not resource_list:
            # No compatible resources were found
            additional_info = ""
            if error_resource_list:
                additional_info = (
                    f"{len(error_resource_list)} connectors can be used "
                    f"to gain access to {resource_type} resources required "
                    "for the stack component, but they are in an error "
                    "state or they didn't list any matching resources. "
                )
            command_args = ""
            if requirements.connector_type:
                command_args += (
                    f" --connector-type {requirements.connector_type}"
                )
            command_args += f" --resource-type {requirements.resource_type}"
            if resource_id:
                command_args += f" --resource-id {resource_id}"

            cli_utils.error(
                f"No compatible valid resources were found for the "
                f"'{component_model.name}' {display_name} in your "
                f"workspace. {additional_info}You can create a new "
                "connector using the 'zenml service-connector register' "
                "command or list the compatible resources using the "
                f"'zenml service-connector list-resources{command_args}' "
                "command."
            )

        # Prompt the user to select a connector and a resource ID, if
        # applicable
        connector_id, resource_id = prompt_select_resource(resource_list)
        no_verify = False
    else:
        # Non-interactive mode: we need to fetch the connector model first

        assert connector is not None
        try:
            connector_model = client.get_service_connector(connector)
        except KeyError as err:
            cli_utils.error(
                f"Could not find a connector '{connector}': {str(err)}"
            )

        connector_id = connector_model.id

        satisfied, msg = requirements.is_satisfied_by(
            connector_model, component_model
        )
        if not satisfied:
            cli_utils.error(
                f"The connector with ID {connector_id} does not match the "
                f"component's `{name_id_or_prefix}` of type `{component_type}`"
                f" connector requirements: {msg}. Please pick a connector that "
                f"is compatible with the component flavor and try again, or "
                f"use the interactive mode to select a compatible connector."
            )

        if not resource_id:
            if connector_model.resource_id:
                resource_id = connector_model.resource_id
            elif connector_model.supports_instances:
                cli_utils.error(
                    f"Multiple {resource_type} resources are available for "
                    "the selected connector. Please use the "
                    "`--resource-id` command line argument to configure a "
                    f"{resource_type} resource or use the interactive mode "
                    "to select a resource interactively."
                )

    connector_resources: Optional[ServiceConnectorResourcesModel] = None
    if not no_verify:
        with console.status(
            "Validating service connector resource configuration...\n"
        ):
            try:
                connector_resources = client.verify_service_connector(
                    connector_id,
                    resource_type=requirements.resource_type,
                    resource_id=resource_id,
                )
            except (
                KeyError,
                ValueError,
                IllegalOperationError,
                NotImplementedError,
                AuthorizationException,
            ) as e:
                cli_utils.error(
                    f"Access to the resource could not be verified: {e}"
                )
        resources = connector_resources.resources[0]
        if resources.resource_ids:
            if len(resources.resource_ids) > 1:
                cli_utils.error(
                    f"Multiple {resource_type} resources are available for "
                    "the selected connector. Please use the "
                    "`--resource-id` command line argument to configure a "
                    f"{resource_type} resource or use the interactive mode "
                    "to select a resource interactively."
                )
            else:
                resource_id = resources.resource_ids[0]

    with console.status(f"Updating {display_name} '{name_id_or_prefix}'...\n"):
        try:
            client.update_stack_component(
                name_id_or_prefix=name_id_or_prefix,
                component_type=component_type,
                connector_id=connector_id,
                connector_resource_id=resource_id,
            )
        except (KeyError, IllegalOperationError) as err:
            cli_utils.error(str(err))

    if connector_resources is not None:
        cli_utils.declare(
            f"Successfully connected {display_name} "
            f"`{component_model.name}` to the following resources:"
        )

        cli_utils.print_service_connector_resource_table([connector_resources])

    else:
        cli_utils.declare(
            f"Successfully connected {display_name} "
            f"`{component_model.name}` to resource."
        )
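
A quick usage sketch, assuming this helper backs the per-component connect commands (the component and connector names below are made up, and the flag spellings mirror the function parameters rather than guaranteed CLI options):

   # Interactively pick a compatible connector and resource for an artifact store
   zenml artifact-store connect my_store -i

   # Or attach a specific, already registered connector
   zenml artifact-store connect my_store --connector my_gcp_connector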

connect_to_pro_server(pro_server=None, api_key=None, refresh=False, pro_api_url=None)

Connect the client to a ZenML Pro server.

Parameters:

    pro_server (Optional[str], default None): The UUID, name or URL of the ZenML Pro server to connect to. If not provided, the web login flow will be initiated.
    api_key (Optional[str], default None): The API key to use to authenticate with the ZenML Pro server.
    refresh (bool, default False): Whether to force a new login flow with the ZenML Pro server.
    pro_api_url (Optional[str], default None): The URL for the ZenML Pro API.

Raises:

    ValueError: If incorrect parameters are provided.
    AuthorizationException: If the user does not have access to the ZenML Pro server.

Source code in src/zenml/cli/login.py
def connect_to_pro_server(
    pro_server: Optional[str] = None,
    api_key: Optional[str] = None,
    refresh: bool = False,
    pro_api_url: Optional[str] = None,
) -> None:
    """Connect the client to a ZenML Pro server.

    Args:
        pro_server: The UUID, name or URL of the ZenML Pro server to connect to.
            If not provided, the web login flow will be initiated.
        api_key: The API key to use to authenticate with the ZenML Pro server.
        refresh: Whether to force a new login flow with the ZenML Pro server.
        pro_api_url: The URL for the ZenML Pro API.

    Raises:
        ValueError: If incorrect parameters are provided.
        AuthorizationException: If the user does not have access to the ZenML
            Pro server.
    """
    from zenml.login.credentials_store import get_credentials_store
    from zenml.login.pro.client import ZenMLProClient
    from zenml.login.pro.tenant.models import TenantStatus

    pro_api_url = pro_api_url or ZENML_PRO_API_URL
    pro_api_url = pro_api_url.rstrip("/")

    server_id, server_url, server_name = None, None, None
    login = False
    if not pro_server:
        login = True
        if api_key:
            raise ValueError(
                "You must provide the URL of the ZenML Pro server when "
                "connecting with an API key."
            )

    elif not re.match(r"^https?://", pro_server):
        # The server argument is not a URL, so it must be a ZenML Pro server
        # name or UUID.
        try:
            server_id = UUID(pro_server)
        except ValueError:
            # The server argument is not a UUID, so it must be a ZenML Pro
            # server name.
            server_name = pro_server
    else:
        server_url = pro_server

    credentials_store = get_credentials_store()
    if not credentials_store.has_valid_pro_authentication(pro_api_url):
        # Without valid ZenML Pro credentials, we can only connect to a ZenML
        # Pro server with an API key and we also need to know the URL of the
        # server to connect to.
        if api_key:
            if server_url:
                connect_to_server(server_url, api_key=api_key, pro_server=True)
                return
            else:
                raise ValueError(
                    "You must provide the URL of the ZenML Pro server when "
                    "connecting with an API key."
                )
        else:
            login = True

    if login or refresh:
        try:
            token = web_login(
                pro_api_url=pro_api_url,
            )
        except AuthorizationException as e:
            cli_utils.error(f"Authorization error: {e}")

        cli_utils.declare(
            "You can now run 'zenml server list' to view the available ZenML "
            "Pro servers and then 'zenml login <server-url-name-or-id>' to "
            "connect to a specific server without having to log in again until "
            "your session expires."
        )

        tenant_id: Optional[str] = None
        if token.device_metadata:
            tenant_id = token.device_metadata.get("tenant_id")

        if tenant_id is None and pro_server is None:
            # This is not really supposed to happen, because the implementation
            # of the web login workflow should always return a tenant ID, but
            # we're handling it just in case.
            cli_utils.declare(
                "A valid server was not selected during the login process. "
                "Please run `zenml server list` to display a list of available "
                "servers and then `zenml login <server-url-name-or-id>` to "
                "connect to a server."
            )
            return

        # The server selected during the web login process overrides any
        # server argument passed to the command.
        server_id = UUID(tenant_id)

    client = ZenMLProClient(pro_api_url)

    if server_id:
        server = client.tenant.get(server_id)
    elif server_url:
        servers = client.tenant.list(url=server_url, member_only=True)
        if not servers:
            raise AuthorizationException(
                f"The '{server_url}' URL belongs to a ZenML Pro server, "
                "but it doesn't look like you have access to it. Please "
                "check the server URL and your permissions and try again."
            )

        server = servers[0]
    elif server_name:
        servers = client.tenant.list(tenant_name=server_name, member_only=True)
        if not servers:
            raise AuthorizationException(
                f"No ZenML Pro server with the name '{server_name}' exists "
                "or you don't have access to it. Please check the server name "
                "and your permissions and try again."
            )
        server = servers[0]
    else:
        raise ValueError(
            "No server ID, URL, or name was provided. Please provide one of "
            "these values to connect to a ZenML Pro server."
        )

    server_id = server.id

    if server.status == TenantStatus.PENDING:
        with console.status(
            f"Waiting for your `{server.name}` ZenML Pro server to be set up..."
        ):
            timeout = 180  # 3 minutes
            while True:
                time.sleep(5)
                server = client.tenant.get(server_id)
                if server.status != TenantStatus.PENDING:
                    break
                timeout -= 5
                if timeout <= 0:
                    cli_utils.error(
                        f"Your `{server.name}` ZenML Pro server is taking "
                        "longer than expected to set up. Please try again "
                        "later or manage the server state by visiting the "
                        f"ZenML Pro dashboard at {server.dashboard_url}."
                    )

    if server.status == TenantStatus.FAILED:
        cli_utils.error(
            f"Your `{server.name}` ZenML Pro server is currently in a "
            "failed state. Please manage the server state by visiting the "
            f"ZenML Pro dashboard at {server.dashboard_url}, or contact "
            "your server administrator."
        )

    elif server.status == TenantStatus.DEACTIVATED:
        cli_utils.error(
            f"Your `{server.name}` ZenML Pro server is currently "
            "deactivated. Please manage the server state by visiting the "
            f"ZenML Pro dashboard at {server.dashboard_url}, or contact "
            "your server administrator."
        )

    elif server.status == TenantStatus.AVAILABLE:
        if not server.url:
            cli_utils.error(
                f"The ZenML Pro server '{server.name}' is not currently "
                f"running. Visit the ZenML Pro dashboard to manage the server "
                f"status at: {server.dashboard_url}"
            )
    else:
        cli_utils.error(
            f"Your `{server.name}` ZenML Pro server is currently "
            "being deleted. Please select a different server or set up a "
            "new server by visiting the ZenML Pro dashboard at "
            f"{server.dashboard_organization_url}."
        )

    cli_utils.declare(
        f"Connecting to ZenML Pro server: {server.name} [{str(server.id)}] "
    )

    connect_to_server(server.url, api_key=api_key, pro_server=True)

    # Update the stored server info with more accurate data taken from the
    # ZenML Pro tenant object.
    credentials_store.update_server_info(server.url, server)

    cli_utils.declare(f"Connected to ZenML Pro server: {server.name}.")
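
As the messages emitted above suggest, this function sits behind the ZenML Pro login flow; a minimal sketch (the server name is hypothetical):

   # Start the ZenML Pro web login flow
   zenml login

   # List the Pro servers you have access to, then connect to one by name, URL or ID
   zenml server list
   zenml login my-pro-server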

connect_to_server(url, api_key=None, verify_ssl=True, refresh=False, pro_server=False)

Connect the client to a ZenML server or a SQL database.

Parameters:

    url (str, required): The URL of the ZenML server or the SQL database to connect to.
    api_key (Optional[str], default None): The API key to use to authenticate with the ZenML server.
    verify_ssl (Union[str, bool], default True): Whether to verify the server's TLS certificate. If a string is passed, it is interpreted as the path to a CA bundle file.
    refresh (bool, default False): Whether to force a new login flow with the ZenML server.
    pro_server (bool, default False): Whether the server is a ZenML Pro server.

Source code in src/zenml/cli/login.py
def connect_to_server(
    url: str,
    api_key: Optional[str] = None,
    verify_ssl: Union[str, bool] = True,
    refresh: bool = False,
    pro_server: bool = False,
) -> None:
    """Connect the client to a ZenML server or a SQL database.

    Args:
        url: The URL of the ZenML server or the SQL database to connect to.
        api_key: The API key to use to authenticate with the ZenML server.
        verify_ssl: Whether to verify the server's TLS certificate. If a string
            is passed, it is interpreted as the path to a CA bundle file.
        refresh: Whether to force a new login flow with the ZenML server.
        pro_server: Whether the server is a ZenML Pro server.
    """
    from zenml.login.credentials_store import get_credentials_store
    from zenml.zen_stores.base_zen_store import BaseZenStore

    url = url.rstrip("/")

    store_type = BaseZenStore.get_store_type(url)
    if store_type == StoreType.REST:
        from zenml.zen_stores.rest_zen_store import RestZenStoreConfiguration

        credentials_store = get_credentials_store()
        if api_key:
            cli_utils.declare(
                f"Authenticating to ZenML server '{url}' using an API key..."
            )
            credentials_store.set_api_key(url, api_key)
        elif pro_server:
            # We don't have to do anything here assuming the user has already
            # logged in to the ZenML Pro server using the ZenML Pro web login
            # flow.
            cli_utils.declare(f"Authenticating to ZenML server '{url}'...")
        else:
            if refresh or not credentials_store.has_valid_authentication(url):
                cli_utils.declare(
                    f"Authenticating to ZenML server '{url}' using the web "
                    "login..."
                )
                web_login(url=url, verify_ssl=verify_ssl)
            else:
                cli_utils.declare(f"Connecting to ZenML server '{url}'...")

        rest_store_config = RestZenStoreConfiguration(
            url=url,
            verify_ssl=verify_ssl,
        )
        try:
            GlobalConfiguration().set_store(rest_store_config)
        except IllegalOperationError:
            cli_utils.error(
                f"You do not have sufficient permissions to "
                f"access the server at '{url}'."
            )
        except CredentialsNotValid as e:
            cli_utils.error(f"Authorization error: {e}")

    else:
        from zenml.zen_stores.sql_zen_store import SqlZenStoreConfiguration

        # Connect to a SQL database
        sql_store_config = SqlZenStoreConfiguration(
            url=url,
        )
        cli_utils.declare(f"Connecting to SQL database '{url}'...")

        try:
            GlobalConfiguration().set_store(sql_store_config)
        except IllegalOperationError:
            cli_utils.warning(
                f"You do not have sufficient permissions to "
                f"access the SQL database at '{url}'."
            )
        except CredentialsNotValid as e:
            cli_utils.warning(f"Authorization error: {e}")

        cli_utils.declare(f"Connected to SQL database '{url}'")
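
Both the Pro flow above and logging in to a self-hosted server URL end up in this function; a hedged sketch (the URL is hypothetical and the exact CLI options may differ from what is shown here):

   # Connect to a ZenML server using the web login flow
   zenml login https://zenml.example.com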

connected_to_local_server()

Check if the client is connected to a local server.

Returns:

    bool: True if the client is connected to a local server, False otherwise.

Source code in src/zenml/zen_server/utils.py
def connected_to_local_server() -> bool:
    """Check if the client is connected to a local server.

    Returns:
        True if the client is connected to a local server, False otherwise.
    """
    from zenml.zen_server.deploy.deployer import LocalServerDeployer

    deployer = LocalServerDeployer()
    return deployer.is_connected_to_server()

convert_structured_str_to_dict(string)

Convert a structured string (JSON or YAML) into a dict.

Examples:

>>> convert_structured_str_to_dict('{"location": "Nevada", "aliens":"many"}')
{'location': 'Nevada', 'aliens': 'many'}
>>> convert_structured_str_to_dict('location: Nevada \naliens: many')
{'location': 'Nevada', 'aliens': 'many'}
>>> convert_structured_str_to_dict("{'location': 'Nevada', 'aliens': 'many'}")
{'location': 'Nevada', 'aliens': 'many'}

Parameters:

    string (str, required): JSON or YAML string value

Returns:

    dict_ (Dict[str, str]): dict from structured JSON or YAML str

Source code in src/zenml/cli/utils.py
def convert_structured_str_to_dict(string: str) -> Dict[str, str]:
    """Convert a structured string (JSON or YAML) into a dict.

    Examples:
        >>> convert_structured_str_to_dict('{"location": "Nevada", "aliens":"many"}')
        {'location': 'Nevada', 'aliens': 'many'}
        >>> convert_structured_str_to_dict('location: Nevada \\naliens: many')
        {'location': 'Nevada', 'aliens': 'many'}
        >>> convert_structured_str_to_dict("{'location': 'Nevada', 'aliens': 'many'}")
        {'location': 'Nevada', 'aliens': 'many'}

    Args:
        string: JSON or YAML string value

    Returns:
        dict_: dict from structured JSON or YAML str
    """
    try:
        dict_: Dict[str, str] = json.loads(string)
        return dict_
    except ValueError:
        pass

    try:
        # Here, Dict type in str is implicitly supported by yaml.safe_load()
        dict_ = yaml.safe_load(string)
        return dict_
    except yaml.YAMLError:
        pass

    error(
        f"Invalid argument: '{string}'. Please provide the value in JSON or YAML format."
    )

copy_dir(source_dir, destination_dir, overwrite=False)

Copies dir from source to destination.

Parameters:

    source_dir (str, required): Path to copy from.
    destination_dir (str, required): Path to copy to.
    overwrite (bool, default False): If False, the function throws an error instead of overwriting existing files.

Source code in src/zenml/utils/io_utils.py
def copy_dir(
    source_dir: str, destination_dir: str, overwrite: bool = False
) -> None:
    """Copies dir from source to destination.

    Args:
        source_dir: Path to copy from.
        destination_dir: Path to copy to.
        overwrite: Boolean. If false, function throws an error before overwrite.
    """
    for source_file in listdir(source_dir):
        source_path = os.path.join(source_dir, convert_to_str(source_file))
        destination_path = os.path.join(
            destination_dir, convert_to_str(source_file)
        )
        if isdir(source_path):
            if source_path == destination_dir:
                # if the destination is a subdirectory of the source, we skip
                # copying it to avoid an infinite loop.
                continue
            copy_dir(source_path, destination_path, overwrite)
        else:
            create_dir_recursive_if_not_exists(
                os.path.dirname(destination_path)
            )
            copy(str(source_path), str(destination_path), overwrite)

copy_stack(source_stack_name_or_id, target_stack)

Copy a stack.

Parameters:

    source_stack_name_or_id (str, required): The name or id of the stack to copy.
    target_stack (str, required): Name of the copied stack.

Source code in src/zenml/cli/stack.py
@stack.command("copy", help="Copy a stack to a new stack name.")
@click.argument("source_stack_name_or_id", type=str, required=True)
@click.argument("target_stack", type=str, required=True)
def copy_stack(source_stack_name_or_id: str, target_stack: str) -> None:
    """Copy a stack.

    Args:
        source_stack_name_or_id: The name or id of the stack to copy.
        target_stack: Name of the copied stack.
    """
    client = Client()

    with console.status(f"Copying stack `{source_stack_name_or_id}`...\n"):
        try:
            stack_to_copy = client.get_stack(
                name_id_or_prefix=source_stack_name_or_id
            )
        except KeyError as err:
            cli_utils.error(str(err))

        component_mapping: Dict[StackComponentType, Union[str, UUID]] = {}

        for c_type, c_list in stack_to_copy.components.items():
            if c_list:
                component_mapping[c_type] = c_list[0].id

        copied_stack = client.create_stack(
            name=target_stack,
            components=component_mapping,
        )

    print_model_url(get_stack_url(copied_stack))
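
For example (both stack names are hypothetical):

   zenml stack copy default my_copied_stack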

create_api_key(service_account_name_or_id, name, description, set_key=False, output_file=None)

Create an API key.

Parameters:

    service_account_name_or_id (str, required): The name or ID of the service account to which the API key should be added.
    name (str, required): Name of the API key.
    description (Optional[str], required): The API key description.
    set_key (bool, default False): Configure the local client with the generated key.
    output_file (Optional[str], default None): Output file to write the API key to.

Source code in src/zenml/cli/service_accounts.py
@api_key.command(
    "create",
    help="Create an API key and print its value.",
)
@click.argument("name", type=click.STRING)
@click.option(
    "--description",
    "-d",
    type=str,
    required=False,
    help="The API key description.",
)
@click.option(
    "--set-key",
    is_flag=True,
    help="Configure the local client with the generated key.",
)
@click.option(
    "--output-file",
    type=str,
    required=False,
    help="File to write the API key to.",
)
@click.pass_obj
def create_api_key(
    service_account_name_or_id: str,
    name: str,
    description: Optional[str],
    set_key: bool = False,
    output_file: Optional[str] = None,
) -> None:
    """Create an API key.

    Args:
        service_account_name_or_id: The name or ID of the service account to
            which the API key should be added.
        name: Name of the API key
        description: The API key description.
        set_key: Configure the local client with the generated key.
        output_file: Output file to write the API key to.
    """
    _create_api_key(
        service_account_name_or_id=service_account_name_or_id,
        name=name,
        description=description,
        set_key=set_key,
        output_file=output_file,
    )
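
A usage sketch, assuming the api-key command group is reached through `zenml service-account` (the account and key names are hypothetical):

   # Create an API key for the "ci-bot" service account and configure the local client with it
   zenml service-account api-key ci-bot create ci-key --description "Key for CI" --set-key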

create_run_template(source, name, config_path=None, stack_name_or_id=None)

Create a run template for a pipeline.

Parameters:

    source (str, required): Importable source resolving to a pipeline instance.
    name (str, required): Name of the run template.
    config_path (Optional[str], default None): Path to pipeline configuration file.
    stack_name_or_id (Optional[str], default None): Name or ID of the stack for which the template should be created.

Source code in src/zenml/cli/pipeline.py
@pipeline.command(
    "create-run-template",
    help="Create a run template for a pipeline. The SOURCE argument needs to "
    "be an importable source path resolving to a ZenML pipeline instance, e.g. "
    "`my_module.my_pipeline_instance`.",
)
@click.argument("source")
@click.option(
    "--name",
    "-n",
    type=str,
    required=True,
    help="Name for the template",
)
@click.option(
    "--config",
    "-c",
    "config_path",
    type=click.Path(exists=True, dir_okay=False),
    required=False,
    help="Path to configuration file for the build.",
)
@click.option(
    "--stack",
    "-s",
    "stack_name_or_id",
    type=str,
    required=False,
    help="Name or ID of the stack to use for the build.",
)
def create_run_template(
    source: str,
    name: str,
    config_path: Optional[str] = None,
    stack_name_or_id: Optional[str] = None,
) -> None:
    """Create a run template for a pipeline.

    Args:
        source: Importable source resolving to a pipeline instance.
        name: Name of the run template.
        config_path: Path to pipeline configuration file.
        stack_name_or_id: Name or ID of the stack for which the template should
            be created.
    """
    if not Client().root:
        cli_utils.warning(
            "You're running the `zenml pipeline create-run-template` command "
            "without a ZenML repository. Your current working directory will "
            "be used as the source root relative to which the registered step "
            "classes will be resolved. To silence this warning, run `zenml "
            "init` at your source code root."
        )

    with cli_utils.temporary_active_stack(stack_name_or_id=stack_name_or_id):
        pipeline_instance = _import_pipeline(source=source)

        pipeline_instance = pipeline_instance.with_options(
            config_path=config_path
        )
        template = pipeline_instance.create_run_template(name=name)

    cli_utils.declare(f"Created run template `{template.id}`.")
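
For example, using the options registered above (the template, config file and stack names are hypothetical):

   zenml pipeline create-run-template my_module.my_pipeline_instance \
       --name=my_template --config=config.yaml --stack=my_stack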

create_secret(name, scope, interactive, values, args)

Create a secret.

Parameters:

    name (str, required): The name of the secret to create.
    scope (str, required): The scope of the secret to create.
    interactive (bool, required): Whether to use interactive mode to enter the secret values.
    values (str, required): Secret key-value pairs to be passed as JSON or YAML.
    args (List[str], required): The arguments to pass to the secret.

Source code in src/zenml/cli/secret.py
@secret.command(
    "create",
    context_settings={"ignore_unknown_options": True},
    help="Create a new secret.",
)
@click.argument("name", type=click.STRING)
@click.option(
    "--scope",
    "-s",
    "scope",
    type=click.Choice([scope.value for scope in list(SecretScope)]),
    default=SecretScope.WORKSPACE.value,
)
@click.option(
    "--interactive",
    "-i",
    "interactive",
    is_flag=True,
    help="Use interactive mode to enter the secret values.",
    type=click.BOOL,
)
@click.option(
    "--values",
    "-v",
    "values",
    help="Pass one or more values using JSON or YAML format or reference a file by prefixing the filename with the @ "
    "special character.",
    required=False,
    type=str,
)
@click.argument("args", nargs=-1, type=click.UNPROCESSED)
def create_secret(
    name: str, scope: str, interactive: bool, values: str, args: List[str]
) -> None:
    """Create a secret.

    Args:
        name: The name of the secret to create.
        scope: The scope of the secret to create.
        interactive: Whether to use interactive mode to enter the secret values.
        values: Secret key-value pairs to be passed as JSON or YAML.
        args: The arguments to pass to the secret.
    """
    name, parsed_args = parse_name_and_extra_arguments(  # type: ignore[assignment]
        list(args) + [name], expand_args=True
    )
    if values:
        inline_values = expand_argument_value_from_file(SECRET_VALUES, values)
        inline_values_dict = convert_structured_str_to_dict(inline_values)
        parsed_args.update(inline_values_dict)

    if "name" in parsed_args:
        error("You can't use 'name' as the key for one of your secrets.")
    elif name == "name":
        error("Secret names cannot be named 'name'.")

    try:
        client = Client()
        if interactive:
            if parsed_args:
                error(
                    "Cannot pass secret fields as arguments when using "
                    "interactive mode."
                )
            else:
                click.echo("Entering interactive mode:")
                while True:
                    k = click.prompt("Please enter a secret key")
                    if k in parsed_args:
                        warning(
                            f"Key {k} already in this secret. Please restart "
                            f"this process or use 'zenml "
                            f"secret update {name} --values=<JSON/YAML> or --{k}=...' to update this "
                            f"key after the secret is registered. Skipping ..."
                        )
                    else:
                        v = getpass.getpass(
                            f"Please enter the secret value for the key [{k}]:"
                        )
                        parsed_args[k] = v

                    if not confirmation(
                        "Do you want to add another key-value pair to this "
                        "secret?"
                    ):
                        break
        elif not parsed_args:
            error(
                "Secret fields must be passed as arguments when not using "
                "interactive mode."
            )

        for key in parsed_args:
            validate_keys(key)
        declare("The following secret will be registered.")
        pretty_print_secret(secret=parsed_args, hide_secret=True)

        with console.status(f"Saving secret `{name}`..."):
            try:
                client.create_secret(
                    name=name, values=parsed_args, scope=SecretScope(scope)
                )
                declare(f"Secret '{name}' successfully created.")
            except EntityExistsError as e:
                # should never hit this on account of the check above
                error(f"Secret with name already exists. {str(e)}")
    except NotImplementedError as e:
        error(f"Centralized secrets management is disabled: {str(e)}")
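
A few usage sketches based on the options registered above (the secret name and values are hypothetical):

   # Pass key-value pairs directly as arguments ...
   zenml secret create my_secret --username=admin --password=abc123

   # ... or as a JSON/YAML payload (prefix a filename with '@' to read it from a file) ...
   zenml secret create my_secret --values='{"username": "admin", "password": "abc123"}'

   # ... or enter the values interactively
   zenml secret create my_secret -i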

create_service_account(service_account_name, description='', create_api_key=True, set_api_key=False, output_file=None)

Create a new service account.

Parameters:

    service_account_name (str, required): The name of the service account to create.
    description (str, default ''): The API key description.
    create_api_key (bool, default True): Create an API key for the service account.
    set_api_key (bool, default False): Configure the local client to use the generated API key.
    output_file (Optional[str], default None): Output file to write the API key to.

Source code in src/zenml/cli/service_accounts.py
@service_account.command(
    "create", help="Create a new service account and optional API key."
)
@click.argument("service_account_name", type=str, required=True)
@click.option(
    "--description",
    "-d",
    type=str,
    required=False,
    default="",
    help="The API key description.",
)
@click.option(
    "--create-api-key",
    help=("Create an API key for the service account."),
    type=bool,
    default=True,
)
@click.option(
    "--set-api-key",
    help=("Configure the local client to use the generated API key."),
    is_flag=True,
)
@click.option(
    "--output-file",
    type=str,
    required=False,
    help="File to write the API key to.",
)
def create_service_account(
    service_account_name: str,
    description: str = "",
    create_api_key: bool = True,
    set_api_key: bool = False,
    output_file: Optional[str] = None,
) -> None:
    """Create a new service account.

    Args:
        service_account_name: The name of the service account to create.
        description: The API key description.
        create_api_key: Create an API key for the service account.
        set_api_key: Configure the local client to use the generated API key.
        output_file: Output file to write the API key to.
    """
    client = Client()
    try:
        service_account = client.create_service_account(
            name=service_account_name,
            description=description,
        )

        cli_utils.declare(f"Created service account '{service_account.name}'.")
    except EntityExistsError as err:
        cli_utils.error(str(err))

    if create_api_key:
        _create_api_key(
            service_account_name_or_id=service_account.name,
            name="default",
            description="Default API key.",
            set_key=set_api_key,
            output_file=output_file,
        )
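
For example (the account name and description are hypothetical):

   # Create a service account, generate a default API key and use it for the local client
   zenml service-account create ci-bot --description "CI automation" --set-api-key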

create_user(user_name, password=None, is_admin=False)

Create a new user.

Parameters:

    user_name (str, required): The name of the user to create.
    password (Optional[str], default None): The password of the user to create.
    is_admin (bool, default False): Whether the user should be an admin.

Source code in src/zenml/cli/user_management.py
@user.command(
    "create",
    help="Create a new user. If an empty password is configured, an activation "
    "token is generated and a link to the dashboard is provided where the "
    "user can activate the account.",
)
@click.argument("user_name", type=str, required=True)
@click.option(
    "--password",
    help=(
        "The user password. If omitted, a prompt will be shown to enter the "
        "password. If an empty password is entered, an activation token is "
        "generated and a link to the dashboard is provided where the user can "
        "activate the account."
    ),
    required=False,
    type=str,
)
@click.option(
    "--is_admin",
    is_flag=True,
    help=(
        "Whether the user should be an admin. If not specified, the user will "
        "be a regular user."
    ),
    required=False,
    default=False,
)
def create_user(
    user_name: str,
    password: Optional[str] = None,
    is_admin: bool = False,
) -> None:
    """Create a new user.

    Args:
        user_name: The name of the user to create.
        password: The password of the user to create.
        is_admin: Whether the user should be an admin.
    """
    client = Client()
    if not password:
        if client.zen_store.type != StoreType.REST:
            password = click.prompt(
                f"Password for user {user_name}",
                hide_input=True,
            )
        else:
            password = click.prompt(
                f"Password for user {user_name}. Leave empty to generate an "
                f"activation token",
                default="",
                hide_input=True,
            )
    else:
        cli_utils.warning(
            "Supplying password values in the command line is not safe. "
            "Please consider using the prompt option."
        )

    try:
        new_user = client.create_user(
            name=user_name, password=password, is_admin=is_admin
        )

        cli_utils.declare(f"Created user '{new_user.name}'.")
    except EntityExistsError as err:
        cli_utils.error(str(err))
    else:
        if not new_user.active and new_user.activation_token is not None:
            user_info = f"?user={str(new_user.id)}&username={new_user.name}&token={new_user.activation_token}"
            cli_utils.declare(
                f"The created user account is currently inactive. You can "
                f"activate it by visiting the dashboard at the following URL:\n"
                # TODO: keep only `activate-user` once legacy dashboard is gone
                f"{client.zen_store.url}/activate-user{user_info}\n\n"
                "If you are using Legacy dashboard visit the following URL:\n"
                f"{client.zen_store.url}/signup{user_info}\n"
            )
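
For example (the user name is hypothetical):

   # Prompt for a password; on a ZenML server, leaving it empty generates an activation token
   zenml user create alice

   # Create an admin user
   zenml user create alice --is_admin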

deactivate_user(user_name_or_id)

Reset the password of a user.

Parameters:

    user_name_or_id (str, required): The name or ID of the user to reset the password for.

Source code in src/zenml/cli/user_management.py
@user.command(
    "deactivate",
    help="Generate an activation token to reset the password for a user account",
)
@click.argument("user_name_or_id", type=str, required=True)
def deactivate_user(
    user_name_or_id: str,
) -> None:
    """Reset the password of a user.

    Args:
        user_name_or_id: The name or ID of the user to reset the password for.
    """
    client = Client()

    store = GlobalConfiguration().store_configuration
    if store.type != StoreType.REST:
        cli_utils.error(
            "Deactivating users is only supported when connected to a ZenML "
            "server."
        )

    try:
        if not client.active_user.is_admin:
            cli_utils.error(
                "Only admins can reset the password of other users."
            )

        user = client.deactivate_user(
            name_id_or_prefix=user_name_or_id,
        )
    except (KeyError, IllegalOperationError) as err:
        cli_utils.error(str(err))

    user_info = f"?user={str(user.id)}&username={user.name}&token={user.activation_token}"
    cli_utils.declare(
        f"Successfully deactivated user account '{user.name}'. "
        f"To reactivate the account, please visit the dashboard at the "
        "following URL:\n"
        # TODO: keep only `activate-user` once legacy dashboard is gone
        f"{client.zen_store.url}/activate-user{user_info}\n\n"
        "If you are using Legacy dashboard visit the following URL:\n"
        f"{client.zen_store.url}/signup{user_info}\n"
    )
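
For example (the user name is hypothetical):

   zenml user deactivate alice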

declare(text, bold=None, italic=None, **kwargs)

Echo a declaration on the CLI.

Parameters:

    text (Union[str, Text], required): Input text string.
    bold (Optional[bool], default None): Optional boolean to bold the text.
    italic (Optional[bool], default None): Optional boolean to italicize the text.
    **kwargs (Any, default {}): Optional kwargs to be passed to console.print().

Source code in src/zenml/cli/utils.py
def declare(
    text: Union[str, "Text"],
    bold: Optional[bool] = None,
    italic: Optional[bool] = None,
    **kwargs: Any,
) -> None:
    """Echo a declaration on the CLI.

    Args:
        text: Input text string.
        bold: Optional boolean to bold the text.
        italic: Optional boolean to italicize the text.
        **kwargs: Optional kwargs to be passed to console.print().
    """
    base_style = zenml_style_defaults["info"]
    style = Style.chain(base_style, Style(bold=bold, italic=italic))
    console.print(text, style=style, **kwargs)

delete_api_key(service_account_name_or_id, name_or_id, yes=False)

Delete an API key.

Parameters:

    service_account_name_or_id (str, required): The name or ID of the service account to which the API key belongs.
    name_or_id (str, required): The name or ID of the API key to delete.
    yes (bool, default False): If set, don't ask for confirmation.

Source code in src/zenml/cli/service_accounts.py
@api_key.command("delete")
@click.argument("name_or_id", type=str, required=True)
@click.option(
    "--yes",
    "-y",
    is_flag=True,
    help="Don't ask for confirmation.",
)
@click.pass_obj
def delete_api_key(
    service_account_name_or_id: str, name_or_id: str, yes: bool = False
) -> None:
    """Delete an API key.

    Args:
        service_account_name_or_id: The name or ID of the service account to
            which the API key belongs.
        name_or_id: The name or ID of the API key to delete.
        yes: If set, don't ask for confirmation.
    """
    if not yes:
        confirmation = cli_utils.confirmation(
            f"Are you sure you want to delete API key `{name_or_id}`?"
        )
        if not confirmation:
            cli_utils.declare("API key deletion canceled.")
            return

    try:
        Client().delete_api_key(
            service_account_name_id_or_prefix=service_account_name_or_id,
            name_id_or_prefix=name_or_id,
        )
    except KeyError as e:
        cli_utils.error(str(e))
    else:
        cli_utils.declare(f"Deleted API key `{name_or_id}`.")

delete_authorized_device(id, yes=False)

Delete an authorized device.

Parameters:

    id (str, required): The ID of the authorized device to delete.
    yes (bool, default False): If set, don't ask for confirmation.

Source code in src/zenml/cli/authorized_device.py
@authorized_device.command("delete")
@click.argument("id", type=str, required=True)
@click.option(
    "--yes",
    "-y",
    is_flag=True,
    help="Don't ask for confirmation.",
)
def delete_authorized_device(id: str, yes: bool = False) -> None:
    """Delete an authorized device.

    Args:
        id: The ID of the authorized device to delete.
        yes: If set, don't ask for confirmation.
    """
    if not yes:
        confirmation = cli_utils.confirmation(
            f"Are you sure you want to delete authorized device `{id}`?"
        )
        if not confirmation:
            cli_utils.declare("Authorized device deletion canceled.")
            return

    try:
        Client().delete_authorized_device(id_or_prefix=id)
    except KeyError as e:
        cli_utils.error(str(e))
    else:
        cli_utils.declare(f"Deleted authorized device `{id}`.")

delete_code_repository(name_or_id, yes=False)

Delete a code repository.

Parameters:

    name_or_id (str, required): The name or ID of the code repository to delete.
    yes (bool, default False): If set, don't ask for confirmation.

Source code in src/zenml/cli/code_repository.py
@code_repository.command("delete")
@click.argument("name_or_id", type=str, required=True)
@click.option(
    "--yes",
    "-y",
    is_flag=True,
    help="Don't ask for confirmation.",
)
def delete_code_repository(name_or_id: str, yes: bool = False) -> None:
    """Delete a code repository.

    Args:
        name_or_id: The name or ID of the code repository to delete.
        yes: If set, don't ask for confirmation.
    """
    if not yes:
        confirmation = cli_utils.confirmation(
            f"Are you sure you want to delete code repository `{name_or_id}`?"
        )
        if not confirmation:
            cli_utils.declare("Code repository deletion canceled.")
            return

    try:
        Client().delete_code_repository(name_id_or_prefix=name_or_id)
    except KeyError as e:
        cli_utils.error(str(e))
    else:
        cli_utils.declare(f"Deleted code repository `{name_or_id}`.")

delete_model(model_name_or_id, yes=False)

Delete an existing model from the Model Control Plane.

Parameters:

    model_name_or_id (str, required): The ID or name of the model to delete.
    yes (bool, default False): If set, don't ask for confirmation.

Source code in src/zenml/cli/model.py
@model.command("delete", help="Delete an existing model.")
@click.argument("model_name_or_id")
@click.option(
    "--yes",
    "-y",
    is_flag=True,
    help="Don't ask for confirmation.",
)
def delete_model(
    model_name_or_id: str,
    yes: bool = False,
) -> None:
    """Delete an existing model from the Model Control Plane.

    Args:
        model_name_or_id: The ID or name of the model to delete.
        yes: If set, don't ask for confirmation.
    """
    if not yes:
        confirmation = cli_utils.confirmation(
            f"Are you sure you want to delete model '{model_name_or_id}'?"
        )
        if not confirmation:
            cli_utils.declare("Model deletion canceled.")
            return

    try:
        Client().delete_model(
            model_name_or_id=model_name_or_id,
        )
    except (KeyError, ValueError) as e:
        cli_utils.error(str(e))
    else:
        cli_utils.declare(f"Model '{model_name_or_id}' deleted.")

delete_model_version(model_name_or_id, model_version_name_or_number_or_id, yes=False)

Delete an existing model version in the Model Control Plane.

Parameters:

    model_name_or_id (str, required): The ID or name of the model that contains the version.
    model_version_name_or_number_or_id (str, required): The ID, number or name of the model version.
    yes (bool, default False): If set, don't ask for confirmation.

Source code in src/zenml/cli/model.py
@version.command("delete", help="Delete an existing model version.")
@click.argument("model_name_or_id")
@click.argument("model_version_name_or_number_or_id")
@click.option(
    "--yes",
    "-y",
    is_flag=True,
    help="Don't ask for confirmation.",
)
def delete_model_version(
    model_name_or_id: str,
    model_version_name_or_number_or_id: str,
    yes: bool = False,
) -> None:
    """Delete an existing model version in the Model Control Plane.

    Args:
        model_name_or_id: The ID or name of the model that contains the version.
        model_version_name_or_number_or_id: The ID, number or name of the model version.
        yes: If set, don't ask for confirmation.
    """
    if not yes:
        confirmation = cli_utils.confirmation(
            f"Are you sure you want to delete model version '{model_version_name_or_number_or_id}' from model '{model_name_or_id}'?"
        )
        if not confirmation:
            cli_utils.declare("Model version deletion canceled.")
            return

    try:
        model_version = Client().get_model_version(
            model_name_or_id=model_name_or_id,
            model_version_name_or_number_or_id=model_version_name_or_number_or_id,
        )
        Client().delete_model_version(
            model_version_id=model_version.id,
        )
    except (KeyError, ValueError) as e:
        cli_utils.error(str(e))
    else:
        cli_utils.declare(
            f"Model version '{model_version_name_or_number_or_id}' deleted from model '{model_name_or_id}'."
        )

delete_pipeline(pipeline_name_or_id, yes=False)

Delete a pipeline.

Parameters:

    pipeline_name_or_id (str, required): The name or ID of the pipeline to delete.
    yes (bool, default False): If set, don't ask for confirmation.

Source code in src/zenml/cli/pipeline.py
@pipeline.command("delete")
@click.argument("pipeline_name_or_id", type=str, required=True)
@click.option(
    "--yes",
    "-y",
    is_flag=True,
    help="Don't ask for confirmation.",
)
def delete_pipeline(
    pipeline_name_or_id: str,
    yes: bool = False,
) -> None:
    """Delete a pipeline.

    Args:
        pipeline_name_or_id: The name or ID of the pipeline to delete.
        yes: If set, don't ask for confirmation.
    """
    if not yes:
        confirmation = cli_utils.confirmation(
            f"Are you sure you want to delete pipeline "
            f"`{pipeline_name_or_id}`? This will change all "
            "existing runs of this pipeline to become unlisted."
        )
        if not confirmation:
            cli_utils.declare("Pipeline deletion canceled.")
            return

    try:
        Client().delete_pipeline(
            name_id_or_prefix=pipeline_name_or_id,
        )
    except KeyError as e:
        cli_utils.error(str(e))
    else:
        cli_utils.declare(f"Deleted pipeline `{pipeline_name_or_id}`.")
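
For example (the pipeline name is hypothetical):

   # Skip the confirmation prompt; existing runs of the pipeline become unlisted
   zenml pipeline delete my_pipeline -y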

delete_pipeline_build(build_id, yes=False)

Delete a pipeline build.

Parameters:

    build_id (str, required): The ID of the pipeline build to delete.
    yes (bool, default False): If set, don't ask for confirmation.

Source code in src/zenml/cli/pipeline.py
@builds.command("delete")
@click.argument("build_id", type=str, required=True)
@click.option(
    "--yes",
    "-y",
    is_flag=True,
    help="Don't ask for confirmation.",
)
def delete_pipeline_build(
    build_id: str,
    yes: bool = False,
) -> None:
    """Delete a pipeline build.

    Args:
        build_id: The ID of the pipeline build to delete.
        yes: If set, don't ask for confirmation.
    """
    if not yes:
        confirmation = cli_utils.confirmation(
            f"Are you sure you want to delete pipeline build `{build_id}`?"
        )
        if not confirmation:
            cli_utils.declare("Pipeline build deletion canceled.")
            return

    try:
        Client().delete_build(build_id)
    except KeyError as e:
        cli_utils.error(str(e))
    else:
        cli_utils.declare(f"Deleted pipeline build '{build_id}'.")

delete_pipeline_run(run_name_or_id, yes=False)

Delete a pipeline run.

Parameters:

    run_name_or_id (str, required): The name or ID of the pipeline run to delete.
    yes (bool, default False): If set, don't ask for confirmation.

Source code in src/zenml/cli/pipeline.py
@runs.command("delete")
@click.argument("run_name_or_id", type=str, required=True)
@click.option(
    "--yes",
    "-y",
    is_flag=True,
    help="Don't ask for confirmation.",
)
def delete_pipeline_run(
    run_name_or_id: str,
    yes: bool = False,
) -> None:
    """Delete a pipeline run.

    Args:
        run_name_or_id: The name or ID of the pipeline run to delete.
        yes: If set, don't ask for confirmation.
    """
    # Ask for confirmation to delete run.
    if not yes:
        confirmation = cli_utils.confirmation(
            f"Are you sure you want to delete pipeline run `{run_name_or_id}`?"
        )
        if not confirmation:
            cli_utils.declare("Pipeline run deletion canceled.")
            return

    # Delete run.
    try:
        Client().delete_pipeline_run(
            name_id_or_prefix=run_name_or_id,
        )
    except KeyError as e:
        cli_utils.error(str(e))
    else:
        cli_utils.declare(f"Deleted pipeline run '{run_name_or_id}'.")

delete_schedule(schedule_name_or_id, yes=False)

Delete a pipeline schedule.

Parameters:

    schedule_name_or_id (str, required): The name or ID of the schedule to delete.
    yes (bool, default False): If set, don't ask for confirmation.

Source code in src/zenml/cli/pipeline.py
@schedule.command("delete", help="Delete a pipeline schedule.")
@click.argument("schedule_name_or_id", type=str, required=True)
@click.option(
    "--yes",
    "-y",
    is_flag=True,
    help="Don't ask for confirmation.",
)
def delete_schedule(schedule_name_or_id: str, yes: bool = False) -> None:
    """Delete a pipeline schedule.

    Args:
        schedule_name_or_id: The name or ID of the schedule to delete.
        yes: If set, don't ask for confirmation.
    """
    if not yes:
        confirmation = cli_utils.confirmation(
            f"Are you sure you want to delete schedule "
            f"`{schedule_name_or_id}`?"
        )
        if not confirmation:
            cli_utils.declare("Schedule deletion canceled.")
            return

    try:
        Client().delete_schedule(name_id_or_prefix=schedule_name_or_id)
    except KeyError as e:
        cli_utils.error(str(e))
    else:
        cli_utils.declare(f"Deleted schedule '{schedule_name_or_id}'.")

delete_secret(name_or_id, yes=False)

Delete a secret for a given name or id.

Parameters:

- name_or_id (str): The name or id of the secret to delete. [required]
- yes (bool): Skip asking for confirmation. [default: False]

Source code in src/zenml/cli/secret.py (lines 445-486):
@secret.command("delete", help="Delete a secret with a given name or id.")
@click.argument(
    "name_or_id",
    type=click.STRING,
)
@click.option(
    "--yes",
    "-y",
    type=click.BOOL,
    default=False,
    is_flag=True,
    help="Skip asking for confirmation.",
)
def delete_secret(name_or_id: str, yes: bool = False) -> None:
    """Delete a secret for a given name or id.

    Args:
        name_or_id: The name or id of the secret to delete.
        yes: Skip asking for confirmation.
    """
    if not yes:
        confirmation_response = confirmation(
            f"This will delete all data associated with the `{name_or_id}` "
            f"secret. Are you sure you want to proceed?"
        )
        if not confirmation_response:
            console.print("Aborting secret deletion...")
            return

    client = Client()

    with console.status(f"Deleting secret `{name_or_id}`..."):
        try:
            client.delete_secret(name_id_or_prefix=name_or_id)
            declare(f"Secret '{name_or_id}' successfully deleted.")
        except KeyError as e:
            error(
                f"Secret with name or id `{name_or_id}` does not exist or "
                f"could not be loaded: {str(e)}."
            )
        except NotImplementedError as e:
            error(f"Centralized secrets management is disabled: {str(e)}")

delete_service_account(service_account_name_or_id)

Delete a service account.

Parameters:

- service_account_name_or_id (str): The name or ID of the service account. [required]

Source code in src/zenml/cli/service_accounts.py (lines 267-283):
@service_account.command("delete")
@click.argument("service_account_name_or_id", type=str, required=True)
def delete_service_account(service_account_name_or_id: str) -> None:
    """Delete a service account.

    Args:
        service_account_name_or_id: The name or ID of the service account.
    """
    client = Client()
    try:
        client.delete_service_account(service_account_name_or_id)
    except (KeyError, IllegalOperationError) as err:
        cli_utils.error(str(err))

    cli_utils.declare(
        f"Deleted service account '{service_account_name_or_id}'."
    )

delete_service_connector(name_id_or_prefix)

Deletes a service connector.

Parameters:

- name_id_or_prefix (str): The name of the service connector to delete. [required]

Source code in src/zenml/cli/service_connectors.py (lines 1666-1689):
@service_connector.command(
    "delete",
    help="""Delete a service connector.
""",
)
@click.argument("name_id_or_prefix", type=str)
def delete_service_connector(name_id_or_prefix: str) -> None:
    """Deletes a service connector.

    Args:
        name_id_or_prefix: The name of the service connector to delete.
    """
    client = Client()

    with console.status(
        f"Deleting service connector '{name_id_or_prefix}'...\n"
    ):
        try:
            client.delete_service_connector(
                name_id_or_prefix=name_id_or_prefix,
            )
        except (KeyError, IllegalOperationError) as err:
            cli_utils.error(str(err))
        cli_utils.declare(f"Deleted service connector: {name_id_or_prefix}")

delete_stack(stack_name_or_id, yes=False, recursive=False)

Delete a stack.

Parameters:

- stack_name_or_id (str): Name or id of the stack to delete. [required]
- yes (bool): If set, the stack will be deleted without prompting for confirmation. [default: False]
- recursive (bool): If set, the stack will be deleted along with the stack components associated with it. [default: False]

Source code in src/zenml/cli/stack.py (lines 996-1053):
@stack.command("delete", help="Delete a stack given its name.")
@click.argument("stack_name_or_id", type=str)
@click.option("--yes", "-y", is_flag=True, required=False)
@click.option(
    "--recursive",
    "-r",
    is_flag=True,
    help="Recursively delete all stack components",
)
def delete_stack(
    stack_name_or_id: str, yes: bool = False, recursive: bool = False
) -> None:
    """Delete a stack.

    Args:
        stack_name_or_id: Name or id of the stack to delete.
        yes: Stack will be deleted without prompting for
            confirmation.
        recursive: The stack will be deleted along with the corresponding stack
            associated with it.
    """
    recursive_confirmation = False
    if recursive:
        recursive_confirmation = yes or cli_utils.confirmation(
            "If there are stack components present in another stack, "
            "those stack components will be ignored for removal \n"
            "Do you want to continue ?"
        )

        if not recursive_confirmation:
            cli_utils.declare("Stack deletion canceled.")
            return

    confirmation = (
        recursive_confirmation
        or yes
        or cli_utils.confirmation(
            f"This will delete stack '{stack_name_or_id}'. \n"
            "Are you sure you want to proceed?"
        )
    )

    if not confirmation:
        cli_utils.declare("Stack deletion canceled.")
        return

    with console.status(f"Deleting stack '{stack_name_or_id}'...\n"):
        client = Client()

        if recursive and recursive_confirmation:
            client.delete_stack(stack_name_or_id, recursive=True)
            return

        try:
            client.delete_stack(stack_name_or_id)
        except (KeyError, ValueError, IllegalOperationError) as err:
            cli_utils.error(str(err))
        cli_utils.declare(f"Deleted stack '{stack_name_or_id}'.")

delete_tag(tag_name_or_id, yes=False)

Delete an existing tag.

Parameters:

- tag_name_or_id (Union[str, UUID]): The ID or name of the tag to delete. [required]
- yes (bool): If set, don't ask for confirmation. [default: False]

Source code in src/zenml/cli/tag.py (lines 138-181):
@tag.command("delete", help="Delete an existing tag.")
@click.argument("tag_name_or_id")
@click.option(
    "--yes",
    "-y",
    is_flag=True,
    help="Don't ask for confirmation.",
)
def delete_tag(
    tag_name_or_id: Union[str, UUID],
    yes: bool = False,
) -> None:
    """Delete an existing tag.

    Args:
        tag_name_or_id: The ID or name of the tag to delete.
        yes: If set, don't ask for confirmation.
    """
    try:
        tagged_count = Client().get_tag(tag_name_or_id).tagged_count
    except (KeyError, ValueError) as e:
        cli_utils.error(str(e))

    if not yes or tagged_count > 0:
        confirmation = cli_utils.confirmation(
            f"Are you sure you want to delete tag '{tag_name_or_id}'?"
            + (
                ""
                if tagged_count == 0
                else f"\n{tagged_count} objects are tagged with it."
            )
        )
        if not confirmation:
            cli_utils.declare("Tag deletion canceled.")
            return

    try:
        Client().delete_tag(
            tag_name_or_id=tag_name_or_id,
        )
    except (KeyError, ValueError) as e:
        cli_utils.error(str(e))
    else:
        cli_utils.declare(f"Tag '{tag_name_or_id}' deleted.")

delete_user(user_name_or_id)

Delete a user.

Parameters:

- user_name_or_id (str): The name or ID of the user to delete. [required]

Source code in src/zenml/cli/user_management.py (lines 417-429):
@user.command("delete")
@click.argument("user_name_or_id", type=str, required=True)
def delete_user(user_name_or_id: str) -> None:
    """Delete a user.

    Args:
        user_name_or_id: The name or ID of the user to delete.
    """
    try:
        Client().delete_user(user_name_or_id)
    except (KeyError, IllegalOperationError) as err:
        cli_utils.error(str(err))
    cli_utils.declare(f"Deleted user '{user_name_or_id}'.")

depaginate(list_method, **kwargs)

Depaginate the results from a client or store method that returns pages.

Parameters:

- list_method (Callable[..., Page[AnyResponse]]): The list method to depaginate. [required]
- **kwargs (Any): Arguments for the list method. [default: {}]

Returns:

- List[AnyResponse]: A list of the corresponding Response Models.

Source code in src/zenml/utils/pagination_utils.py (lines 23-42):
def depaginate(
    list_method: Callable[..., Page[AnyResponse]], **kwargs: Any
) -> List[AnyResponse]:
    """Depaginate the results from a client or store method that returns pages.

    Args:
        list_method: The list method to depaginate.
        **kwargs: Arguments for the list method.

    Returns:
        A list of the corresponding Response Models.
    """
    page = list_method(**kwargs)
    items = list(page.items)
    while page.index < page.total_pages:
        kwargs["page"] = page.index + 1
        page = list_method(**kwargs)
        items += list(page.items)

    return items
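
A short Python sketch of how `depaginate` can be combined with a paginated `Client` method; `Client().list_pipeline_runs` is used here as an illustrative list method, but any method returning a `Page` should work:

    from zenml.client import Client
    from zenml.utils.pagination_utils import depaginate

    # Fetch every page of pipeline runs and flatten them into one list.
    runs = depaginate(Client().list_pipeline_runs)
    print(f"Fetched {len(runs)} runs in total.")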

deploy(ctx, provider, stack_name=None, location=None, set_stack=False)

Deploy and register a fully functional cloud ZenML stack.

Parameters:

- ctx (Context): The click context. [required]
- provider (str): The cloud provider to deploy the stack to. [required]
- stack_name (Optional[str]): A name for the ZenML stack that gets imported as a result of the recipe deployment. [default: None]
- location (Optional[str]): The location to deploy the stack to. [default: None]
- set_stack (bool): Immediately set the deployed stack as active. [default: False]

Raises:

- Abort: If the user aborts the deployment.
- KeyboardInterrupt: If the user interrupts the deployment.

Source code in src/zenml/cli/stack.py (lines 1438-1657):
@stack.command(
    help="""Deploy a fully functional ZenML stack in one of the cloud providers.

Running this command will initiate an assisted process that will walk you
through automatically provisioning all the cloud infrastructure resources
necessary for a fully functional ZenML stack in the cloud provider of your
choice. A corresponding ZenML stack will also be automatically registered along
with all the necessary components and properly authenticated through service
connectors.
"""
)
@click.option(
    "--provider",
    "-p",
    "provider",
    required=True,
    type=click.Choice(StackDeploymentProvider.values()),
)
@click.option(
    "--name",
    "-n",
    "stack_name",
    type=click.STRING,
    required=False,
    help="Custom string to use as a prefix to generate names for the ZenML "
    "stack, its components service connectors as well as provisioned cloud "
    "infrastructure resources. May only contain alphanumeric characters and "
    "hyphens and have a maximum length of 16 characters.",
    callback=validate_name,
)
@click.option(
    "--location",
    "-l",
    type=click.STRING,
    required=False,
    help="The location to deploy the stack to.",
)
@click.option(
    "--set",
    "set_stack",
    is_flag=True,
    help="Immediately set this stack as active.",
    type=click.BOOL,
)
@click.pass_context
def deploy(
    ctx: click.Context,
    provider: str,
    stack_name: Optional[str] = None,
    location: Optional[str] = None,
    set_stack: bool = False,
) -> None:
    """Deploy and register a fully functional cloud ZenML stack.

    Args:
        ctx: The click context.
        provider: The cloud provider to deploy the stack to.
        stack_name: A name for the ZenML stack that gets imported as a result
            of the recipe deployment.
        location: The location to deploy the stack to.
        set_stack: Immediately set the deployed stack as active.

    Raises:
        Abort: If the user aborts the deployment.
        KeyboardInterrupt: If the user interrupts the deployment.
    """
    stack_name = stack_name or f"zenml-{provider}-stack"

    # Set up the markdown renderer to use the old-school markdown heading
    Markdown.elements.update(
        {
            "heading_open": OldSchoolMarkdownHeading,
        }
    )

    client = Client()
    if client.zen_store.is_local_store():
        cli_utils.error(
            "This feature cannot be used with a local ZenML deployment. "
            "ZenML needs to be accessible from the cloud provider to allow the "
            "stack and its components to be registered automatically. "
            "Please deploy ZenML in a remote environment as described in the "
            "documentation: https://docs.zenml.io/getting-started/deploying-zenml "
            "or use a managed ZenML Pro server instance for quick access to "
            "this feature and more: https://www.zenml.io/pro"
        )

    with track_handler(
        event=AnalyticsEvent.DEPLOY_FULL_STACK,
    ) as analytics_handler:
        analytics_handler.metadata = {
            "provider": provider,
        }

        deployment = client.zen_store.get_stack_deployment_info(
            provider=StackDeploymentProvider(provider),
        )

        if location and location not in deployment.locations.values():
            cli_utils.error(
                f"Invalid location '{location}' for provider '{provider}'. "
                f"Valid locations are: {', '.join(deployment.locations.values())}"
            )

        console.print(
            Markdown(
                f"# {provider.upper()} ZenML Cloud Stack Deployment\n"
                + deployment.description
            )
        )
        console.print(Markdown("## Details\n" + deployment.instructions))

        deployment_config = client.zen_store.get_stack_deployment_config(
            provider=StackDeploymentProvider(provider),
            stack_name=stack_name,
            location=location,
        )

        if deployment_config.instructions:
            console.print(
                Markdown("## Instructions\n" + deployment_config.instructions),
                "\n",
            )

        if deployment_config.configuration:
            console.print(
                deployment_config.configuration,
                no_wrap=True,
                overflow="ignore",
                crop=False,
                style=Style(bgcolor="grey15"),
            )

        if not cli_utils.confirmation(
            "\n\nProceed to continue with the deployment. You will be "
            f"automatically redirected to "
            f"{deployment_config.deployment_url_text} in your browser.",
        ):
            raise click.Abort()

        date_start = utc_now_tz_aware()

        webbrowser.open(deployment_config.deployment_url)
        console.print(
            Markdown(
                f"If your browser did not open automatically, please open "
                f"the following URL into your browser to deploy the stack to "
                f"{provider.upper()}: "
                f"[{deployment_config.deployment_url_text}]"
                f"({deployment_config.deployment_url}).\n\n"
            )
        )

        try:
            cli_utils.declare(
                "\n\nWaiting for the deployment to complete and the stack to be "
                "registered. Press CTRL+C to abort...\n"
            )

            while True:
                deployed_stack = client.zen_store.get_stack_deployment_stack(
                    provider=StackDeploymentProvider(provider),
                    stack_name=stack_name,
                    location=location,
                    date_start=date_start,
                )
                if deployed_stack:
                    break
                time.sleep(10)

            analytics_handler.metadata.update(
                {
                    "stack_id": deployed_stack.stack.id,
                }
            )

        except KeyboardInterrupt:
            cli_utils.declare("Stack deployment aborted.")
            raise

    stack_desc = f"""## Stack successfully registered! 🚀
Stack [{deployed_stack.stack.name}]({get_stack_url(deployed_stack.stack)}):\n"""

    for component_type, components in deployed_stack.stack.components.items():
        if components:
            component = components[0]
            stack_desc += (
                f" * `{component.flavor_name}` {component_type.value}: "
                f"[{component.name}]({get_component_url(component)})\n"
            )

    if deployed_stack.service_connector:
        stack_desc += (
            f" * Service Connector: {deployed_stack.service_connector.name}\n"
        )

    console.print(Markdown(stack_desc))

    follow_up = f"""
## Follow-up

{deployment.post_deploy_instructions}

To use the `{deployed_stack.stack.name}` stack to run pipelines:

* install the required ZenML integrations by running: `zenml integration install {" ".join(deployment.integrations)}`
"""
    if set_stack:
        client.activate_stack(deployed_stack.stack.id)
        follow_up += f"""
* the `{deployed_stack.stack.name}` stack has already been set as active
"""
    else:
        follow_up += f"""
* set the `{deployed_stack.stack.name}` stack as active by running: `zenml stack set {deployed_stack.stack.name}`
"""

    console.print(
        Markdown(follow_up),
    )
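
An example invocation using the options defined above; `aws` stands in for one of the `StackDeploymentProvider` choices and the location is a placeholder:

    zenml stack deploy -p aws -n mystack -l <LOCATION> --set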

describe_api_key(service_account_name_or_id, name_or_id)

Describe an API key.

Parameters:

- service_account_name_or_id (str): The name or ID of the service account to which the API key belongs. [required]
- name_or_id (str): The name or ID of the API key to describe. [required]

Source code in src/zenml/cli/service_accounts.py (lines 355-381):
@api_key.command("describe", help="Describe an API key.")
@click.argument("name_or_id", type=str, required=True)
@click.pass_obj
def describe_api_key(service_account_name_or_id: str, name_or_id: str) -> None:
    """Describe an API key.

    Args:
        service_account_name_or_id: The name or ID of the service account to
            which the API key belongs.
        name_or_id: The name or ID of the API key to describe.
    """
    with console.status(f"Getting API key `{name_or_id}`...\n"):
        try:
            api_key = Client().get_api_key(
                service_account_name_id_or_prefix=service_account_name_or_id,
                name_id_or_prefix=name_or_id,
            )
        except KeyError as e:
            cli_utils.error(str(e))

        cli_utils.print_pydantic_model(
            title=f"API key '{api_key.name}'",
            model=api_key,
            exclude_columns={
                "key",
            },
        )

describe_authorized_device(id_or_prefix)

Fetch an authorized device.

Parameters:

- id_or_prefix (str): The ID of the authorized device to fetch. [required]

Source code in src/zenml/cli/authorized_device.py (lines 37-56):
@authorized_device.command("describe")
@click.argument("id_or_prefix", type=str, required=True)
def describe_authorized_device(id_or_prefix: str) -> None:
    """Fetch an authorized device.

    Args:
        id_or_prefix: The ID of the authorized device to fetch.
    """
    try:
        device = Client().get_authorized_device(
            id_or_prefix=id_or_prefix,
        )
    except KeyError as e:
        cli_utils.error(str(e))

    cli_utils.print_pydantic_model(
        title=f"Authorized device `{device.id}`",
        model=device,
        exclude_columns={"user"},
    )

describe_code_repository(name_id_or_prefix)

Describe a code repository.

Parameters:

- name_id_or_prefix (str): Name, ID or prefix of the code repository. [required]

Source code in src/zenml/cli/code_repository.py (lines 165-188):
@code_repository.command("describe", help="Describe a code repository.")
@click.argument(
    "name_id_or_prefix",
    type=str,
    required=True,
)
def describe_code_repository(name_id_or_prefix: str) -> None:
    """Describe a code repository.

    Args:
        name_id_or_prefix: Name, ID or prefix of the code repository.
    """
    client = Client()
    try:
        code_repository = client.get_code_repository(
            name_id_or_prefix=name_id_or_prefix,
        )
    except KeyError as err:
        cli_utils.error(str(err))
    else:
        cli_utils.print_pydantic_model(
            title=f"Code repository '{code_repository.name}'",
            model=code_repository,
        )

describe_service_account(service_account_name_or_id)

Describe a service account.

Parameters:

- service_account_name_or_id (str): The name or ID of the service account. [required]

Source code in src/zenml/cli/service_accounts.py (lines 165-184):
@service_account.command("describe")
@click.argument("service_account_name_or_id", type=str, required=True)
def describe_service_account(service_account_name_or_id: str) -> None:
    """Describe a service account.

    Args:
        service_account_name_or_id: The name or ID of the service account.
    """
    client = Client()
    try:
        service_account = client.get_service_account(
            service_account_name_or_id
        )
    except KeyError as err:
        cli_utils.error(str(err))
    else:
        cli_utils.print_pydantic_model(
            title=f"Service account '{service_account.name}'",
            model=service_account,
        )

describe_service_connector(name_id_or_prefix, show_secrets=False, describe_client=False, resource_type=None, resource_id=None)

Prints details about a service connector.

Parameters:

- name_id_or_prefix (str): Name or id of the service connector to describe. [required]
- show_secrets (bool): Whether to show security sensitive configuration attributes in the terminal. [default: False]
- describe_client (bool): Fetch and describe a service connector client instead of the base connector if possible. [default: False]
- resource_type (Optional[str]): Resource type to use when fetching the service connector client. [default: None]
- resource_id (Optional[str]): Resource ID to use when fetching the service connector client. [default: None]

Source code in src/zenml/cli/service_connectors.py (lines 1014-1162):
@service_connector.command(
    "describe",
    help="""Show detailed information about a service connector.

Display detailed information about a service connector configuration, or about
a service connector client generated from it to access a specific resource
(explained below).

Service connector clients are connector configurations generated from the
original service connectors that are actually used by clients to access a
specific target resource (e.g. an AWS connector generates a Kubernetes connector
client to access a specific EKS Kubernetes cluster). Unlike main service
connectors, connector clients are not persisted in the database. They have a
limited lifetime and may contain temporary credentials to access the target
resource (e.g. an AWS connector configured with an AWS secret key and IAM role
generates a connector client containing temporary STS credentials).

Asking to see service connector client details is equivalent to asking to see
the final configuration that the client sees, as opposed to the configuration
that was configured by the user. In some cases, they are the same, in others
they are completely different.

To show the details of a service connector client instead of the base connector
use the `--client` flag. If the service connector is configured to provide
access to multiple resources, you also need to use the `--resource-type` and
`--resource-id` flags to specify the scope of the connector client.

Secret configuration attributes are not shown by default. Use the
`-x|--show-secrets` flag to show them.
""",
)
@click.argument(
    "name_id_or_prefix",
    type=str,
    required=True,
)
@click.option(
    "--show-secrets",
    "-x",
    "show_secrets",
    is_flag=True,
    default=False,
    help="Show security sensitive configuration attributes in the terminal.",
    type=click.BOOL,
)
@click.option(
    "--client",
    "-c",
    "describe_client",
    is_flag=True,
    default=False,
    help="Fetch and describe a service connector client instead of the base "
    "connector.",
    type=click.BOOL,
)
@click.option(
    "--resource-type",
    "-r",
    "resource_type",
    help="Resource type to use when fetching the service connector client.",
    required=False,
    type=str,
)
@click.option(
    "--resource-id",
    "-ri",
    "resource_id",
    help="Resource ID to use when fetching the service connector client.",
    required=False,
    type=str,
)
def describe_service_connector(
    name_id_or_prefix: str,
    show_secrets: bool = False,
    describe_client: bool = False,
    resource_type: Optional[str] = None,
    resource_id: Optional[str] = None,
) -> None:
    """Prints details about a service connector.

    Args:
        name_id_or_prefix: Name or id of the service connector to describe.
        show_secrets: Whether to show security sensitive configuration
            attributes in the terminal.
        describe_client: Fetch and describe a service connector client
            instead of the base connector if possible.
        resource_type: Resource type to use when fetching the service connector
            client.
        resource_id: Resource ID to use when fetching the service connector
            client.
    """
    client = Client()

    if resource_type or resource_id:
        describe_client = True

    if describe_client:
        try:
            connector_client = client.get_service_connector_client(
                name_id_or_prefix=name_id_or_prefix,
                resource_type=resource_type,
                resource_id=resource_id,
                verify=True,
            )
        except (
            KeyError,
            ValueError,
            IllegalOperationError,
            NotImplementedError,
            AuthorizationException,
        ) as e:
            resource_type = resource_type or "<unspecified>"
            resource_id = resource_id or "<unspecified>"
            cli_utils.error(
                f"Failed fetching a service connector client for connector "
                f"'{name_id_or_prefix}', resource type '{resource_type}' and "
                f"resource ID '{resource_id}': {e}"
            )

        connector = connector_client.to_response_model(
            workspace=client.active_workspace,
            user=client.active_user,
        )
    else:
        try:
            connector = client.get_service_connector(
                name_id_or_prefix=name_id_or_prefix,
                load_secrets=True,
            )
        except KeyError as err:
            cli_utils.error(str(err))

    with console.status(f"Describing connector '{connector.name}'..."):
        active_stack = client.active_stack_model
        active_connector_ids: List[UUID] = []
        for components in active_stack.components.values():
            active_connector_ids.extend(
                [
                    component.connector.id
                    for component in components
                    if component.connector
                ]
            )

        cli_utils.print_service_connector_configuration(
            connector=connector,
            active_status=connector.id in active_connector_ids,
            show_secrets=show_secrets,
        )
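
An illustrative call that fetches and describes a connector client scoped to a specific resource, with secrets shown; the connector name, resource type and resource ID are placeholders:

    zenml service-connector describe <CONNECTOR_NAME> --client --resource-type <RESOURCE_TYPE> --resource-id <RESOURCE_ID> -x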

describe_service_connector_type(type, resource_type=None, auth_method=None)

Describes a service connector type.

Parameters:

- type (str): The connector type to describe. [required]
- resource_type (Optional[str]): The resource type to describe. [default: None]
- auth_method (Optional[str]): The authentication method to describe. [default: None]

Source code in src/zenml/cli/service_connectors.py (lines 2115-2183):
@service_connector.command(
    "describe-type",
    help="""Describe a service connector type.
""",
)
@click.argument(
    "type",
    type=str,
    required=True,
)
@click.option(
    "--resource-type",
    "-r",
    "resource_type",
    help="Resource type to describe.",
    required=False,
    type=str,
)
@click.option(
    "--auth-method",
    "-a",
    "auth_method",
    help="Authentication method to describe.",
    required=False,
    type=str,
)
def describe_service_connector_type(
    type: str,
    resource_type: Optional[str] = None,
    auth_method: Optional[str] = None,
) -> None:
    """Describes a service connector type.

    Args:
        type: The connector type to describe.
        resource_type: The resource type to describe.
        auth_method: The authentication method to describe.
    """
    client = Client()

    try:
        connector_type = client.get_service_connector_type(type)
    except KeyError:
        cli_utils.error(f"Service connector type '{type}' not found.")

    if resource_type:
        if resource_type not in connector_type.resource_type_dict:
            cli_utils.error(
                f"Resource type '{resource_type}' not found for service "
                f"connector type '{type}'."
            )
        cli_utils.print_service_connector_resource_type(
            connector_type.resource_type_dict[resource_type]
        )
    elif auth_method:
        if auth_method not in connector_type.auth_method_dict:
            cli_utils.error(
                f"Authentication method '{auth_method}' not found for service"
                f" connector type '{type}'."
            )
        cli_utils.print_service_connector_auth_method(
            connector_type.auth_method_dict[auth_method]
        )
    else:
        cli_utils.print_service_connector_type(
            connector_type,
            include_resource_types=False,
            include_auth_methods=False,
        )
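
Example invocations; the `aws` connector type and `s3-bucket` resource type are illustrative values, not an exhaustive list:

    zenml service-connector describe-type aws
    zenml service-connector describe-type aws --resource-type s3-bucket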

describe_stack(stack_name_or_id=None)

Show details about a named stack or the active stack.

Parameters:

- stack_name_or_id (Optional[str]): Name of the stack to describe. [default: None]

Source code in src/zenml/cli/stack.py (lines 963-993):
@stack.command(
    "describe",
    help="Show details about the current active stack.",
)
@click.argument(
    "stack_name_or_id",
    type=click.STRING,
    required=False,
)
def describe_stack(stack_name_or_id: Optional[str] = None) -> None:
    """Show details about a named stack or the active stack.

    Args:
        stack_name_or_id: Name of the stack to describe.
    """
    client = Client()

    with console.status("Describing the stack...\n"):
        try:
            stack_: "StackResponse" = client.get_stack(
                name_id_or_prefix=stack_name_or_id
            )
        except KeyError as err:
            cli_utils.error(str(err))

        cli_utils.print_stack_configuration(
            stack=stack_,
            active=stack_.id == client.active_stack_model.id,
        )

    print_model_url(get_stack_url(stack_))
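
Two typical calls: without an argument the active stack is described, with an argument a named stack is described (the stack name is a placeholder):

    zenml stack describe
    zenml stack describe <STACK_NAME>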

describe_user(user_name_or_id=None)

Get the user.

Parameters:

- user_name_or_id (Optional[str]): The name or ID of the user. [default: None]

Source code in src/zenml/cli/user_management.py (lines 40-76):
@user.command("describe")
@click.argument("user_name_or_id", type=str, required=False)
def describe_user(user_name_or_id: Optional[str] = None) -> None:
    """Get the user.

    Args:
        user_name_or_id: The name or ID of the user.
    """
    client = Client()
    if not user_name_or_id:
        active_user = client.active_user
        cli_utils.print_pydantic_models(
            [active_user],
            exclude_columns=[
                "created",
                "updated",
                "email",
                "email_opted_in",
                "activation_token",
            ],
        )
    else:
        try:
            user = client.get_user(user_name_or_id)
        except KeyError as err:
            cli_utils.error(str(err))
        else:
            cli_utils.print_pydantic_models(
                [user],
                exclude_columns=[
                    "created",
                    "updated",
                    "email",
                    "email_opted_in",
                    "activation_token",
                ],
            )

describe_workspace(workspace_name_or_id=None)

Get the workspace.

Parameters:

- workspace_name_or_id (Optional[str]): The name or ID of the workspace to set as active. [default: None]

Source code in src/zenml/cli/workspace.py (lines 63-86):
@workspace.command("describe", hidden=True)
@click.argument("workspace_name_or_id", type=str, required=False)
def describe_workspace(workspace_name_or_id: Optional[str] = None) -> None:
    """Get the workspace.

    Args:
        workspace_name_or_id: The name or ID of the workspace to set as active.
    """
    warn_unsupported_non_default_workspace()
    client = Client()
    if not workspace_name_or_id:
        active_workspace = client.active_workspace
        cli_utils.print_pydantic_models(
            [active_workspace], exclude_columns=["created", "updated"]
        )
    else:
        try:
            workspace_ = client.get_workspace(workspace_name_or_id)
        except KeyError as err:
            cli_utils.error(str(err))
        else:
            cli_utils.print_pydantic_models(
                [workspace_], exclude_columns=["created", "updated"]
            )

disconnect_server()

Disconnect from a ZenML server.

Source code in src/zenml/cli/server.py (lines 419-435):
@cli.command(
    "disconnect",
    help="""Disconnect from a ZenML server.

DEPRECATED: Please use `zenml logout` instead.
""",
)
def disconnect_server() -> None:
    """Disconnect from a ZenML server."""
    cli_utils.warning(
        "The `zenml disconnect` command is deprecated and will be removed in a "
        "future release. Please use the `zenml logout` command instead."
    )

    # Calling the `zenml logout` command
    cli_utils.declare("Calling `zenml logout`...")
    logout.callback()  # type: ignore[misc]

down()

Shut down the local ZenML dashboard.

Source code in src/zenml/cli/server.py (lines 162-180):
@cli.command(
    "down",
    help="""Shut down the local ZenML dashboard.

DEPRECATED: Please use `zenml logout local` instead.
""",
)
def down() -> None:
    """Shut down the local ZenML dashboard."""
    cli_utils.warning(
        "The `zenml down` command is deprecated and will be removed in a "
        "future release. Please use the `zenml logout --local` command instead."
    )

    # Calling the `zenml logout` command
    cli_utils.declare("Calling `zenml logout --local`...")
    logout.callback(  # type: ignore[misc]
        local=True
    )

email_opt_int(opted_in, email, source)

Track the event of the users response to the email prompt, identify them.

Parameters:

- opted_in (bool): Whether the user decided to opt in. [required]
- email (Optional[str]): The email the user optionally provided. [required]
- source (str): Location from which the user replied ("zenml go", "zenml server"). [required]

Source code in src/zenml/analytics/utils.py (lines 81-97):
def email_opt_int(opted_in: bool, email: Optional[str], source: str) -> None:
    """Track the event of the users response to the email prompt, identify them.

    Args:
        opted_in: Did the user decide to opt-in
        email: The email the user optionally provided
        source: Location when the user replied ["zenml go", "zenml server"]
    """
    # If the user opted in, associate email with the anonymous distinct ID
    if opted_in and email is not None and email != "":
        identify(metadata={"email": email, "source": source})

    # Track that the user answered the prompt
    track(
        AnalyticsEvent.OPT_IN_OUT_EMAIL,
        {"opted_in": opted_in, "source": source},
    )

error(text)

Echo an error string on the CLI.

Parameters:

- text (str): Input text string. [required]

Raises:

- ClickException: Raised whenever this function is called.

Source code in src/zenml/cli/utils.py (lines 157-166):
def error(text: str) -> NoReturn:
    """Echo an error string on the CLI.

    Args:
        text: Input text string.

    Raises:
        ClickException: when called.
    """
    raise click.ClickException(message=click.style(text, fg="red", bold=True))

expand_argument_value_from_file(name, value)

Expands the value of an argument pointing to a file into the contents of that file.

Parameters:

- name (str): Name of the argument. Used solely for logging purposes. [required]
- value (str): The value of the argument. This is to be interpreted as a filename if it begins with a @ character. [required]

Returns:

- str: The argument value expanded into the contents of the file, if the argument value begins with a @ character. Otherwise, the argument value is returned unchanged.

Raises:

- ValueError: If the argument value points to a file that doesn't exist, that cannot be read, or is too long (i.e. exceeds MAX_ARGUMENT_VALUE_SIZE bytes).

Source code in src/zenml/cli/utils.py (lines 717-762):
def expand_argument_value_from_file(name: str, value: str) -> str:
    """Expands the value of an argument pointing to a file into the contents of that file.

    Args:
        name: Name of the argument. Used solely for logging purposes.
        value: The value of the argument. This is to be interpreted as a
            filename if it begins with a `@` character.

    Returns:
        The argument value expanded into the contents of the file, if the
        argument value begins with a `@` character. Otherwise, the argument
        value is returned unchanged.

    Raises:
        ValueError: If the argument value points to a file that doesn't exist,
            that cannot be read, or is too long(i.e. exceeds
            `MAX_ARGUMENT_VALUE_SIZE` bytes).
    """
    if value.startswith("@@"):
        return value[1:]
    if not value.startswith("@"):
        return value
    filename = os.path.abspath(os.path.expanduser(value[1:]))
    logger.info(
        f"Expanding argument value `{name}` to contents of file `{filename}`."
    )
    if not os.path.isfile(filename):
        raise ValueError(
            f"Could not load argument '{name}' value: file "
            f"'{filename}' does not exist or is not readable."
        )
    try:
        if os.path.getsize(filename) > MAX_ARGUMENT_VALUE_SIZE:
            raise ValueError(
                f"Could not load argument '{name}' value: file "
                f"'{filename}' is too large (max size is "
                f"{MAX_ARGUMENT_VALUE_SIZE} bytes)."
            )

        with open(filename, "r") as f:
            return f.read()
    except OSError as e:
        raise ValueError(
            f"Could not load argument '{name}' value: file "
            f"'{filename}' could not be accessed: {str(e)}"
        )
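
A small Python sketch of the expansion rules implemented above; the file path and values are purely illustrative:

    import os
    from zenml.cli.utils import expand_argument_value_from_file

    # Write a small file to expand from.
    with open("/tmp/token.txt", "w") as f:
        f.write("super-secret-token")

    expand_argument_value_from_file("token", "plain-value")      # returned unchanged
    expand_argument_value_from_file("token", "@/tmp/token.txt")  # replaced by file contents
    expand_argument_value_from_file("token", "@@literal")        # "@literal" (escaped leading "@")

    os.remove("/tmp/token.txt")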

export_requirements(stack_name_or_id=None, output_file=None, overwrite=False)

Exports stack requirements so they can be installed using pip.

Parameters:

- stack_name_or_id (Optional[str]): Stack name or ID. If not given, the active stack will be used. [default: None]
- output_file (Optional[str]): Optional path to the requirements output file. [default: None]
- overwrite (bool): Overwrite the output file if it already exists. This option is only valid if the output file is provided. [default: False]

Source code in src/zenml/cli/stack.py (lines 1947-2018):
@stack.command(
    name="export-requirements", help="Export the stack requirements."
)
@click.argument(
    "stack_name_or_id",
    type=click.STRING,
    required=False,
)
@click.option(
    "--output-file",
    "-o",
    "output_file",
    type=str,
    required=False,
    help="File to which to export the stack requirements. If not "
    "provided, the requirements will be printed to stdout instead.",
)
@click.option(
    "--overwrite",
    "-ov",
    "overwrite",
    type=bool,
    required=False,
    is_flag=True,
    help="Overwrite the output file if it already exists. This option is "
    "only valid if the output file is provided.",
)
def export_requirements(
    stack_name_or_id: Optional[str] = None,
    output_file: Optional[str] = None,
    overwrite: bool = False,
) -> None:
    """Exports stack requirements so they can be installed using pip.

    Args:
        stack_name_or_id: Stack name or ID. If not given, the active stack will
            be used.
        output_file: Optional path to the requirements output file.
        overwrite: Overwrite the output file if it already exists. This option
            is only valid if the output file is provided.
    """
    try:
        stack_model: "StackResponse" = Client().get_stack(
            name_id_or_prefix=stack_name_or_id
        )
    except KeyError as err:
        cli_utils.error(str(err))

    requirements, _ = requirements_utils.get_requirements_for_stack(
        stack_model
    )

    if not requirements:
        cli_utils.declare(f"Stack `{stack_model.name}` has no requirements.")
        return

    if output_file:
        try:
            with open(output_file, "x") as f:
                f.write("\n".join(requirements))
        except FileExistsError:
            if overwrite or cli_utils.confirmation(
                "A file already exists at the specified path. "
                "Would you like to overwrite it?"
            ):
                with open(output_file, "w") as f:
                    f.write("\n".join(requirements))
        cli_utils.declare(
            f"Requirements for stack `{stack_model.name}` exported to {output_file}."
        )
    else:
        click.echo(" ".join(requirements), nl=False)

export_secret(name_id_or_prefix, scope=None, filename=None)

Export a secret as a YAML file.

The resulting YAML file can then be imported as a new secret using the zenml secret create <new_secret_name> -v @<filename> command.

Parameters:

- name_id_or_prefix (str): The name of the secret to export. [required]
- scope (Optional[str]): The scope of the secret to export. [default: None]
- filename (Optional[str]): The name of the file to export the secret to. [default: None]

Source code in src/zenml/cli/secret.py (lines 489-534):
@secret.command("export", help="Export a secret as a YAML file.")
@click.argument(
    "name_id_or_prefix",
    type=click.STRING,
)
@click.option(
    "--scope",
    "-s",
    type=click.Choice([scope.value for scope in list(SecretScope)]),
    default=None,
)
@click.option(
    "--filename",
    "-f",
    type=click.STRING,
    default=None,
    help=(
        "The name of the file to export the secret to. Defaults to "
        "<secret_name>.yaml."
    ),
)
def export_secret(
    name_id_or_prefix: str,
    scope: Optional[str] = None,
    filename: Optional[str] = None,
) -> None:
    """Export a secret as a YAML file.

    The resulting YAML file can then be imported as a new secret using the
    `zenml secret create <new_secret_name> -v @<filename>` command.

    Args:
        name_id_or_prefix: The name of the secret to export.
        scope: The scope of the secret to export.
        filename: The name of the file to export the secret to.
    """
    from zenml.utils.yaml_utils import write_yaml

    secret = _get_secret(name_id_or_prefix=name_id_or_prefix, scope=scope)
    if not secret.secret_values:
        warning(f"Secret with name `{name_id_or_prefix}` is empty.")
        return

    filename = filename or f"{secret.name}.yaml"
    write_yaml(filename, secret.secret_values)
    declare(f"Secret '{secret.name}' successfully exported to '{filename}'.")

export_stack(stack_name_or_id=None, filename=None)

Export a stack to YAML.

Parameters:

- stack_name_or_id (Optional[str]): The name of the stack to export. [default: None]
- filename (Optional[str]): The filename to export the stack to. [default: None]

Source code in src/zenml/cli/stack.py (lines 1095-1125):
@stack.command("export", help="Exports a stack to a YAML file.")
@click.argument("stack_name_or_id", type=str, required=False)
@click.argument("filename", type=str, required=False)
def export_stack(
    stack_name_or_id: Optional[str] = None,
    filename: Optional[str] = None,
) -> None:
    """Export a stack to YAML.

    Args:
        stack_name_or_id: The name of the stack to export.
        filename: The filename to export the stack to.
    """
    # Get configuration of given stack
    client = Client()
    try:
        stack_to_export = client.get_stack(name_id_or_prefix=stack_name_or_id)
    except KeyError as err:
        cli_utils.error(str(err))

    # write zenml version and stack dict to YAML
    yaml_data = stack_to_export.to_yaml()
    yaml_data["zenml_version"] = zenml.__version__

    if filename is None:
        filename = stack_to_export.name + ".yaml"
    write_yaml(filename, yaml_data)

    cli_utils.declare(
        f"Exported stack '{stack_to_export.name}' to file '{filename}'."
    )
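
For example, exporting a named stack to a YAML file (stack and file names are placeholders; if the filename is omitted, `<stack_name>.yaml` is used):

    zenml stack export <STACK_NAME> my_stack.yaml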

format_integration_list(integrations)

Formats a list of integrations into a List of Dicts.

This list of dicts can then be printed in a table style using cli_utils.print_table.

Parameters:

- integrations (List[Tuple[str, Type[Integration]]]): List of tuples containing the name of the integration and the integration metadata. [required]

Returns:

- List[Dict[str, str]]: List of Dicts containing, for each integration, its name, its installation status, and its required packages.

Source code in src/zenml/cli/utils.py (lines 511-538):
def format_integration_list(
    integrations: List[Tuple[str, Type["Integration"]]],
) -> List[Dict[str, str]]:
    """Formats a list of integrations into a List of Dicts.

    This list of dicts can then be printed in a table style using
    cli_utils.print_table.

    Args:
        integrations: List of tuples containing the name of the integration and
            the integration metadata.

    Returns:
        List of Dicts containing the name of the integration and the integration
    """
    list_of_dicts = []
    for name, integration_impl in integrations:
        is_installed = integration_impl.check_installation()
        list_of_dicts.append(
            {
                "INSTALLED": ":white_check_mark:" if is_installed else ":x:",
                "INTEGRATION": name,
                "REQUIRED_PACKAGES": ", ".join(
                    integration_impl.get_requirements()
                ),
            }
        )
    return list_of_dicts

generate_stack_component_connect_command(component_type)

Generates a connect command for the specific stack component type.

Parameters:

- component_type (StackComponentType): Type of the component to generate the command for. [required]

Returns:

- Callable[[str, str], None]: A function that can be used as a click command.

Source code in src/zenml/cli/stack_components.py (lines 1021-1101):
def generate_stack_component_connect_command(
    component_type: StackComponentType,
) -> Callable[[str, str], None]:
    """Generates a `connect` command for the specific stack component type.

    Args:
        component_type: Type of the component to generate the command for.

    Returns:
        A function that can be used as a `click` command.
    """
    _component_display_name(component_type)

    @click.argument(
        "name_id_or_prefix",
        type=str,
        required=False,
    )
    @click.option(
        "--connector",
        "-c",
        "connector",
        help="The name, ID or prefix of the connector to use.",
        required=False,
        type=str,
    )
    @click.option(
        "--resource-id",
        "-r",
        "resource_id",
        help="The resource ID to use with the connector. Only required for "
        "multi-instance connectors that are not already configured with a "
        "particular resource ID.",
        required=False,
        type=str,
    )
    @click.option(
        "--interactive",
        "-i",
        "interactive",
        is_flag=True,
        default=False,
        help="Configure a service connector resource interactively.",
        type=click.BOOL,
    )
    @click.option(
        "--no-verify",
        "no_verify",
        is_flag=True,
        default=False,
        help="Skip verification of the connector resource.",
        type=click.BOOL,
    )
    def connect_stack_component_command(
        name_id_or_prefix: Optional[str],
        connector: Optional[str] = None,
        resource_id: Optional[str] = None,
        interactive: bool = False,
        no_verify: bool = False,
    ) -> None:
        """Connect the stack component to a resource through a service connector.

        Args:
            name_id_or_prefix: The name of the stack component to connect.
            connector: The name, ID or prefix of the connector to use.
            resource_id: The resource ID to use connect to. Only
                required for multi-instance connectors that are not already
                configured with a particular resource ID.
            interactive: Configure a service connector resource interactively.
            no_verify: Do not verify whether the resource is accessible.
        """
        connect_stack_component_with_service_connector(
            component_type=component_type,
            name_id_or_prefix=name_id_or_prefix,
            connector=connector,
            resource_id=resource_id,
            interactive=interactive,
            no_verify=no_verify,
        )

    return connect_stack_component_command
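
As an illustration of the generated command, connecting an artifact store to a service connector might look as follows; this assumes the command is mounted under the corresponding component group, and all names are placeholders:

    zenml artifact-store connect <STORE_NAME> --connector <CONNECTOR_NAME> --resource-id <RESOURCE_ID>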

generate_stack_component_copy_command(component_type)

Generates a copy command for the specific stack component type.

Parameters:

- component_type (StackComponentType): Type of the component to generate the command for. [required]

Returns:

- Callable[[str, str], None]: A function that can be used as a click command.

Source code in src/zenml/cli/stack_components.py (lines 528-579):
def generate_stack_component_copy_command(
    component_type: StackComponentType,
) -> Callable[[str, str], None]:
    """Generates a `copy` command for the specific stack component type.

    Args:
        component_type: Type of the component to generate the command for.

    Returns:
        A function that can be used as a `click` command.
    """
    display_name = _component_display_name(component_type)

    @click.argument(
        "source_component_name_id_or_prefix", type=str, required=True
    )
    @click.argument("target_component", type=str, required=True)
    def copy_stack_component_command(
        source_component_name_id_or_prefix: str,
        target_component: str,
    ) -> None:
        """Copies a stack component.

        Args:
            source_component_name_id_or_prefix: Name or id prefix of the
                                         component to copy.
            target_component: Name of the copied component.
        """
        client = Client()

        with console.status(
            f"Copying {display_name} "
            f"`{source_component_name_id_or_prefix}`..\n"
        ):
            try:
                component_to_copy = client.get_stack_component(
                    name_id_or_prefix=source_component_name_id_or_prefix,
                    component_type=component_type,
                )
            except KeyError as err:
                cli_utils.error(str(err))

            copied_component = client.create_stack_component(
                name=target_component,
                flavor=component_to_copy.flavor_name,
                component_type=component_to_copy.type,
                configuration=component_to_copy.configuration,
                labels=component_to_copy.labels,
            )
            print_model_url(get_component_url(copied_component))

    return copy_stack_component_command

generate_stack_component_delete_command(component_type)

Generates a delete command for the specific stack component type.

Parameters:

    component_type (StackComponentType): Type of the component to generate the command for. [required]

Returns:

    Callable[[str], None]: A function that can be used as a click command.

Source code in src/zenml/cli/stack_components.py
def generate_stack_component_delete_command(
    component_type: StackComponentType,
) -> Callable[[str], None]:
    """Generates a `delete` command for the specific stack component type.

    Args:
        component_type: Type of the component to generate the command for.

    Returns:
        A function that can be used as a `click` command.
    """
    display_name = _component_display_name(component_type)

    @click.argument("name_id_or_prefix", type=str)
    def delete_stack_component_command(name_id_or_prefix: str) -> None:
        """Deletes a stack component.

        Args:
            name_id_or_prefix: The name of the stack component to delete.
        """
        client = Client()

        with console.status(
            f"Deleting {display_name} '{name_id_or_prefix}'...\n"
        ):
            try:
                client.delete_stack_component(
                    name_id_or_prefix=name_id_or_prefix,
                    component_type=component_type,
                )
            except (KeyError, IllegalOperationError) as err:
                cli_utils.error(str(err))
            cli_utils.declare(f"Deleted {display_name}: {name_id_or_prefix}")

    return delete_stack_component_command

generate_stack_component_describe_command(component_type)

Generates a describe command for the specific stack component type.

Parameters:

    component_type (StackComponentType): Type of the component to generate the command for. [required]

Returns:

    Callable[[str], None]: A function that can be used as a click command.

Source code in src/zenml/cli/stack_components.py
def generate_stack_component_describe_command(
    component_type: StackComponentType,
) -> Callable[[str], None]:
    """Generates a `describe` command for the specific stack component type.

    Args:
        component_type: Type of the component to generate the command for.

    Returns:
        A function that can be used as a `click` command.
    """

    @click.argument(
        "name_id_or_prefix",
        type=str,
        required=False,
    )
    def describe_stack_component_command(name_id_or_prefix: str) -> None:
        """Prints details about the active/specified component.

        Args:
            name_id_or_prefix: Name or id of the component to describe.
        """
        client = Client()
        try:
            component_ = client.get_stack_component(
                name_id_or_prefix=name_id_or_prefix,
                component_type=component_type,
            )
        except KeyError as err:
            cli_utils.error(str(err))

        with console.status(f"Describing component '{component_.name}'..."):
            active_component_id = None
            active_components = client.active_stack_model.components.get(
                component_type, None
            )
            if active_components:
                active_component_id = active_components[0].id

            if component_.connector:
                connector_requirements = (
                    component_.flavor.connector_requirements
                )
            else:
                connector_requirements = None

            cli_utils.print_stack_component_configuration(
                component=component_,
                active_status=component_.id == active_component_id,
                connector_requirements=connector_requirements,
            )

            print_model_url(get_component_url(component_))

    return describe_stack_component_command

generate_stack_component_disconnect_command(component_type)

Generates a disconnect command for the specific stack component type.

Parameters:

    component_type (StackComponentType): Type of the component to generate the command for. [required]

Returns:

    Callable[[str], None]: A function that can be used as a click command.

Source code in src/zenml/cli/stack_components.py
def generate_stack_component_disconnect_command(
    component_type: StackComponentType,
) -> Callable[[str], None]:
    """Generates a `disconnect` command for the specific stack component type.

    Args:
        component_type: Type of the component to generate the command for.

    Returns:
        A function that can be used as a `click` command.
    """
    display_name = _component_display_name(component_type)

    @click.argument(
        "name_id_or_prefix",
        type=str,
        required=True,
    )
    def disconnect_stack_component_command(name_id_or_prefix: str) -> None:
        """Disconnect a stack component from a service connector.

        Args:
            name_id_or_prefix: The name of the stack component to disconnect.
        """
        client = Client()

        with console.status(
            f"Disconnecting service-connector from {display_name} '{name_id_or_prefix}'...\n"
        ):
            try:
                updated_component = client.update_stack_component(
                    name_id_or_prefix=name_id_or_prefix,
                    component_type=component_type,
                    disconnect=True,
                )
            except (KeyError, IllegalOperationError) as err:
                cli_utils.error(str(err))

            cli_utils.declare(
                f"Successfully disconnected the service-connector from {display_name} `{name_id_or_prefix}`."
            )
            print_model_url(get_component_url(updated_component))

    return disconnect_stack_component_command

generate_stack_component_explain_command(component_type)

Generates an explain command for the specific stack component type.

Parameters:

    component_type (StackComponentType): Type of the component to generate the command for. [required]

Returns:

    Callable[[], None]: A function that can be used as a click command.

Source code in src/zenml/cli/stack_components.py
def generate_stack_component_explain_command(
    component_type: StackComponentType,
) -> Callable[[], None]:
    """Generates an `explain` command for the specific stack component type.

    Args:
        component_type: Type of the component to generate the command for.

    Returns:
        A function that can be used as a `click` command.
    """

    def explain_stack_components_command() -> None:
        """Explains the concept of the stack component."""
        component_module = import_module(f"zenml.{component_type.plural}")

        if component_module.__doc__ is not None:
            md = Markdown(component_module.__doc__)
            console.print(md)
        else:
            console.print(
                "The explain subcommand is yet not available for "
                "this stack component. For more information, you can "
                "visit our docs page: https://docs.zenml.io/ and "
                "stay tuned for future releases."
            )

    return explain_stack_components_command

generate_stack_component_flavor_delete_command(component_type)

Generates a delete command for a single flavor of a component.

Parameters:

    component_type (StackComponentType): Type of the component to generate the command for. [required]

Returns:

    Callable[[str], None]: A function that can be used as a click command.

Source code in src/zenml/cli/stack_components.py
def generate_stack_component_flavor_delete_command(
    component_type: StackComponentType,
) -> Callable[[str], None]:
    """Generates a `delete` command for a single flavor of a component.

    Args:
        component_type: Type of the component to generate the command for.

    Returns:
        A function that can be used as a `click` command.
    """
    display_name = _component_display_name(component_type)

    @click.argument(
        "name_or_id",
        type=str,
        required=True,
    )
    def delete_stack_component_flavor_command(name_or_id: str) -> None:
        """Deletes a flavor.

        Args:
            name_or_id: The name of the flavor.
        """
        client = Client()

        with console.status(
            f"Deleting a {display_name} flavor: {name_or_id}`...\n"
        ):
            client.delete_flavor(name_or_id)

            cli_utils.declare(f"Successfully deleted flavor '{name_or_id}'.")

    return delete_stack_component_flavor_command

generate_stack_component_flavor_describe_command(component_type)

Generates a describe command for a single flavor of a component.

Parameters:

    component_type (StackComponentType): Type of the component to generate the command for. [required]

Returns:

    Callable[[str], None]: A function that can be used as a click command.

Source code in src/zenml/cli/stack_components.py
def generate_stack_component_flavor_describe_command(
    component_type: StackComponentType,
) -> Callable[[str], None]:
    """Generates a `describe` command for a single flavor of a component.

    Args:
        component_type: Type of the component to generate the command for.

    Returns:
        A function that can be used as a `click` command.
    """
    display_name = _component_display_name(component_type)

    @click.argument(
        "name",
        type=str,
        required=True,
    )
    def describe_stack_component_flavor_command(name: str) -> None:
        """Describes a flavor based on its config schema.

        Args:
            name: The name of the flavor.
        """
        client = Client()

        with console.status(f"Describing {display_name} flavor: {name}`...\n"):
            flavor_model = client.get_flavor_by_name_and_type(
                name=name, component_type=component_type
            )

            cli_utils.describe_pydantic_object(flavor_model.config_schema)
            resources = flavor_model.connector_requirements
            if resources:
                resources_str = f"a '{resources.resource_type}' resource"
                cli_args = f"--resource-type {resources.resource_type}"
                if resources.connector_type:
                    resources_str += (
                        f" provided by a '{resources.connector_type}' "
                        "connector"
                    )
                    cli_args += f"--connector-type {resources.connector_type}"

                cli_utils.declare(
                    f"This flavor supports connecting to external resources "
                    f"with a Service Connector. It requires {resources_str}. "
                    "You can get a list of all available connectors and the "
                    "compatible resources that they can access by running:\n\n"
                    f"'zenml service-connector list-resources {cli_args}'\n"
                    "If no compatible Service Connectors are yet registered, "
                    "you can can register a new one by running:\n\n"
                    f"'zenml service-connector register -i'"
                )
            else:
                cli_utils.declare(
                    "This flavor does not support connecting to external "
                    "resources with a Service Connector."
                )

    return describe_stack_component_flavor_command

generate_stack_component_flavor_list_command(component_type)

Generates a list command for the flavors of a stack component.

Parameters:

    component_type (StackComponentType): Type of the component to generate the command for. [required]

Returns:

    Callable[[], None]: A function that can be used as a click command.

Source code in src/zenml/cli/stack_components.py
def generate_stack_component_flavor_list_command(
    component_type: StackComponentType,
) -> Callable[[], None]:
    """Generates a `list` command for the flavors of a stack component.

    Args:
        component_type: Type of the component to generate the command for.

    Returns:
        A function that can be used as a `click` command.
    """
    display_name = _component_display_name(component_type)

    def list_stack_component_flavor_command() -> None:
        """Lists the flavors for a single type of stack component."""
        client = Client()

        with console.status(f"Listing {display_name} flavors`...\n"):
            flavors = client.get_flavors_by_type(component_type=component_type)

            cli_utils.print_flavor_list(flavors=flavors)
            cli_utils.print_page_info(flavors)

    return list_stack_component_flavor_command

generate_stack_component_flavor_register_command(component_type)

Generates a register command for the flavors of a stack component.

Parameters:

    component_type (StackComponentType): Type of the component to generate the command for. [required]

Returns:

    Callable[[str], None]: A function that can be used as a click command.

Source code in src/zenml/cli/stack_components.py
def generate_stack_component_flavor_register_command(
    component_type: StackComponentType,
) -> Callable[[str], None]:
    """Generates a `register` command for the flavors of a stack component.

    Args:
        component_type: Type of the component to generate the command for.

    Returns:
        A function that can be used as a `click` command.
    """
    command_name = component_type.value.replace("_", "-")
    display_name = _component_display_name(component_type)

    @click.argument(
        "source",
        type=str,
        required=True,
    )
    def register_stack_component_flavor_command(source: str) -> None:
        """Adds a flavor for a stack component type.

        Example:
            Let's say you create an artifact store flavor class `MyArtifactStoreFlavor`
            in the file path `flavors/my_flavor.py`. You would register it as:

            ```shell
            zenml artifact-store flavor register flavors.my_flavor.MyArtifactStoreFlavor
            ```

        Args:
            source: The source path of the flavor class in dot notation format.
        """
        client = Client()

        if not client.root:
            cli_utils.warning(
                f"You're running the `zenml {command_name} flavor register` "
                "command without a ZenML repository. Your current working "
                "directory will be used as the source root relative to which "
                "the `source` argument is expected. To silence this warning, "
                "run `zenml init` at your source code root."
            )

        with console.status(f"Registering a new {display_name} flavor`...\n"):
            try:
                # Register the new model
                new_flavor = client.create_flavor(
                    source=source,
                    component_type=component_type,
                )
            except ValueError as e:
                source_root = source_utils.get_source_root()

                cli_utils.error(
                    f"Flavor registration failed! ZenML tried loading the "
                    f"module `{source}` from path `{source_root}`. If this is "
                    "not what you expect, then please ensure you have run "
                    "`zenml init` at the root of your repository.\n\n"
                    f"Original exception: {str(e)}"
                )

            cli_utils.declare(
                f"Successfully registered new flavor '{new_flavor.name}' "
                f"for stack component '{new_flavor.type}'."
            )

    return register_stack_component_flavor_command

generate_stack_component_get_command(component_type)

Generates a get command for the specific stack component type.

Parameters:

    component_type (StackComponentType): Type of the component to generate the command for. [required]

Returns:

    Callable[[], None]: A function that can be used as a click command.

Source code in src/zenml/cli/stack_components.py
def generate_stack_component_get_command(
    component_type: StackComponentType,
) -> Callable[[], None]:
    """Generates a `get` command for the specific stack component type.

    Args:
        component_type: Type of the component to generate the command for.

    Returns:
        A function that can be used as a `click` command.
    """

    def get_stack_component_command() -> None:
        """Prints the name of the active component."""
        client = Client()
        display_name = _component_display_name(component_type)

        with console.status(f"Getting the active `{display_name}`...\n"):
            active_stack = client.active_stack_model
            components = active_stack.components.get(component_type, None)

            if components:
                cli_utils.declare(
                    f"Active {display_name}: '{components[0].name}'"
                )
                print_model_url(get_component_url(components[0]))
            else:
                cli_utils.warning(
                    f"No {display_name} set for active stack "
                    f"('{active_stack.name}')."
                )

    return get_stack_component_command

generate_stack_component_list_command(component_type)

Generates a list command for the specific stack component type.

Parameters:

    component_type (StackComponentType): Type of the component to generate the command for. [required]

Returns:

    Callable[..., None]: A function that can be used as a click command.

Source code in src/zenml/cli/stack_components.py
def generate_stack_component_list_command(
    component_type: StackComponentType,
) -> Callable[..., None]:
    """Generates a `list` command for the specific stack component type.

    Args:
        component_type: Type of the component to generate the command for.

    Returns:
        A function that can be used as a `click` command.
    """

    @list_options(ComponentFilter)
    @click.pass_context
    def list_stack_components_command(
        ctx: click.Context, **kwargs: Any
    ) -> None:
        """Prints a table of stack components.

        Args:
            ctx: The click context object
            kwargs: Keyword arguments to filter the components.
        """
        client = Client()
        with console.status(f"Listing {component_type.plural}..."):
            kwargs["type"] = component_type
            components = client.list_stack_components(**kwargs)
            if not components:
                cli_utils.declare("No components found for the given filters.")
                return

            cli_utils.print_components_table(
                client=client,
                component_type=component_type,
                components=components.items,
                show_active=not is_sorted_or_filtered(ctx),
            )
            print_page_info(components)

    return list_stack_components_command

generate_stack_component_logs_command(component_type)

Generates a logs command for the specific stack component type.

Parameters:

    component_type (StackComponentType): Type of the component to generate the command for. [required]

Returns:

    Callable[[str, bool], None]: A function that can be used as a click command.

Source code in src/zenml/cli/stack_components.py
def generate_stack_component_logs_command(
    component_type: StackComponentType,
) -> Callable[[str, bool], None]:
    """Generates a `logs` command for the specific stack component type.

    Args:
        component_type: Type of the component to generate the command for.

    Returns:
        A function that can be used as a `click` command.
    """
    display_name = _component_display_name(component_type)

    @click.argument("name_id_or_prefix", type=str, required=False)
    @click.option(
        "--follow",
        "-f",
        is_flag=True,
        help="Follow the log file instead of just displaying the current logs.",
    )
    def stack_component_logs_command(
        name_id_or_prefix: str, follow: bool = False
    ) -> None:
        """Displays stack component logs.

        Args:
            name_id_or_prefix: The name of the stack component to display logs
                for.
            follow: Follow the log file instead of just displaying the current
                logs.
        """
        client = Client()

        with console.status(
            f"Fetching the logs for the {display_name} "
            f"'{name_id_or_prefix}'...\n"
        ):
            try:
                component_model = client.get_stack_component(
                    name_id_or_prefix=name_id_or_prefix,
                    component_type=component_type,
                )
            except KeyError as err:
                cli_utils.error(str(err))

            from zenml.stack import StackComponent

            component = StackComponent.from_model(
                component_model=component_model
            )
            log_file = component.log_file

            if not log_file or not fileio.exists(log_file):
                cli_utils.warning(
                    f"Unable to find log file for {display_name} "
                    f"'{name_id_or_prefix}'."
                )
                return

        if not log_file or not fileio.exists(log_file):
            cli_utils.warning(
                f"Unable to find log file for {display_name} "
                f"'{component.name}'."
            )
            return

        if follow:
            try:
                with open(log_file, "r") as f:
                    # seek to the end of the file
                    f.seek(0, 2)

                    while True:
                        line = f.readline()
                        if not line:
                            time.sleep(0.1)
                            continue
                        line = line.rstrip("\n")
                        click.echo(line)
            except KeyboardInterrupt:
                cli_utils.declare(f"Stopped following {display_name} logs.")
        else:
            with open(log_file, "r") as f:
                click.echo(f.read())

    return stack_component_logs_command

generate_stack_component_register_command(component_type)

Generates a register command for the specific stack component type.

Parameters:

    component_type (StackComponentType): Type of the component to generate the command for. [required]

Returns:

    Callable[[str, str, List[str]], None]: A function that can be used as a click command.

Source code in src/zenml/cli/stack_components.py
def generate_stack_component_register_command(
    component_type: StackComponentType,
) -> Callable[[str, str, List[str]], None]:
    """Generates a `register` command for the specific stack component type.

    Args:
        component_type: Type of the component to generate the command for.

    Returns:
        A function that can be used as a `click` command.
    """
    display_name = _component_display_name(component_type)

    @click.argument(
        "name",
        type=str,
    )
    @click.option(
        "--flavor",
        "-f",
        "flavor",
        help=f"The flavor of the {display_name} to register.",
        required=True,
        type=str,
    )
    @click.option(
        "--label",
        "-l",
        "labels",
        help="Labels to be associated with the component, in the form "
        "-l key1=value1 -l key2=value2.",
        multiple=True,
    )
    @click.option(
        "--connector",
        "-c",
        "connector",
        help="Use this flag to connect this stack component to a service connector.",
        type=str,
    )
    @click.option(
        "--resource-id",
        "-r",
        "resource_id",
        help="The resource ID to use with the connector. Only required for "
        "multi-instance connectors that are not already configured with a "
        "particular resource ID.",
        required=False,
        type=str,
    )
    @click.argument("args", nargs=-1, type=click.UNPROCESSED)
    def register_stack_component_command(
        name: str,
        flavor: str,
        args: List[str],
        labels: Optional[List[str]] = None,
        connector: Optional[str] = None,
        resource_id: Optional[str] = None,
    ) -> None:
        """Registers a stack component.

        Args:
            name: Name of the component to register.
            flavor: Flavor of the component to register.
            args: Additional arguments to pass to the component.
            labels: Labels to be associated with the component.
            connector: Name of the service connector to connect the component to.
            resource_id: The resource ID to use with the connector.
        """
        client = Client()

        # Parse the given args
        # name is guaranteed to be set by parse_name_and_extra_arguments
        name, parsed_args = cli_utils.parse_name_and_extra_arguments(  # type: ignore[assignment]
            list(args) + [name], expand_args=True
        )

        parsed_labels = cli_utils.get_parsed_labels(labels)

        if connector:
            try:
                client.get_service_connector(connector)
            except KeyError as err:
                cli_utils.error(
                    f"Could not find a connector '{connector}': {str(err)}"
                )

        with console.status(f"Registering {display_name} '{name}'...\n"):
            # Create a new stack component model
            component = client.create_stack_component(
                name=name,
                flavor=flavor,
                component_type=component_type,
                configuration=parsed_args,
                labels=parsed_labels,
            )

            cli_utils.declare(
                f"Successfully registered {component.type} `{component.name}`."
            )
            print_model_url(get_component_url(component))

        if connector:
            connect_stack_component_with_service_connector(
                component_type=component_type,
                name_id_or_prefix=name,
                connector=connector,
                interactive=False,
                no_verify=False,
                resource_id=resource_id,
            )

    return register_stack_component_command
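
The `args` handling above relies on ZenML's own helpers (`cli_utils.parse_name_and_extra_arguments` and `cli_utils.get_parsed_labels`). As a rough mental model only, the following simplified sketch shows the kind of `key=value` parsing involved; it is not the ZenML implementation and omits its edge-case handling.

```python
# Simplified illustration of key=value parsing, not the ZenML implementation.
from typing import Dict, List


def parse_key_value_args(args: List[str]) -> Dict[str, str]:
    """Turn arguments like ["path=s3://bucket", "region=eu-west-1"] into a dict."""
    parsed: Dict[str, str] = {}
    for arg in args:
        key, _, value = arg.partition("=")
        parsed[key.strip()] = value.strip()
    return parsed


print(parse_key_value_args(["path=s3://my-bucket", "region=eu-west-1"]))
# {'path': 's3://my-bucket', 'region': 'eu-west-1'}
```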

generate_stack_component_remove_attribute_command(component_type)

Generates a remove_attribute command for a specific stack component type.

Parameters:

    component_type (StackComponentType): Type of the component to generate the command for. [required]

Returns:

    Callable[[str, List[str]], None]: A function that can be used as a click command.

Source code in src/zenml/cli/stack_components.py
def generate_stack_component_remove_attribute_command(
    component_type: StackComponentType,
) -> Callable[[str, List[str]], None]:
    """Generates `remove_attribute` command for a specific stack component type.

    Args:
        component_type: Type of the component to generate the command for.

    Returns:
        A function that can be used as a `click` command.
    """
    display_name = _component_display_name(component_type)

    @click.argument(
        "name_id_or_prefix",
        type=str,
        required=True,
    )
    @click.option(
        "--label",
        "-l",
        "labels",
        help="Labels to be removed from the component.",
        multiple=True,
    )
    @click.argument("args", nargs=-1, type=click.UNPROCESSED)
    def remove_attribute_stack_component_command(
        name_id_or_prefix: str,
        args: List[str],
        labels: Optional[List[str]] = None,
    ) -> None:
        """Removes one or more attributes from a stack component.

        Args:
            name_id_or_prefix: The name of the stack component to remove the
                attribute from.
            args: Additional arguments to pass to the remove_attribute command.
            labels: Labels to be removed from the component.
        """
        client = Client()

        with console.status(
            f"Updating {display_name} '{name_id_or_prefix}'...\n"
        ):
            try:
                updated_component = client.update_stack_component(
                    name_id_or_prefix=name_id_or_prefix,
                    component_type=component_type,
                    configuration={k: None for k in args},
                    labels={k: None for k in labels} if labels else None,
                )
            except (KeyError, IllegalOperationError) as err:
                cli_utils.error(str(err))

            cli_utils.declare(
                f"Successfully updated {display_name} `{name_id_or_prefix}`."
            )
            print_model_url(get_component_url(updated_component))

    return remove_attribute_stack_component_command

generate_stack_component_rename_command(component_type)

Generates a rename command for the specific stack component type.

Parameters:

    component_type (StackComponentType): Type of the component to generate the command for. [required]

Returns:

    Callable[[str, str], None]: A function that can be used as a click command.

Source code in src/zenml/cli/stack_components.py
def generate_stack_component_rename_command(
    component_type: StackComponentType,
) -> Callable[[str, str], None]:
    """Generates a `rename` command for the specific stack component type.

    Args:
        component_type: Type of the component to generate the command for.

    Returns:
        A function that can be used as a `click` command.
    """
    display_name = _component_display_name(component_type)

    @click.argument(
        "name_id_or_prefix",
        type=str,
        required=True,
    )
    @click.argument(
        "new_name",
        type=str,
        required=True,
    )
    def rename_stack_component_command(
        name_id_or_prefix: str, new_name: str
    ) -> None:
        """Rename a stack component.

        Args:
            name_id_or_prefix: The name of the stack component to rename.
            new_name: The new name of the stack component.
        """
        client = Client()

        with console.status(
            f"Renaming {display_name} '{name_id_or_prefix}'...\n"
        ):
            try:
                updated_component = client.update_stack_component(
                    name_id_or_prefix=name_id_or_prefix,
                    component_type=component_type,
                    name=new_name,
                )
            except (KeyError, IllegalOperationError) as err:
                cli_utils.error(str(err))

            cli_utils.declare(
                f"Successfully renamed {display_name} `{name_id_or_prefix}` to"
                f" `{new_name}`."
            )
            print_model_url(get_component_url(updated_component))

    return rename_stack_component_command

generate_stack_component_update_command(component_type)

Generates an update command for the specific stack component type.

Parameters:

    component_type (StackComponentType): Type of the component to generate the command for. [required]

Returns:

    Callable[[str, List[str]], None]: A function that can be used as a click command.

Source code in src/zenml/cli/stack_components.py
def generate_stack_component_update_command(
    component_type: StackComponentType,
) -> Callable[[str, List[str]], None]:
    """Generates an `update` command for the specific stack component type.

    Args:
        component_type: Type of the component to generate the command for.

    Returns:
        A function that can be used as a `click` command.
    """
    display_name = _component_display_name(component_type)

    @click.argument(
        "name_id_or_prefix",
        type=str,
        required=False,
    )
    @click.option(
        "--label",
        "-l",
        "labels",
        help="Labels to be associated with the component, in the form "
        "-l key1=value1 -l key2=value2.",
        multiple=True,
    )
    @click.argument("args", nargs=-1, type=click.UNPROCESSED)
    def update_stack_component_command(
        name_id_or_prefix: Optional[str],
        args: List[str],
        labels: Optional[List[str]] = None,
    ) -> None:
        """Updates a stack component.

        Args:
            name_id_or_prefix: The name or id of the stack component to update.
            args: Additional arguments to pass to the update command.
            labels: Labels to be associated with the component.
        """
        client = Client()

        # Parse the given args
        args = list(args)
        if name_id_or_prefix:
            args.append(name_id_or_prefix)

        name_or_id, parsed_args = cli_utils.parse_name_and_extra_arguments(
            args,
            expand_args=True,
            name_mandatory=False,
        )

        parsed_labels = cli_utils.get_parsed_labels(labels)

        with console.status(f"Updating {display_name}...\n"):
            try:
                updated_component = client.update_stack_component(
                    name_id_or_prefix=name_or_id,
                    component_type=component_type,
                    configuration=parsed_args,
                    labels=parsed_labels,
                )
            except KeyError as err:
                cli_utils.error(str(err))

            cli_utils.declare(
                f"Successfully updated {display_name} "
                f"`{updated_component.name}`."
            )
            print_model_url(get_component_url(updated_component))

    return update_stack_component_command

get_active_stack()

Gets the active stack.

Source code in src/zenml/cli/stack.py
@stack.command("get")
def get_active_stack() -> None:
    """Gets the active stack."""
    scope = "repository" if Client().uses_local_configuration else "global"

    with console.status("Getting the active stack..."):
        client = Client()
        try:
            cli_utils.declare(
                f"The {scope} active stack is: '{client.active_stack_model.name}'"
            )
        except KeyError as err:
            cli_utils.error(str(err))
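
As a hedged example, the command can also be exercised programmatically with click's test runner, assuming a ZenML client is already configured on the machine; the output shown in the comment is only an example.

```python
# Sketch: invoking the click command with click's test runner.
from click.testing import CliRunner

from zenml.cli.stack import get_active_stack

result = CliRunner().invoke(get_active_stack)
print(result.output)  # e.g. "The repository active stack is: 'default'"
```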

get_component_url(component)

Function to get the dashboard URL of a given component model.

Parameters:

    component (ComponentResponse): The response model of the given component. [required]

Returns:

    Optional[str]: The URL to the component if the dashboard is available, else None.

Source code in src/zenml/utils/dashboard_utils.py
def get_component_url(component: ComponentResponse) -> Optional[str]:
    """Function to get the dashboard URL of a given component model.

    Args:
        component: the response model of the given component.

    Returns:
        the URL to the component if the dashboard is available, else None.
    """
    base_url = get_server_dashboard_url()

    if base_url:
        return base_url + constants.STACKS

    return None

get_environment()

Returns a string representing the execution environment of the pipeline.

Returns:

    str: The execution environment.

Source code in src/zenml/environment.py
def get_environment() -> str:
    """Returns a string representing the execution environment of the pipeline.

    Returns:
        str: the execution environment
    """
    # Order is important here
    if Environment.in_kubernetes():
        return EnvironmentType.KUBERNETES
    elif Environment.in_github_actions():
        return EnvironmentType.GITHUB_ACTION
    elif Environment.in_gitlab_ci():
        return EnvironmentType.GITLAB_CI
    elif Environment.in_circle_ci():
        return EnvironmentType.CIRCLE_CI
    elif Environment.in_bitbucket_ci():
        return EnvironmentType.BITBUCKET_CI
    elif Environment.in_ci():
        return EnvironmentType.GENERIC_CI
    elif Environment.in_github_codespaces():
        return EnvironmentType.GITHUB_CODESPACES
    elif Environment.in_vscode_remote_container():
        return EnvironmentType.VSCODE_REMOTE_CONTAINER
    elif Environment.in_lightning_ai_studio():
        return EnvironmentType.LIGHTNING_AI_STUDIO
    elif Environment.in_docker():
        return EnvironmentType.DOCKER
    elif Environment.in_container():
        return EnvironmentType.CONTAINER
    elif Environment.in_google_colab():
        return EnvironmentType.COLAB
    elif Environment.in_paperspace_gradient():
        return EnvironmentType.PAPERSPACE
    elif Environment.in_notebook():
        return EnvironmentType.NOTEBOOK
    elif Environment.in_wsl():
        return EnvironmentType.WSL
    else:
        return EnvironmentType.NATIVE
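
A quick usage sketch; the exact value returned depends on where the code runs.

```python
from zenml.environment import get_environment

# Returns one of the EnvironmentType values; outside containers, CI systems
# and notebooks this is typically EnvironmentType.NATIVE.
print(get_environment())
```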

get_global_config_directory()

Gets the global config directory for ZenML.

Returns:

    str: The global config directory for ZenML.

Source code in src/zenml/utils/io_utils.py
def get_global_config_directory() -> str:
    """Gets the global config directory for ZenML.

    Returns:
        The global config directory for ZenML.
    """
    env_var_path = os.getenv(ENV_ZENML_CONFIG_PATH)
    if env_var_path:
        return str(Path(env_var_path).resolve())
    return click.get_app_dir(APP_NAME)
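
For example (paths differ per operating system and per the config-path environment variable checked above):

```python
from zenml.utils.io_utils import get_global_config_directory

# Without the ZenML config-path environment variable set, this falls back to
# click.get_app_dir(APP_NAME), e.g. ~/.config/zenml on Linux (exact path varies).
print(get_global_config_directory())
```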

get_local_server()

Get the active local server.

Call this function to retrieve the local server deployed on this machine.

Returns:

    Optional[LocalServerDeployment]: The local server deployment, or None if no
        local server deployment was found.

Source code in src/zenml/zen_server/utils.py
def get_local_server() -> Optional["LocalServerDeployment"]:
    """Get the active local server.

    Call this function to retrieve the local server deployed on this machine.

    Returns:
        The local server deployment or None, if no local server deployment was
        found.
    """
    from zenml.zen_server.deploy.deployer import LocalServerDeployer

    deployer = LocalServerDeployer()
    try:
        return deployer.get_server()
    except ServerDeploymentNotFoundError:
        return None
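
A minimal usage sketch:

```python
from zenml.zen_server.utils import get_local_server

server = get_local_server()
if server is None:
    print("No local ZenML server deployment found on this machine.")
else:
    print("Found a local server deployment.")
```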

get_logger(logger_name)

Main function to get a logger by name.

Parameters:

    logger_name (str): Name of logger to initialize. [required]

Returns:

    Logger: A logger object.

Source code in src/zenml/logger.py
def get_logger(logger_name: str) -> logging.Logger:
    """Main function to get logger name,.

    Args:
        logger_name: Name of logger to initialize.

    Returns:
        A logger object.
    """
    logger = logging.getLogger(logger_name)
    logger.setLevel(get_logging_level().value)
    logger.addHandler(get_console_handler())

    logger.propagate = False
    return logger
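
Typical usage at module level:

```python
from zenml.logger import get_logger

logger = get_logger(__name__)
logger.info("Pipeline step started.")
```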

get_requirements(integration_name=None)

List all requirements for the chosen integration.

Parameters:

    integration_name (Optional[str]): The name of the integration to list the requirements for. [default: None]
Source code in src/zenml/cli/integration.py
@integration.command(
    name="requirements", help="List all requirements for an integration."
)
@click.argument("integration_name", required=False, default=None)
def get_requirements(integration_name: Optional[str] = None) -> None:
    """List all requirements for the chosen integration.

    Args:
        integration_name: The name of the integration to list the requirements
            for.
    """
    from zenml.integrations.registry import integration_registry

    try:
        requirements = integration_registry.select_integration_requirements(
            integration_name
        )
    except KeyError as e:
        error(str(e))
    else:
        if requirements:
            title(
                f"Requirements for {integration_name or 'all integrations'}:\n"
            )
            declare(f"{requirements}")
            warning(
                "\n" + "To install the dependencies of a "
                "specific integration, type: "
            )
            warning("zenml integration install INTEGRATION_NAME")

get_resources_options_from_resource_model_for_full_stack(connector_details)

Get the resource options from the resource model for the full stack.

Parameters:

    connector_details (Union[UUID, ServiceConnectorInfo]): The service connector details (UUID or Info). [required]

Returns:

    ServiceConnectorResourcesInfo: All available service connector resource options.

Source code in src/zenml/service_connectors/service_connector_utils.py
def get_resources_options_from_resource_model_for_full_stack(
    connector_details: Union[UUID, ServiceConnectorInfo],
) -> ServiceConnectorResourcesInfo:
    """Get the resource options from the resource model for the full stack.

    Args:
        connector_details: The service connector details (UUID or Info).

    Returns:
        All available service connector resource options.
    """
    client = Client()
    zen_store = client.zen_store

    if isinstance(connector_details, UUID):
        resource_model = zen_store.verify_service_connector(
            connector_details,
            list_resources=True,
        )
    else:
        resource_model = zen_store.verify_service_connector_config(
            service_connector=ServiceConnectorRequest(
                user=client.active_user.id,
                workspace=client.active_workspace.id,
                name="fake",
                connector_type=connector_details.type,
                auth_method=connector_details.auth_method,
                configuration=connector_details.configuration,
                secrets={},
                labels={},
            ),
            list_resources=True,
        )

    resources = resource_model.resources

    if isinstance(
        resource_model.connector_type,
        str,
    ):
        connector_type = resource_model.connector_type
    else:
        connector_type = resource_model.connector_type.connector_type

    artifact_stores: List[ResourcesInfo] = []
    orchestrators: List[ResourcesInfo] = []
    container_registries: List[ResourcesInfo] = []

    if connector_type == "aws":
        for each in resources:
            if each.resource_ids:
                if each.resource_type == "s3-bucket":
                    artifact_stores.append(
                        _prepare_resource_info(
                            connector_details=connector_details,
                            resource_ids=each.resource_ids,
                            stack_component_type=StackComponentType.ARTIFACT_STORE,
                            flavor="s3",
                            required_configuration={"path": "Path"},
                            use_resource_value_as_fixed_config=True,
                            flavor_display_name="S3 Bucket",
                        )
                    )
                if each.resource_type == "aws-generic":
                    orchestrators.append(
                        _prepare_resource_info(
                            connector_details=connector_details,
                            resource_ids=each.resource_ids,
                            stack_component_type=StackComponentType.ORCHESTRATOR,
                            flavor="sagemaker",
                            required_configuration={
                                "execution_role": "execution role ARN"
                            },
                            flavor_display_name="AWS Sagemaker",
                        )
                    )
                    orchestrators.append(
                        _prepare_resource_info(
                            connector_details=connector_details,
                            resource_ids=each.resource_ids,
                            stack_component_type=StackComponentType.ORCHESTRATOR,
                            flavor="vm_aws",
                            required_configuration={"region": "region"},
                            use_resource_value_as_fixed_config=True,
                            flavor_display_name="Skypilot (EC2)",
                        )
                    )

                if each.resource_type == "kubernetes-cluster":
                    orchestrators.append(
                        _prepare_resource_info(
                            connector_details=connector_details,
                            resource_ids=each.resource_ids,
                            stack_component_type=StackComponentType.ORCHESTRATOR,
                            flavor="kubernetes",
                            required_configuration={},
                            flavor_display_name="Kubernetes",
                        )
                    )
                if each.resource_type == "docker-registry":
                    container_registries.append(
                        _prepare_resource_info(
                            connector_details=connector_details,
                            resource_ids=each.resource_ids,
                            stack_component_type=StackComponentType.CONTAINER_REGISTRY,
                            flavor="aws",
                            required_configuration={"uri": "URI"},
                            use_resource_value_as_fixed_config=True,
                            flavor_display_name="ECR",
                        )
                    )

    elif connector_type == "gcp":
        for each in resources:
            if each.resource_ids:
                if each.resource_type == "gcs-bucket":
                    artifact_stores.append(
                        _prepare_resource_info(
                            connector_details=connector_details,
                            resource_ids=each.resource_ids,
                            stack_component_type=StackComponentType.ARTIFACT_STORE,
                            flavor="gcp",
                            required_configuration={"path": "Path"},
                            use_resource_value_as_fixed_config=True,
                            flavor_display_name="GCS Bucket",
                        )
                    )
                if each.resource_type == "gcp-generic":
                    orchestrators.append(
                        _prepare_resource_info(
                            connector_details=connector_details,
                            resource_ids=each.resource_ids,
                            stack_component_type=StackComponentType.ORCHESTRATOR,
                            flavor="vertex",
                            required_configuration={"location": "region name"},
                            flavor_display_name="Vertex AI",
                        )
                    )
                    orchestrators.append(
                        _prepare_resource_info(
                            connector_details=connector_details,
                            resource_ids=each.resource_ids,
                            stack_component_type=StackComponentType.ORCHESTRATOR,
                            flavor="vm_gcp",
                            required_configuration={"region": "region name"},
                            flavor_display_name="Skypilot (Compute)",
                        )
                    )

                if each.resource_type == "kubernetes-cluster":
                    orchestrators.append(
                        _prepare_resource_info(
                            connector_details=connector_details,
                            resource_ids=each.resource_ids,
                            stack_component_type=StackComponentType.ORCHESTRATOR,
                            flavor="kubernetes",
                            required_configuration={},
                            flavor_display_name="Kubernetes",
                        )
                    )
                if each.resource_type == "docker-registry":
                    container_registries.append(
                        _prepare_resource_info(
                            connector_details=connector_details,
                            resource_ids=each.resource_ids,
                            stack_component_type=StackComponentType.CONTAINER_REGISTRY,
                            flavor="gcp",
                            required_configuration={"uri": "URI"},
                            use_resource_value_as_fixed_config=True,
                            flavor_display_name="GCR",
                        )
                    )

    elif connector_type == "azure":
        for each in resources:
            if each.resource_ids:
                if each.resource_type == "blob-container":
                    artifact_stores.append(
                        _prepare_resource_info(
                            connector_details=connector_details,
                            resource_ids=each.resource_ids,
                            stack_component_type=StackComponentType.ARTIFACT_STORE,
                            flavor="azure",
                            required_configuration={"path": "Path"},
                            use_resource_value_as_fixed_config=True,
                            flavor_display_name="Blob container",
                        )
                    )
                if each.resource_type == "azure-generic":
                    # No native orchestrator ATM
                    orchestrators.append(
                        _prepare_resource_info(
                            connector_details=connector_details,
                            resource_ids=each.resource_ids,
                            stack_component_type=StackComponentType.ORCHESTRATOR,
                            flavor="vm_azure",
                            required_configuration={"region": "region name"},
                            flavor_display_name="Skypilot (VM)",
                        )
                    )
                    orchestrators.append(
                        _prepare_resource_info(
                            connector_details=connector_details,
                            resource_ids=each.resource_ids,
                            stack_component_type=StackComponentType.ORCHESTRATOR,
                            flavor="azureml",
                            required_configuration={
                                "subscription_id": "subscription ID",
                                "resource_group": "resource group",
                                "workspace": "workspace",
                            },
                            flavor_display_name="AzureML",
                        )
                    )

                if each.resource_type == "kubernetes-cluster":
                    orchestrators.append(
                        _prepare_resource_info(
                            connector_details=connector_details,
                            resource_ids=each.resource_ids,
                            stack_component_type=StackComponentType.ORCHESTRATOR,
                            flavor="kubernetes",
                            required_configuration={},
                            flavor_display_name="Kubernetes",
                        )
                    )
                if each.resource_type == "docker-registry":
                    container_registries.append(
                        _prepare_resource_info(
                            connector_details=connector_details,
                            resource_ids=each.resource_ids,
                            stack_component_type=StackComponentType.CONTAINER_REGISTRY,
                            flavor="azure",
                            required_configuration={"uri": "URI"},
                            use_resource_value_as_fixed_config=True,
                            flavor_display_name="ACR",
                        )
                    )

    _raise_specific_cloud_exception_if_needed(
        cloud_provider=connector_type,
        artifact_stores=artifact_stores,
        orchestrators=orchestrators,
        container_registries=container_registries,
    )

    return ServiceConnectorResourcesInfo(
        connector_type=connector_type,
        components_resources_info={
            StackComponentType.ARTIFACT_STORE: artifact_stores,
            StackComponentType.ORCHESTRATOR: orchestrators,
            StackComponentType.CONTAINER_REGISTRY: container_registries,
        },
    )

get_secret(name_id_or_prefix, scope=None)

Get a secret and print it to the console.

Parameters:

    name_id_or_prefix (str): The name of the secret to get. Required.
    scope (Optional[str]): The scope of the secret to get. Default: None.

Source code in src/zenml/cli/secret.py, lines 197-223:
@secret.command("get", help="Get a secret with a given name, prefix or id.")
@click.argument(
    "name_id_or_prefix",
    type=click.STRING,
)
@click.option(
    "--scope",
    "-s",
    type=click.Choice([scope.value for scope in list(SecretScope)]),
    default=None,
)
def get_secret(name_id_or_prefix: str, scope: Optional[str] = None) -> None:
    """Get a secret and print it to the console.

    Args:
        name_id_or_prefix: The name of the secret to get.
        scope: The scope of the secret to get.
    """
    secret = _get_secret(name_id_or_prefix, scope)
    declare(
        f"Fetched secret with name `{secret.name}` and ID `{secret.id}` in "
        f"scope `{secret.scope.value}`:"
    )
    if not secret.secret_values:
        warning(f"Secret with name `{name_id_or_prefix}` is empty.")
    else:
        pretty_print_secret(secret.secret_values, hide_secret=False)
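
For example, a secret can be fetched by name and scope from the CLI (the secret name `my_secret` and the `user` scope are placeholders for illustration):

   zenml secret get my_secret --scope user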

get_stack_url(stack)

Function to get the dashboard URL of a given stack model.

Parameters:

    stack (StackResponse): the response model of the given stack. Required.

Returns:

    Optional[str]: the URL to the stack if the dashboard is available, else None.

Source code in src/zenml/utils/dashboard_utils.py, lines 73-87:
def get_stack_url(stack: StackResponse) -> Optional[str]:
    """Function to get the dashboard URL of a given stack model.

    Args:
        stack: the response model of the given stack.

    Returns:
        the URL to the stack if the dashboard is available, else None.
    """
    base_url = get_server_dashboard_url()

    if base_url:
        return base_url + constants.STACKS

    return None
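
A minimal sketch of calling this helper directly, assuming an initialized client with an active stack:

   from zenml.client import Client
   from zenml.utils.dashboard_utils import get_stack_url

   url = get_stack_url(Client().active_stack_model)
   print(url)  # None if no dashboard URL is available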

go()

Quickly explore ZenML with this walk-through.

Raises:

    GitNotFoundError: If git is not installed.
    e: When Jupyter Notebook fails to launch.

Source code in src/zenml/cli/base.py, lines 404-504:
@cli.command("go")
def go() -> None:
    """Quickly explore ZenML with this walk-through.

    Raises:
        GitNotFoundError: If git is not installed.
        e: when Jupyter Notebook fails to launch.
    """
    from zenml.cli.text_utils import (
        zenml_cli_privacy_message,
        zenml_cli_welcome_message,
        zenml_go_notebook_tutorial_message,
    )

    metadata = {}

    console.print(zenml_cli_welcome_message, width=80)

    client = Client()

    # Only ask them if they haven't been asked before and the email
    # hasn't been supplied by other means
    if (
        not GlobalConfiguration().user_email
        and client.active_user.email_opted_in is None
    ):
        gave_email = _prompt_email(AnalyticsEventSource.ZENML_GO)
        metadata = {"gave_email": gave_email}

    zenml_tutorial_path = os.path.join(os.getcwd(), "zenml_tutorial")

    if not is_jupyter_installed():
        cli_utils.error(
            "Jupyter Notebook or JupyterLab is not installed. "
            "Please install the 'notebook' package with `pip` "
            "first so you can run the tutorial notebooks."
        )

    with track_handler(event=AnalyticsEvent.RUN_ZENML_GO, metadata=metadata):
        console.print(zenml_cli_privacy_message, width=80)

        if not os.path.isdir(zenml_tutorial_path):
            try:
                from git.repo.base import Repo
            except ImportError as e:
                cli_utils.error(
                    "At this point we would want to clone our tutorial repo "
                    "onto your machine to let you dive right into our code. "
                    "However, this machine has no installation of Git. Feel "
                    "free to install git and rerun this command. Alternatively "
                    "you can also download the repo manually here: "
                    f"{TUTORIAL_REPO}. The tutorial is in the "
                    f"'examples/quickstart/notebooks' directory."
                )
                raise GitNotFoundError(e)

            with tempfile.TemporaryDirectory() as tmpdirname:
                tmp_cloned_dir = os.path.join(tmpdirname, "zenml_repo")
                with console.status(
                    "Cloning tutorial. This sometimes takes a minute..."
                ):
                    Repo.clone_from(
                        TUTORIAL_REPO,
                        tmp_cloned_dir,
                        branch=f"release/{zenml_version}",
                        depth=1,  # to prevent timeouts when downloading
                    )
                example_dir = os.path.join(
                    tmp_cloned_dir, "examples/quickstart"
                )
                copy_dir(example_dir, zenml_tutorial_path)
        else:
            cli_utils.warning(
                f"{zenml_tutorial_path} already exists! Continuing without "
                "cloning."
            )

        # get list of all .ipynb files in zenml_tutorial_path
        ipynb_files = []
        for dirpath, _, filenames in os.walk(zenml_tutorial_path):
            for filename in filenames:
                if filename.endswith(".ipynb"):
                    ipynb_files.append(os.path.join(dirpath, filename))

        ipynb_files.sort()
        console.print(
            zenml_go_notebook_tutorial_message(ipynb_files), width=80
        )
        input("Press ENTER to continue...")

    try:
        subprocess.check_call(
            ["jupyter", "notebook", "--ContentsManager.allow_hidden=True"],
            cwd=zenml_tutorial_path,
        )
    except subprocess.CalledProcessError as e:
        cli_utils.error(
            "An error occurred while launching Jupyter Notebook. "
            "Please make sure Jupyter is properly installed and try again."
        )
        raise e

import_stack(stack_name, filename, ignore_version_mismatch=False)

Import a stack from YAML.

Parameters:

    stack_name (str): The name of the stack to import. Required.
    filename (Optional[str]): The filename to import the stack from. Required.
    ignore_version_mismatch (bool): Import stack components even if the installed version of ZenML is different from the one specified in the stack YAML file. Default: False.

Source code in src/zenml/cli/stack.py, lines 1176-1260:
@stack.command("import", help="Import a stack from YAML.")
@click.argument("stack_name", type=str, required=True)
@click.option("--filename", "-f", type=str, required=False)
@click.option(
    "--ignore-version-mismatch",
    is_flag=True,
    help="Import stack components even if the installed version of ZenML "
    "is different from the one specified in the stack YAML file",
)
def import_stack(
    stack_name: str,
    filename: Optional[str],
    ignore_version_mismatch: bool = False,
) -> None:
    """Import a stack from YAML.

    Args:
        stack_name: The name of the stack to import.
        filename: The filename to import the stack from.
        ignore_version_mismatch: Import stack components even if
            the installed version of ZenML is different from the
            one specified in the stack YAML file.
    """
    # handle 'zenml stack import file.yaml' calls
    if stack_name.endswith(".yaml") and filename is None:
        filename = stack_name
        data = read_yaml(filename)
        stack_name = data["stack_name"]  # read stack_name from export

    # standard 'zenml stack import stack_name [file.yaml]' calls
    else:
        # if filename is not given, assume default export name
        # "<stack_name>.yaml"
        if filename is None:
            filename = stack_name + ".yaml"
        data = read_yaml(filename)
        cli_utils.declare(
            f"Using '{filename}' to import '{stack_name}' stack."
        )

    # assert zenml version is the same if force is false
    if data["zenml_version"] != zenml.__version__:
        if ignore_version_mismatch:
            cli_utils.warning(
                f"The stack that will be installed is using ZenML version "
                f"{data['zenml_version']}. You have version "
                f"{zenml.__version__} installed. Some components might not "
                "work as expected."
            )
        else:
            cli_utils.error(
                f"Cannot import stacks from other ZenML versions. "
                f"The stack was created using ZenML version "
                f"{data['zenml_version']}, you have version "
                f"{zenml.__version__} installed. You can "
                "retry using the `--ignore-version-mismatch` "
                "flag. However, be aware that this might "
                "fail or lead to other unexpected behavior."
            )

    # ask user for a new stack_name if current one already exists
    client = Client()
    if client.list_stacks(name=stack_name):
        stack_name = click.prompt(
            f"Stack `{stack_name}` already exists. Please choose a different "
            f"name",
            type=str,
        )

    # import stack components
    component_ids = {}
    for component_type_str, component_config in data["components"].items():
        component_type = StackComponentType(component_type_str)

        component_id = _import_stack_component(
            component_type=component_type,
            component_dict=component_config,
        )
        component_ids[component_type] = component_id

    imported_stack = Client().create_stack(
        name=stack_name, components=component_ids
    )

    print_model_url(get_stack_url(imported_stack))
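
For example, assuming a stack was previously exported to `my_stack.yaml` (the stack and file names are placeholders), either form works:

   zenml stack import my_stack -f my_stack.yaml
   zenml stack import my_stack.yaml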

info(packages, all=False, file='', stack=False)

Show information about the current user setup.

Parameters:

    packages (Tuple[str]): List of packages to show information about. Required.
    all (bool): Flag to show information about all installed packages. Default: False.
    file (str): Flag to output to a file. Default: ''.
    stack (bool): Flag to output information about active stack and components. Default: False.

Source code in src/zenml/cli/base.py, lines 559-667:
@cli.command(
    "info", help="Show information about the current user setup.", hidden=True
)
@click.option(
    "--all",
    "-a",
    is_flag=True,
    default=False,
    help="Output information about all installed packages.",
    type=bool,
)
@click.option(
    "--file",
    "-f",
    default="",
    help="Path to export to a .yaml file.",
    type=click.Path(exists=False, dir_okay=False),
)
@click.option(
    "--packages",
    "-p",
    multiple=True,
    help="Select specific installed packages.",
    type=str,
)
@click.option(
    "--stack",
    "-s",
    is_flag=True,
    default=False,
    help="Output information about active stack and components.",
    type=bool,
)
def info(
    packages: Tuple[str],
    all: bool = False,
    file: str = "",
    stack: bool = False,
) -> None:
    """Show information about the current user setup.

    Args:
        packages: List of packages to show information about.
        all: Flag to show information about all installed packages.
        file: Flag to output to a file.
        stack: Flag to output information about active stack and components
    """
    gc = GlobalConfiguration()
    environment = Environment()
    client = Client()
    store_info = client.zen_store.get_store_info()

    store_cfg = gc.store_configuration

    user_info = {
        "zenml_local_version": zenml_version,
        "zenml_server_version": store_info.version,
        "zenml_server_database": str(store_info.database_type),
        "zenml_server_deployment_type": str(store_info.deployment_type),
        "zenml_config_dir": gc.config_directory,
        "zenml_local_store_dir": gc.local_stores_path,
        "zenml_server_url": store_cfg.url,
        "zenml_active_repository_root": str(client.root),
        "python_version": environment.python_version(),
        "environment": get_environment(),
        "system_info": environment.get_system_info(),
        "active_workspace": client.active_workspace.name,
        "active_stack": client.active_stack_model.name,
        "active_user": client.active_user.name,
        "telemetry_status": "enabled" if gc.analytics_opt_in else "disabled",
        "analytics_client_id": str(gc.user_id),
        "analytics_user_id": str(client.active_user.id),
        "analytics_server_id": str(client.zen_store.get_store_info().id),
        "integrations": integration_registry.get_installed_integrations(),
        "packages": {},
        "query_packages": {},
    }

    if all:
        user_info["packages"] = cli_utils.get_package_information()
    if packages:
        if user_info.get("packages"):
            if isinstance(user_info["packages"], dict):
                user_info["query_packages"] = {
                    p: v
                    for p, v in user_info["packages"].items()
                    if p in packages
                }
        else:
            user_info["query_packages"] = cli_utils.get_package_information(
                list(packages)
            )
    if file:
        file_write_path = os.path.abspath(file)
        write_yaml(file, user_info)
        declare(f"Wrote user debug info to file at '{file_write_path}'.")
    else:
        cli_utils.print_user_info(user_info)

    if stack:
        try:
            cli_utils.print_debug_stack()
        except ModuleNotFoundError as e:
            cli_utils.warning(
                "Could not print debug stack information. Please make sure "
                "you have the necessary dependencies and integrations "
                "installed for all your stack components."
            )
            cli_utils.warning(f"The missing package is: '{e.name}'")
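
For example, to print the full report including all installed packages and the active stack, and also write it to a file (the file name is just an illustration):

   zenml info -a -s -f zenml_debug.yaml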

init(path, template=None, template_tag=None, template_with_defaults=False, test=False)

Initialize ZenML on given path.

Parameters:

    path (Optional[Path]): Path to the repository. Required.
    template (Optional[str]): Optional name or URL of the ZenML project template to use to initialize the repository. Can be a string like e2e_batch, nlp, starter or a copier URL like gh:owner/repo_name. If not specified, no template is used. Default: None.
    template_tag (Optional[str]): Optional tag of the ZenML project template to use to initialize the repository. If template is a pre-defined template, then this is ignored. Default: None.
    template_with_defaults (bool): Whether to use default parameters of the ZenML project template. Default: False.
    test (bool): Whether to skip interactivity when testing. Default: False.

Source code in src/zenml/cli/base.py, lines 99-268:
@cli.command("init", help="Initialize a ZenML repository.")
@click.option(
    "--path",
    type=click.Path(
        exists=True, file_okay=False, dir_okay=True, path_type=Path
    ),
)
@click.option(
    "--template",
    type=str,
    required=False,
    help="Name or URL of the ZenML project template to use to initialize the "
    "repository, Can be a string like `e2e_batch`, `nlp`, `llm_finetuning`, "
    "`starter` etc. or a copier URL like gh:owner/repo_name. If not specified, "
    "no template is used.",
)
@click.option(
    "--template-tag",
    type=str,
    required=False,
    help="Optional tag of the ZenML project template to use to initialize the "
    "repository.",
)
@click.option(
    "--template-with-defaults",
    is_flag=True,
    default=False,
    required=False,
    help="Whether to use default parameters of the ZenML project template",
)
@click.option(
    "--test",
    is_flag=True,
    default=False,
    help="To skip interactivity when testing.",
    hidden=True,
)
def init(
    path: Optional[Path],
    template: Optional[str] = None,
    template_tag: Optional[str] = None,
    template_with_defaults: bool = False,
    test: bool = False,
) -> None:
    """Initialize ZenML on given path.

    Args:
        path: Path to the repository.
        template: Optional name or URL of the ZenML project template to use to
            initialize the repository. Can be a string like `e2e_batch`,
            `nlp`, `starter` or a copier URL like `gh:owner/repo_name`. If
            not specified, no template is used.
        template_tag: Optional tag of the ZenML project template to use to
            initialize the repository. If template is a pre-defined template,
            then this is ignored.
        template_with_defaults: Whether to use default parameters of
            the ZenML project template
        test: Whether to skip interactivity when testing.
    """
    if path is None:
        path = Path.cwd()

    os.environ[ENV_ZENML_ENABLE_REPO_INIT_WARNINGS] = "False"

    if template:
        try:
            from copier import Worker
        except ImportError:
            error(
                "You need to install the ZenML project template requirements "
                "to use templates. Please run `pip install zenml[templates]` "
                "and try again."
            )
            return

        from zenml.cli.text_utils import (
            zenml_cli_privacy_message,
            zenml_cli_welcome_message,
        )

        console.print(zenml_cli_welcome_message, width=80)

        client = Client()
        # Only ask them if they haven't been asked before and the email
        # hasn't been supplied by other means
        if (
            not GlobalConfiguration().user_email
            and client.active_user.email_opted_in is None
            and not test
        ):
            _prompt_email(AnalyticsEventSource.ZENML_INIT)

        email = GlobalConfiguration().user_email or ""
        metadata = {
            "email": email,
            "template": template,
            "prompt": not template_with_defaults,
        }

        with track_handler(
            event=AnalyticsEvent.GENERATE_TEMPLATE,
            metadata=metadata,
        ):
            console.print(zenml_cli_privacy_message, width=80)

            if not template_with_defaults:
                from rich.markdown import Markdown

                prompt_message = Markdown(
                    """
## 🧑‍🏫 Project template parameters
"""
                )

                console.print(prompt_message, width=80)

            # Check if template is a URL or a preset template name
            vcs_ref: Optional[str] = None
            if template in ZENML_PROJECT_TEMPLATES:
                declare(f"Using the {template} template...")
                zenml_project_template = ZENML_PROJECT_TEMPLATES[template]
                src_path = zenml_project_template.copier_github_url
                # template_tag is ignored in this case
                vcs_ref = zenml_project_template.github_tag
            else:
                declare(
                    f"List of known templates is: {', '.join(ZENML_PROJECT_TEMPLATES.keys())}"
                )
                declare(
                    f"No known templates specified. Using `{template}` as URL."
                    "If this is not a valid copier template URL, this will "
                    "fail."
                )

                src_path = template
                vcs_ref = template_tag

            with Worker(
                src_path=src_path,
                vcs_ref=vcs_ref,
                dst_path=path,
                data=dict(
                    email=email,
                    template=template,
                ),
                defaults=template_with_defaults,
                user_defaults=dict(
                    email=email,
                ),
                overwrite=template_with_defaults,
                unsafe=True,
            ) as worker:
                worker.run_copy()

    with console.status(f"Initializing ZenML repository at {path}.\n"):
        try:
            Client.initialize(root=path)
            declare(f"ZenML repository initialized at {path}.")
        except InitializationException as e:
            declare(f"{e}")
            return

    declare(
        f"The local active stack was initialized to "
        f"'{Client().active_stack_model.name}'. This local configuration "
        f"will only take effect when you're running ZenML from the initialized "
        f"repository root, or from a subdirectory. For more information on "
        f"repositories and configurations, please visit "
        f"https://docs.zenml.io/user-guide/production-guide/understand-stacks."
    )
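
When initializing from a custom copier template, the tag can be pinned explicitly, for example (the URL and tag are placeholders):

   zenml init --template gh:owner/repo_name --template-tag v1.0.0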

install(integrations, ignore_integration, force=False, uv=False)

Installs the required packages for a given integration.

If no integration is specified all required packages for all integrations are installed using pip or uv.

Parameters:

    integrations (Tuple[str]): The name of the integration to install the requirements for. Required.
    ignore_integration (Tuple[str]): Integrations to ignore explicitly (passed in separately). Required.
    force (bool): Force the installation of the required packages. Default: False.
    uv (bool): Use uv for package installation (experimental). Default: False.

Source code in src/zenml/cli/integration.py, lines 211-331:
@integration.command(
    help="Install the required packages for the integration of choice."
)
@click.argument("integrations", nargs=-1, required=False)
@click.option(
    "--ignore-integration",
    "-i",
    multiple=True,
    help="Integrations to ignore explicitly (passed in separately).",
)
@click.option(
    "--yes",
    "-y",
    "force",
    is_flag=True,
    help="Force the installation of the required packages. This will skip the "
    "confirmation step and reinstall existing packages as well",
)
@click.option(
    "--uv",
    "uv",
    is_flag=True,
    help="Experimental: Use uv for package installation.",
    default=False,
)
def install(
    integrations: Tuple[str],
    ignore_integration: Tuple[str],
    force: bool = False,
    uv: bool = False,
) -> None:
    """Installs the required packages for a given integration.

    If no integration is specified all required packages for all integrations
    are installed using pip or uv.

    Args:
        integrations: The name of the integration to install the requirements
            for.
        ignore_integration: Integrations to ignore explicitly (passed in
            separately).
        force: Force the installation of the required packages.
        uv: Use uv for package installation (experimental).
    """
    from zenml.cli.utils import is_pip_installed, is_uv_installed
    from zenml.integrations.registry import integration_registry

    if uv and not is_uv_installed():
        error(
            "UV is not installed but the uv flag was passed in. Please install "
            "uv or remove the uv flag."
        )

    if not uv and not is_pip_installed():
        error(
            "Pip is not installed. Please install pip or use the uv flag "
            "(--uv) for package installation."
        )

    if not integrations:
        # no integrations specified, use all registered integrations
        integration_set = set(integration_registry.integrations.keys())

        for i in ignore_integration:
            try:
                integration_set.remove(i)
            except KeyError:
                error(
                    f"Integration {i} does not exist. Available integrations: "
                    f"{list(integration_registry.integrations.keys())}"
                )
    else:
        integration_set = set(integrations)

    if sys.version_info.minor == 12 and "tensorflow" in integration_set:
        warning(
            "The TensorFlow integration is not yet compatible with Python "
            "3.12, thus its installation is skipped. Consider using a "
            "different version of Python and stay in touch for further updates."
        )
        integration_set.remove("tensorflow")

    if sys.version_info.minor == 12 and "deepchecks" in integration_set:
        warning(
            "The Deepchecks integration is not yet compatible with Python "
            "3.12, thus its installation is skipped. Consider using a "
            "different version of Python and stay in touch for further updates."
        )
        integration_set.remove("deepchecks")

    requirements = []
    integrations_to_install = []
    for integration_name in integration_set:
        try:
            if force or not integration_registry.is_installed(
                integration_name
            ):
                requirements += (
                    integration_registry.select_integration_requirements(
                        integration_name
                    )
                )
                integrations_to_install.append(integration_name)
            else:
                declare(
                    f"All required packages for integration "
                    f"'{integration_name}' are already installed."
                )
        except KeyError:
            warning(f"Unable to find integration '{integration_name}'.")

    if requirements and (
        force
        or confirmation(
            "Are you sure you want to install the following "
            "packages to the current environment?\n"
            f"{requirements}"
        )
    ):
        with console.status("Installing integrations..."):
            install_packages(requirements, use_uv=uv)
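
For example, to install the requirements for a single integration non-interactively using the experimental uv installer (the integration name is chosen for illustration):

   zenml integration install sklearn -y --uv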

install_packages(packages, upgrade=False, use_uv=False)

Installs pypi packages into the current environment with pip or uv.

When using with uv, a virtual environment is required.

Parameters:

    packages (List[str]): List of packages to install. Required.
    upgrade (bool): Whether to upgrade the packages if they are already installed. Default: False.
    use_uv (bool): Whether to use uv for package installation. Default: False.

Raises:

    e: If the package installation fails.

Source code in src/zenml/cli/utils.py, lines 1016-1084:
def install_packages(
    packages: List[str],
    upgrade: bool = False,
    use_uv: bool = False,
) -> None:
    """Installs pypi packages into the current environment with pip or uv.

    When using with `uv`, a virtual environment is required.

    Args:
        packages: List of packages to install.
        upgrade: Whether to upgrade the packages if they are already installed.
        use_uv: Whether to use uv for package installation.

    Raises:
        e: If the package installation fails.
    """
    if "neptune" in packages:
        declare(
            "Uninstalling legacy `neptune-client` package to avoid version "
            "conflicts with new `neptune` package..."
        )
        uninstall_package("neptune-client")

    if "prodigy" in packages:
        packages.remove("prodigy")
        declare(
            "The `prodigy` package should be installed manually using your "
            "license key. Please visit https://prodi.gy/docs/install for more "
            "information."
        )
    if not packages:
        # if user only tried to install prodigy, we can
        # just return without doing anything
        return

    if use_uv and not is_installed_in_python_environment("uv"):
        # If uv is installed globally, don't run as a python module
        command = []
    else:
        command = [sys.executable, "-m"]

    command += ["uv", "pip", "install"] if use_uv else ["pip", "install"]

    if upgrade:
        command += ["--upgrade"]

    command += packages

    if not IS_DEBUG_ENV:
        quiet_flag = "-q" if use_uv else "-qqq"
        command.append(quiet_flag)
        if not use_uv:
            command.append("--no-warn-conflicts")

    try:
        subprocess.check_call(command)
    except subprocess.CalledProcessError as e:
        if (
            use_uv
            and "Failed to locate a virtualenv or Conda environment" in str(e)
        ):
            error(
                "Failed to locate a virtualenv or Conda environment. "
                "When using uv, a virtual environment is required. "
                "Run `uv venv` to create a virtualenv and retry."
            )
        else:
            raise e
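
A minimal sketch of calling this helper from Python (the package names are arbitrary examples; with use_uv=True an activated virtual environment is expected):

   from zenml.cli.utils import install_packages

   install_packages(["scikit-learn", "pandas"], upgrade=True)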

integration()

Interact with external integrations.

Source code in src/zenml/cli/integration.py, lines 41-46:
@cli.group(
    cls=TagGroup,
    tag=CliCategories.INTEGRATIONS,
)
def integration() -> None:
    """Interact with external integrations."""

is_analytics_opted_in()

Check whether user is opt-in or opt-out of analytics.

Source code in src/zenml/cli/config.py, lines 32-36:
@analytics.command("get")
def is_analytics_opted_in() -> None:
    """Check whether user is opt-in or opt-out of analytics."""
    gc = GlobalConfiguration()
    cli_utils.declare(f"Analytics opt-in: {gc.analytics_opt_in}")
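
Since the command is registered as get on the analytics group, it is typically invoked as:

   zenml analytics get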

is_jupyter_installed()

Checks if Jupyter notebook is installed.

Returns:

    bool: True if Jupyter notebook is installed, False otherwise.

Source code in src/zenml/cli/utils.py, lines 2610-2620:
def is_jupyter_installed() -> bool:
    """Checks if Jupyter notebook is installed.

    Returns:
        bool: True if Jupyter notebook is installed, False otherwise.
    """
    try:
        pkg_resources.get_distribution("notebook")
        return True
    except pkg_resources.DistributionNotFound:
        return False

is_pro_server(url)

Check if the server at the given URL is a ZenML Pro server.

Parameters:

    url (str): The URL of the server to check. Required.

Returns:

    Tuple[Optional[bool], Optional[str]]: True if the server is a ZenML Pro server, False otherwise, and the extracted pro API URL if the server is a ZenML Pro server, or None if no information could be extracted.

Source code in src/zenml/cli/login.py, lines 422-456:
def is_pro_server(
    url: str,
) -> Tuple[Optional[bool], Optional[str]]:
    """Check if the server at the given URL is a ZenML Pro server.

    Args:
        url: The URL of the server to check.

    Returns:
        True if the server is a ZenML Pro server, False otherwise, and the
        extracted pro API URL if the server is a ZenML Pro server, or None if
        no information could be extracted.
    """
    from zenml.login.credentials_store import get_credentials_store
    from zenml.login.server_info import get_server_info

    url = url.rstrip("/")
    # First, check the credentials store
    credentials_store = get_credentials_store()
    credentials = credentials_store.get_credentials(url)
    if credentials:
        if credentials.type == ServerType.PRO:
            return True, credentials.pro_api_url
        else:
            return False, None

    # Next, make a request to the server itself
    server_info = get_server_info(url)
    if not server_info:
        return None, None

    if server_info.is_pro_server():
        return True, server_info.pro_api_url

    return False, None
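
A minimal sketch of using this helper (the server URL is a placeholder and must point at a reachable ZenML server):

   from zenml.cli.login import is_pro_server

   is_pro, pro_api_url = is_pro_server("https://zenml.example.com")
   if is_pro:
       print(f"ZenML Pro server, API at {pro_api_url}")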

is_sorted_or_filtered(ctx)

Decides whether any filtering/sorting happens during a 'list' CLI call.

Parameters:

    ctx (Context): the Click context of the CLI call. Required.

Returns:

    bool: a boolean indicating whether any sorting or filtering parameters were used during the list CLI call.

Source code in src/zenml/cli/utils.py, lines 2569-2590:
def is_sorted_or_filtered(ctx: click.Context) -> bool:
    """Decides whether any filtering/sorting happens during a 'list' CLI call.

    Args:
        ctx: the Click context of the CLI call.

    Returns:
        a boolean indicating whether any sorting or filtering parameters were
        used during the list CLI call.
    """
    try:
        for _, source in ctx._parameter_source.items():
            if source != click.core.ParameterSource.DEFAULT:
                return True
        return False

    except Exception as e:
        logger.debug(
            f"There was a problem accessing the parameter source for "
            f'the "sort_by" option: {e}'
        )
        return False

legacy_show(ngrok_token=None)

Show the ZenML dashboard.

Parameters:

    ngrok_token (Optional[str]): An ngrok auth token to use for exposing the ZenML dashboard on a public domain. Primarily used for accessing the dashboard in Colab. Default: None.

Source code in src/zenml/cli/server.py, lines 130-159:
@cli.command(
    "show",
    help="""Show the ZenML dashboard.

DEPRECATED: Please use `zenml server show` instead.             
""",
)
@click.option(
    "--ngrok-token",
    type=str,
    default=None,
    help="Specify an ngrok auth token to use for exposing the ZenML server.",
)
def legacy_show(ngrok_token: Optional[str] = None) -> None:
    """Show the ZenML dashboard.

    Args:
        ngrok_token: An ngrok auth token to use for exposing the ZenML dashboard
            on a public domain. Primarily used for accessing the dashboard in
            Colab.
    """
    cli_utils.warning(
        "The `zenml show` command is deprecated and will be removed in a "
        "future release. Please use the `zenml server show` command "
        "instead."
    )

    # Calling the `zenml server show` command
    cli_utils.declare("Calling `zenml server show`...")
    show(local=False, ngrok_token=ngrok_token)

list_api_keys(service_account_name_or_id, **kwargs)

List all API keys.

Parameters:

    service_account_name_or_id (str): The name or ID of the service account for which to list the API keys. Required.
    **kwargs (Any): Keyword arguments to filter API keys. Default: {}.

Source code in src/zenml/cli/service_accounts.py, lines 384-417:
@api_key.command("list", help="List all API keys.")
@list_options(APIKeyFilter)
@click.pass_obj
def list_api_keys(service_account_name_or_id: str, **kwargs: Any) -> None:
    """List all API keys.

    Args:
        service_account_name_or_id: The name or ID of the service account for
            which to list the API keys.
        **kwargs: Keyword arguments to filter API keys.
    """
    with console.status("Listing API keys...\n"):
        try:
            api_keys = Client().list_api_keys(
                service_account_name_id_or_prefix=service_account_name_or_id,
                **kwargs,
            )
        except KeyError as e:
            cli_utils.error(str(e))

        if not api_keys.items:
            cli_utils.declare("No API keys found for this filter.")
            return

        cli_utils.print_pydantic_models(
            api_keys,
            exclude_columns=[
                "created",
                "updated",
                "workspace",
                "key",
                "retain_period_minutes",
            ],
        )
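
Because the service account is passed in via the click context object, the command is typically invoked through the service-account group, for example (the account name is a placeholder):

   zenml service-account api-key my_service_account list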

list_artifact_versions(**kwargs)

List all artifact versions.

Parameters:

    **kwargs (Any): Keyword arguments to filter artifact versions by. Default: {}.

Source code in src/zenml/cli/artifact.py, lines 118-136:
@cli_utils.list_options(ArtifactVersionFilter)
@version.command("list", help="List all artifact versions.")
def list_artifact_versions(**kwargs: Any) -> None:
    """List all artifact versions.

    Args:
        **kwargs: Keyword arguments to filter artifact versions by.
    """
    artifact_versions = Client().list_artifact_versions(**kwargs)

    if not artifact_versions:
        cli_utils.declare("No artifact versions found.")
        return

    to_print = []
    for artifact_version in artifact_versions:
        to_print.append(_artifact_version_to_print(artifact_version))

    cli_utils.print_table(to_print)

list_artifacts(**kwargs)

List all artifacts.

Parameters:

    **kwargs (Any): Keyword arguments to filter artifacts by. Default: {}.

Source code in src/zenml/cli/artifact.py, lines 38-56:
@cli_utils.list_options(ArtifactFilter)
@artifact.command("list", help="List all artifacts.")
def list_artifacts(**kwargs: Any) -> None:
    """List all artifacts.

    Args:
        **kwargs: Keyword arguments to filter artifacts by.
    """
    artifacts = Client().list_artifacts(**kwargs)

    if not artifacts:
        cli_utils.declare("No artifacts found.")
        return

    to_print = []
    for artifact in artifacts:
        to_print.append(_artifact_to_print(artifact))

    cli_utils.print_table(to_print)
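
For example, to narrow the listing down by artifact name (the name value and the use of the generated --name filter option are assumptions for illustration):

   zenml artifact list --name my_artifact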

list_authorized_devices(**kwargs)

List all authorized devices.

Parameters:

    **kwargs (Any): Keyword arguments to filter authorized devices. Default: {}.

Source code in src/zenml/cli/authorized_device.py, lines 59-79:
@authorized_device.command(
    "list", help="List all authorized devices for the current user."
)
@list_options(OAuthDeviceFilter)
def list_authorized_devices(**kwargs: Any) -> None:
    """List all authorized devices.

    Args:
        **kwargs: Keyword arguments to filter authorized devices.
    """
    with console.status("Listing authorized devices...\n"):
        devices = Client().list_authorized_devices(**kwargs)

        if not devices.items:
            cli_utils.declare("No authorized devices found for this filter.")
            return

        cli_utils.print_pydantic_models(
            devices,
            columns=["id", "status", "ip_address", "hostname", "os"],
        )

list_code_repositories(**kwargs)

List all connected code repositories.

Parameters:

    **kwargs (Any): Keyword arguments to filter code repositories. Default: {}.

Source code in src/zenml/cli/code_repository.py, lines 191-209:
@code_repository.command("list", help="List all connected code repositories.")
@list_options(CodeRepositoryFilter)
def list_code_repositories(**kwargs: Any) -> None:
    """List all connected code repositories.

    Args:
        **kwargs: Keyword arguments to filter code repositories.
    """
    with console.status("Listing code repositories...\n"):
        repos = Client().list_code_repositories(**kwargs)

        if not repos.items:
            cli_utils.declare("No code repositories found for this filter.")
            return

        cli_utils.print_pydantic_models(
            repos,
            exclude_columns=["created", "updated", "user", "workspace"],
        )

list_integrations()

List all available integrations with their installation status.

Source code in src/zenml/cli/integration.py, lines 49-61:
@integration.command(name="list", help="List the available integrations.")
def list_integrations() -> None:
    """List all available integrations with their installation status."""
    from zenml.integrations.registry import integration_registry

    formatted_table = format_integration_list(
        sorted(list(integration_registry.integrations.items()))
    )
    print_table(formatted_table)
    warning(
        "\n" + "To install the dependencies of a specific integration, type: "
    )
    warning("zenml integration install INTEGRATION_NAME")

list_model_version_data_artifacts(model_name, model_version=None, **kwargs)

List data artifacts linked to a model version in the Model Control Plane.

Parameters:

    model_name (str): The ID or name of the model containing version. Required.
    model_version (Optional[str]): The name, number or ID of the model version. If not provided, the latest version is used. Default: None.
    **kwargs (Any): Keyword arguments to filter models. Default: {}.

Source code in src/zenml/cli/model.py, lines 646-671:
@model.command(
    "data_artifacts",
    help="List data artifacts linked to a model version.",
)
@click.argument("model_name")
@click.option("--model_version", "-v", default=None)
@cli_utils.list_options(ModelVersionArtifactFilter)
def list_model_version_data_artifacts(
    model_name: str,
    model_version: Optional[str] = None,
    **kwargs: Any,
) -> None:
    """List data artifacts linked to a model version in the Model Control Plane.

    Args:
        model_name: The ID or name of the model containing version.
        model_version: The name, number or ID of the model version. If not
            provided, the latest version is used.
        **kwargs: Keyword arguments to filter models.
    """
    _print_artifacts_links_generic(
        model_name_or_id=model_name,
        model_version_name_or_number_or_id=model_version,
        only_data_artifacts=True,
        **kwargs,
    )
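
For example, to list the data artifacts linked to version 3 of a model (the model name and version number are placeholders):

   zenml model data_artifacts my_model -v 3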

list_model_version_deployment_artifacts(model_name, model_version=None, **kwargs)

List deployment artifacts linked to a model version in the Model Control Plane.

Parameters:

    model_name (str): The ID or name of the model containing version. Required.
    model_version (Optional[str]): The name, number or ID of the model version. If not provided, the latest version is used. Default: None.
    **kwargs (Any): Keyword arguments to filter models. Default: {}.

Source code in src/zenml/cli/model.py, lines 702-727:
@model.command(
    "deployment_artifacts",
    help="List deployment artifacts linked to a model version.",
)
@click.argument("model_name")
@click.option("--model_version", "-v", default=None)
@cli_utils.list_options(ModelVersionArtifactFilter)
def list_model_version_deployment_artifacts(
    model_name: str,
    model_version: Optional[str] = None,
    **kwargs: Any,
) -> None:
    """List deployment artifacts linked to a model version in the Model Control Plane.

    Args:
        model_name: The ID or name of the model containing version.
        model_version: The name, number or ID of the model version. If not
            provided, the latest version is used.
        **kwargs: Keyword arguments to filter models.
    """
    _print_artifacts_links_generic(
        model_name_or_id=model_name,
        model_version_name_or_number_or_id=model_version,
        only_deployment_artifacts=True,
        **kwargs,
    )

list_model_version_model_artifacts(model_name, model_version=None, **kwargs)

List model artifacts linked to a model version in the Model Control Plane.

Parameters:

    model_name (str): The ID or name of the model containing version. Required.
    model_version (Optional[str]): The name, number or ID of the model version. If not provided, the latest version is used. Default: None.
    **kwargs (Any): Keyword arguments to filter models. Default: {}.

Source code in src/zenml/cli/model.py, lines 674-699:
@model.command(
    "model_artifacts",
    help="List model artifacts linked to a model version.",
)
@click.argument("model_name")
@click.option("--model_version", "-v", default=None)
@cli_utils.list_options(ModelVersionArtifactFilter)
def list_model_version_model_artifacts(
    model_name: str,
    model_version: Optional[str] = None,
    **kwargs: Any,
) -> None:
    """List model artifacts linked to a model version in the Model Control Plane.

    Args:
        model_name: The ID or name of the model containing version.
        model_version: The name, number or ID of the model version. If not
            provided, the latest version is used.
        **kwargs: Keyword arguments to filter models.
    """
    _print_artifacts_links_generic(
        model_name_or_id=model_name,
        model_version_name_or_number_or_id=model_version,
        only_model_artifacts=True,
        **kwargs,
    )

list_model_version_pipeline_runs(model_name, model_version=None, **kwargs)

List pipeline runs of a model version in the Model Control Plane.

Parameters:

    model_name (str): The ID or name of the model containing version. Required.
    model_version (Optional[str]): The name, number or ID of the model version. If not provided, the latest version is used. Default: None.
    **kwargs (Any): Keyword arguments to filter models. Default: {}.

Source code in src/zenml/cli/model.py, lines 730-773:
@model.command(
    "runs",
    help="List pipeline runs of a model version.",
)
@click.argument("model_name")
@click.option("--model_version", "-v", default=None)
@cli_utils.list_options(ModelVersionPipelineRunFilter)
def list_model_version_pipeline_runs(
    model_name: str,
    model_version: Optional[str] = None,
    **kwargs: Any,
) -> None:
    """List pipeline runs of a model version in the Model Control Plane.

    Args:
        model_name: The ID or name of the model containing version.
        model_version: The name, number or ID of the model version. If not
            provided, the latest version is used.
        **kwargs: Keyword arguments to filter models.
    """
    model_version_response_model = Client().get_model_version(
        model_name_or_id=model_name,
        model_version_name_or_number_or_id=model_version,
    )

    if not model_version_response_model.pipeline_run_ids:
        cli_utils.declare("No pipeline runs attached to model version found.")
        return
    cli_utils.title(
        f"Pipeline runs linked to the model version `{model_version_response_model.name}[{model_version_response_model.number}]`:"
    )

    links = Client().list_model_version_pipeline_run_links(
        model_version_id=model_version_response_model.id,
        **kwargs,
    )

    cli_utils.print_pydantic_models(
        links,
        columns=[
            "pipeline_run",
            "created",
        ],
    )

list_model_versions(model_name, **kwargs)

List model versions with filter in the Model Control Plane.

Parameters:

    model_name (str): The name of the parent model. Required.
    **kwargs (Any): Keyword arguments to filter models. Default: {}.

Source code in src/zenml/cli/model.py, lines 401-430:
@cli_utils.list_options(ModelVersionFilter)
@click.option(
    "--model-name",
    "-n",
    help="The name of the parent model.",
    type=str,
    required=False,
)
@version.command("list", help="List model versions with filter.")
def list_model_versions(model_name: str, **kwargs: Any) -> None:
    """List model versions with filter in the Model Control Plane.

    Args:
        model_name: The name of the parent model.
        **kwargs: Keyword arguments to filter models.
    """
    model_versions = Client().zen_store.list_model_versions(
        model_name_or_id=model_name,
        model_version_filter_model=ModelVersionFilter(**kwargs),
    )

    if not model_versions:
        cli_utils.declare("No model versions found.")
        return

    to_print = []
    for model_version in model_versions:
        to_print.append(_model_version_to_print(model_version))

    cli_utils.print_table(to_print)
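
For example, to list all versions of a given model (the model name is a placeholder):

   zenml model version list -n my_model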

list_models(**kwargs)

List models with filter in the Model Control Plane.

Parameters:

    **kwargs (Any): Keyword arguments to filter models. Default: {}.

Source code in src/zenml/cli/model.py, lines 86-104:
@cli_utils.list_options(ModelFilter)
@model.command("list", help="List models with filter.")
def list_models(**kwargs: Any) -> None:
    """List models with filter in the Model Control Plane.

    Args:
        **kwargs: Keyword arguments to filter models.
    """
    models = Client().zen_store.list_models(
        model_filter_model=ModelFilter(**kwargs)
    )

    if not models:
        cli_utils.declare("No models found.")
        return
    to_print = []
    for model in models:
        to_print.append(_model_to_print(model))
    cli_utils.print_table(to_print)

list_options(filter_model)

Create a decorator to generate the correct list of filter parameters.

The Outer decorator (list_options) is responsible for creating the inner decorator. This is necessary so that the type of FilterModel can be passed in as a parameter.

Based on the filter model, the inner decorator extracts all the click options that should be added to the decorated function (wrapper).

Parameters:

    filter_model (Type[BaseFilter]): The filter model based on which to decorate the function. Required.

Returns:

    Callable[[F], F]: The inner decorator.

Source code in src/zenml/cli/utils.py, lines 2405-2468:
def list_options(filter_model: Type[BaseFilter]) -> Callable[[F], F]:
    """Create a decorator to generate the correct list of filter parameters.

    The Outer decorator (`list_options`) is responsible for creating the inner
    decorator. This is necessary so that the type of `FilterModel` can be passed
    in as a parameter.

    Based on the filter model, the inner decorator extracts all the click
    options that should be added to the decorated function (wrapper).

    Args:
        filter_model: The filter model based on which to decorate the function.

    Returns:
        The inner decorator.
    """

    def inner_decorator(func: F) -> F:
        options = []
        data_type_descriptors = set()
        for k, v in filter_model.model_fields.items():
            if k not in filter_model.CLI_EXCLUDE_FIELDS:
                options.append(
                    click.option(
                        f"--{k}",
                        type=str,
                        default=v.default,
                        required=False,
                        help=create_filter_help_text(filter_model, k),
                    )
                )
            if k not in filter_model.FILTER_EXCLUDE_FIELDS:
                data_type_descriptors.add(
                    create_data_type_help_text(filter_model, k)
                )

        def wrapper(function: F) -> F:
            for option in reversed(options):
                function = option(function)
            return function

        func.__doc__ = (
            f"{func.__doc__} By default all filters are "
            f"interpreted as a check for equality. However advanced "
            f"filter operators can be used to tune the filtering by "
            f"writing the operator and separating it from the "
            f"query parameter with a colon `:`, e.g. "
            f"--field='operator:query'."
        )

        if data_type_descriptors:
            joined_data_type_descriptors = "\n\n".join(data_type_descriptors)

            func.__doc__ = (
                f"{func.__doc__} \n\n"
                f"\b Each datatype supports a specific "
                f"set of filter operations, here are the relevant "
                f"ones for the parameters of this command: \n\n"
                f"{joined_data_type_descriptors}"
            )

        return wrapper(func)

    return inner_decorator
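
A minimal sketch of how the decorator is applied to a list command, mirroring the real commands below (the runs click group is assumed to exist already; PipelineRunFilter stands in for any filter model):

   from typing import Any

   from zenml.cli import utils as cli_utils
   from zenml.models import PipelineRunFilter

   @runs.command("list")  # assumed existing click group
   @cli_utils.list_options(PipelineRunFilter)
   def my_list_command(**kwargs: Any) -> None:
       """List pipeline runs."""
       ...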

list_pipeline_builds(**kwargs)

List all pipeline builds for the filter.

Parameters:

    **kwargs (Any): Keyword arguments to filter pipeline builds. Default: {}.

Source code in src/zenml/cli/pipeline.py, lines 578-607:
@builds.command("list", help="List all pipeline builds.")
@list_options(PipelineBuildFilter)
def list_pipeline_builds(**kwargs: Any) -> None:
    """List all pipeline builds for the filter.

    Args:
        **kwargs: Keyword arguments to filter pipeline builds.
    """
    client = Client()
    try:
        with console.status("Listing pipeline builds...\n"):
            pipeline_builds = client.list_builds(hydrate=True, **kwargs)
    except KeyError as err:
        cli_utils.error(str(err))
    else:
        if not pipeline_builds.items:
            cli_utils.declare("No pipeline builds found for this filter.")
            return

        cli_utils.print_pydantic_models(
            pipeline_builds,
            exclude_columns=[
                "created",
                "updated",
                "user",
                "workspace",
                "images",
                "stack_checksum",
            ],
        )

list_pipeline_runs(**kwargs)

List all registered pipeline runs for the filter.

Parameters:

    **kwargs (Any): Keyword arguments to filter pipeline runs. Default: {}.

Source code in src/zenml/cli/pipeline.py, lines 491-511:
@runs.command("list", help="List all registered pipeline runs.")
@list_options(PipelineRunFilter)
def list_pipeline_runs(**kwargs: Any) -> None:
    """List all registered pipeline runs for the filter.

    Args:
        **kwargs: Keyword arguments to filter pipeline runs.
    """
    client = Client()
    try:
        with console.status("Listing pipeline runs...\n"):
            pipeline_runs = client.list_pipeline_runs(**kwargs)
    except KeyError as err:
        cli_utils.error(str(err))
    else:
        if not pipeline_runs.items:
            cli_utils.declare("No pipeline runs found for this filter.")
            return

        cli_utils.print_pipeline_runs_table(pipeline_runs=pipeline_runs.items)
        cli_utils.print_page_info(pipeline_runs)
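
For reference, the same listing can be done programmatically through the Client call used in the command body; a hedged sketch, assuming the standard pagination field (size) accepted by ZenML filter models:

from zenml.client import Client

# Fetch one page containing up to five pipeline runs (the `size` keyword is
# assumed to be the usual pagination field of ZenML filter models).
runs = Client().list_pipeline_runs(size=5)
for run in runs.items:
    print(run.name, run.status)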

list_pipelines(**kwargs)

List all registered pipelines.

Parameters:

    **kwargs (Any): Keyword arguments to filter pipelines. Default: {}
Source code in src/zenml/cli/pipeline.py
@pipeline.command("list", help="List all registered pipelines.")
@list_options(PipelineFilter)
def list_pipelines(**kwargs: Any) -> None:
    """List all registered pipelines.

    Args:
        **kwargs: Keyword arguments to filter pipelines.
    """
    client = Client()
    with console.status("Listing pipelines...\n"):
        pipelines = client.list_pipelines(**kwargs)

        if not pipelines.items:
            cli_utils.declare("No pipelines found for this filter.")
            return

        cli_utils.print_pydantic_models(
            pipelines,
            exclude_columns=["id", "created", "updated", "user", "workspace"],
        )

list_schedules(**kwargs)

List all pipeline schedules.

Parameters:

    **kwargs (Any): Keyword arguments to filter schedules. Default: {}
Source code in src/zenml/cli/pipeline.py
@schedule.command("list", help="List all pipeline schedules.")
@list_options(ScheduleFilter)
def list_schedules(**kwargs: Any) -> None:
    """List all pipeline schedules.

    Args:
        **kwargs: Keyword arguments to filter schedules.
    """
    client = Client()

    schedules = client.list_schedules(**kwargs)

    if not schedules:
        cli_utils.declare("No schedules found for this filter.")
        return

    cli_utils.print_pydantic_models(
        schedules,
        exclude_columns=["id", "created", "updated", "user", "workspace"],
    )

list_secrets(**kwargs)

List all secrets that fulfill the filter criteria.

Parameters:

    kwargs (Any): Keyword arguments to filter the secrets. Default: {}
Source code in src/zenml/cli/secret.py
@secret.command(
    "list", help="List all registered secrets that match the filter criteria."
)
@list_options(SecretFilter)
def list_secrets(**kwargs: Any) -> None:
    """List all secrets that fulfill the filter criteria.

    Args:
        kwargs: Keyword arguments to filter the secrets.
    """
    client = Client()
    with console.status("Listing secrets..."):
        try:
            secrets = client.list_secrets(**kwargs)
        except NotImplementedError as e:
            error(f"Centralized secrets management is disabled: {str(e)}")
        if not secrets.items:
            warning("No secrets found for the given filters.")
            return

        secret_rows = [
            dict(
                name=secret.name,
                id=str(secret.id),
                scope=secret.scope.value,
            )
            for secret in secrets.items
        ]
        print_table(secret_rows)
        print_page_info(secrets)
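
The filter-operator syntax described for list_options applies here as well; a hedged programmatic sketch using the Client call from the command body (applying the `contains:` operator to the `name` filter is assumed to follow the generic filter behaviour):

from zenml.client import Client

# List secrets whose name contains "aws" (operator syntax as described above).
secrets = Client().list_secrets(name="contains:aws")
for secret in secrets.items:
    print(secret.name, secret.scope.value)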

list_service_accounts(ctx, **kwargs)

List all service accounts.

Parameters:

    ctx (Context): The click context object. Required.
    kwargs (Any): Keyword arguments to filter the list of service accounts. Default: {}
Source code in src/zenml/cli/service_accounts.py
@service_account.command("list")
@list_options(ServiceAccountFilter)
@click.pass_context
def list_service_accounts(ctx: click.Context, **kwargs: Any) -> None:
    """List all users.

    Args:
        ctx: The click context object
        kwargs: Keyword arguments to filter the list of users.
    """
    client = Client()
    with console.status("Listing service accounts...\n"):
        service_accounts = client.list_service_accounts(**kwargs)
        if not service_accounts:
            cli_utils.declare(
                "No service accounts found for the given filters."
            )
            return

        cli_utils.print_pydantic_models(
            service_accounts,
            exclude_columns=[
                "created",
                "updated",
            ],
        )

list_service_connector_resources(connector_type=None, resource_type=None, resource_id=None, exclude_errors=False)

List resources that can be accessed by service connectors.

Parameters:

    connector_type (Optional[str]): The type of service connector to filter by. Default: None
    resource_type (Optional[str]): The type of resource to filter by. Default: None
    resource_id (Optional[str]): The name of a resource to filter by. Default: None
    exclude_errors (bool): Exclude resources that cannot be accessed due to errors. Default: False
Source code in src/zenml/cli/service_connectors.py
@service_connector.command(
    "list-resources",
    help="""List all resources accessible by service connectors.

This command can be used to list all resources that can be accessed by service
connectors configured in your workspace. You can filter the list by connector
type and/or resource type.

Use this command to answer questions like:

- show a list of all Kubernetes clusters that can be accessed by way of service
connectors configured in my workspace
- show a list of all connectors configured for my workspace along with all the
resources they can access or the error state they are in, if any

NOTE: since this command exercises all service connectors in your workspace, it
may take a while to complete.

Examples:

- show a list of all S3 buckets that can be accessed by service connectors
configured in your workspace:

    $ zenml service-connector list-resources --resource-type s3-bucket

- show a list of all resources that the AWS connectors in your workspace can
access:

    $ zenml service-connector list-resources --connector-type aws

""",
)
@click.option(
    "--connector-type",
    "-c",
    "connector_type",
    help="The type of service connector to filter by.",
    required=False,
    type=str,
)
@click.option(
    "--resource-type",
    "-r",
    "resource_type",
    help="The type of resource to filter by.",
    required=False,
    type=str,
)
@click.option(
    "--resource-id",
    "-ri",
    "resource_id",
    help="The name of a resource to filter by.",
    required=False,
    type=str,
)
@click.option(
    "--exclude-errors",
    "-e",
    "exclude_errors",
    help="Exclude resources that cannot be accessed due to errors.",
    required=False,
    is_flag=True,
)
def list_service_connector_resources(
    connector_type: Optional[str] = None,
    resource_type: Optional[str] = None,
    resource_id: Optional[str] = None,
    exclude_errors: bool = False,
) -> None:
    """List resources that can be accessed by service connectors.

    Args:
        connector_type: The type of service connector to filter by.
        resource_type: The type of resource to filter by.
        resource_id: The name of a resource to filter by.
        exclude_errors: Exclude resources that cannot be accessed due to
            errors.
    """
    client = Client()

    if not resource_type and not resource_id:
        cli_utils.warning(
            "Fetching all service connector resources can take a long time, "
            "depending on the number of connectors configured in your "
            "workspace. Consider using the '--connector-type', "
            "'--resource-type' and '--resource-id' options to narrow down the "
            "list of resources to fetch."
        )

    with console.status(
        "Fetching all service connector resources (this could take a while)...\n"
    ):
        try:
            resource_list = client.list_service_connector_resources(
                connector_type=connector_type,
                resource_type=resource_type,
                resource_id=resource_id,
            )
        except (
            KeyError,
            ValueError,
            IllegalOperationError,
            NotImplementedError,
            AuthorizationException,
        ) as e:
            cli_utils.error(
                f"Could not fetch service connector resources: {e}"
            )

        if exclude_errors:
            resource_list = [r for r in resource_list if r.error is None]

        if not resource_list:
            cli_utils.declare(
                "No service connector resources match the given filters."
            )
            return

    resource_str = ""
    if resource_type:
        resource_str = f" '{resource_type}'"
    connector_str = ""
    if connector_type:
        connector_str = f" '{connector_type}'"
    if resource_id:
        resource_str = f"{resource_str} resource with name '{resource_id}'"
    else:
        resource_str = f"following{resource_str} resources"

    click.echo(
        f"The {resource_str} can be accessed by"
        f"{connector_str} service connectors configured in your workspace:"
    )

    cli_utils.print_service_connector_resource_table(
        resources=resource_list,
    )
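
A hedged sketch of the equivalent programmatic call, mirroring the post-filtering performed above for --exclude-errors (the s3-bucket resource type is just an example value):

from zenml.client import Client

resources = Client().list_service_connector_resources(resource_type="s3-bucket")
# Same check as the --exclude-errors flag: drop entries whose verification failed.
reachable = [r for r in resources if r.error is None]
print(f"{len(reachable)} of {len(resources)} connector results are reachable")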

list_service_connector_types(type=None, resource_type=None, auth_method=None, detailed=False)

List service connector types.

Parameters:

    type (Optional[str]): Filter by service connector type. Default: None
    resource_type (Optional[str]): Filter by the type of resource to connect to. Default: None
    auth_method (Optional[str]): Filter by the supported authentication method. Default: None
    detailed (bool): Show detailed information about the service connectors. Default: False
Source code in src/zenml/cli/service_connectors.py
@service_connector.command(
    "list-types",
    help="""List available service connector types.
""",
)
@click.option(
    "--type",
    "-t",
    "type",
    help="Filter by service connector type.",
    required=False,
    type=str,
)
@click.option(
    "--resource-type",
    "-r",
    "resource_type",
    help="Filter by the type of resource to connect to.",
    required=False,
    type=str,
)
@click.option(
    "--auth-method",
    "-a",
    "auth_method",
    help="Filter by the supported authentication method.",
    required=False,
    type=str,
)
@click.option(
    "--detailed",
    "-d",
    "detailed",
    help="Show detailed information about the service connector types.",
    required=False,
    is_flag=True,
)
def list_service_connector_types(
    type: Optional[str] = None,
    resource_type: Optional[str] = None,
    auth_method: Optional[str] = None,
    detailed: bool = False,
) -> None:
    """List service connector types.

    Args:
        type: Filter by service connector type.
        resource_type: Filter by the type of resource to connect to.
        auth_method: Filter by the supported authentication method.
        detailed: Show detailed information about the service connectors.
    """
    client = Client()

    service_connector_types = client.list_service_connector_types(
        connector_type=type,
        resource_type=resource_type,
        auth_method=auth_method,
    )

    if not service_connector_types:
        cli_utils.error(
            "No service connector types found matching the criteria."
        )

    if detailed:
        for connector_type in service_connector_types:
            cli_utils.print_service_connector_type(connector_type)
    else:
        cli_utils.print_service_connector_types_table(
            connector_types=service_connector_types
        )
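
A hedged programmatic sketch of the same lookup, relying only on attributes that appear elsewhere in this reference (name, resource_type_dict); the aws filter value is an example:

from zenml.client import Client

specs = Client().list_service_connector_types(connector_type="aws")
for spec in specs:
    # resource_type_dict maps resource type identifiers to their specs.
    print(spec.name, sorted(spec.resource_type_dict))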

list_service_connectors(ctx, labels=None, **kwargs)

List all service connectors.

Parameters:

    ctx (Context): The click context object. Required.
    labels (Optional[List[str]]): Labels to filter by. Default: None
    kwargs (Any): Keyword arguments to filter the service connectors. Default: {}
Source code in src/zenml/cli/service_connectors.py
@service_connector.command(
    "list",
    help="""List available service connectors.
""",
)
@list_options(ServiceConnectorFilter)
@click.option(
    "--label",
    "-l",
    "labels",
    help="Label to filter by. Takes the form `-l key1=value1` or `-l key` and "
    "can be used multiple times.",
    multiple=True,
)
@click.pass_context
def list_service_connectors(
    ctx: click.Context, labels: Optional[List[str]] = None, **kwargs: Any
) -> None:
    """List all service connectors.

    Args:
        ctx: The click context object
        labels: Labels to filter by.
        kwargs: Keyword arguments to filter the service connectors.
    """
    client = Client()

    if labels:
        kwargs["labels"] = cli_utils.get_parsed_labels(
            labels, allow_label_only=True
        )

    connectors = client.list_service_connectors(**kwargs)
    if not connectors:
        cli_utils.declare("No service connectors found for the given filters.")
        return

    cli_utils.print_service_connectors_table(
        client=client,
        connectors=connectors.items,
        show_active=not is_sorted_or_filtered(ctx),
    )
    print_page_info(connectors)

list_stacks(ctx, **kwargs)

List all stacks that fulfill the filter requirements.

Parameters:

    ctx (Context): The Click context. Required.
    kwargs (Any): Keyword arguments to filter the stacks. Default: {}
Source code in src/zenml/cli/stack.py
@stack.command("list")
@list_options(StackFilter)
@click.pass_context
def list_stacks(ctx: click.Context, **kwargs: Any) -> None:
    """List all stacks that fulfill the filter requirements.

    Args:
        ctx: the Click context
        kwargs: Keyword arguments to filter the stacks.
    """
    client = Client()
    with console.status("Listing stacks...\n"):
        stacks = client.list_stacks(**kwargs)
        if not stacks:
            cli_utils.declare("No stacks found for the given filters.")
            return
        print_stacks_table(
            client=client,
            stacks=stacks.items,
            show_active=not is_sorted_or_filtered(ctx),
        )
        print_page_info(stacks)

list_tags(**kwargs)

List tags with filter.

Parameters:

    **kwargs (Any): Keyword arguments to filter tags. Default: {}
Source code in src/zenml/cli/tag.py
@cli_utils.list_options(TagFilter)
@tag.command("list", help="List tags with filter.")
def list_tags(**kwargs: Any) -> None:
    """List tags with filter.

    Args:
        **kwargs: Keyword arguments to filter tags.
    """
    tags = Client().list_tags(TagFilter(**kwargs))

    if not tags:
        cli_utils.declare("No tags found.")
        return

    cli_utils.print_pydantic_models(
        tags,
        exclude_columns=["created"],
    )

list_users(ctx, **kwargs)

List all users.

Parameters:

    ctx (Context): The click context object. Required.
    kwargs (Any): Keyword arguments to filter the list of users. Default: {}
Source code in src/zenml/cli/user_management.py
@user.command("list")
@list_options(UserFilter)
@click.pass_context
def list_users(ctx: click.Context, **kwargs: Any) -> None:
    """List all users.

    Args:
        ctx: The click context object
        kwargs: Keyword arguments to filter the list of users.
    """
    client = Client()
    with console.status("Listing stacks...\n"):
        users = client.list_users(**kwargs)
        if not users:
            cli_utils.declare("No users found for the given filters.")
            return

        cli_utils.print_pydantic_models(
            users,
            exclude_columns=[
                "created",
                "updated",
                "email",
                "email_opted_in",
                "activation_token",
            ],
            active_models=[Client().active_user],
            show_active=not is_sorted_or_filtered(ctx),
        )

list_workspaces(ctx, **kwargs)

List all workspaces.

Parameters:

    ctx (Context): The click context object. Required.
    **kwargs (Any): Keyword arguments to filter the list of workspaces. Default: {}
Source code in src/zenml/cli/workspace.py
@workspace.command("list", hidden=True)
@list_options(WorkspaceFilter)
@click.pass_context
def list_workspaces(ctx: click.Context, **kwargs: Any) -> None:
    """List all workspaces.

    Args:
        ctx: The click context object
        **kwargs: Keyword arguments to filter the list of workspaces.
    """
    warn_unsupported_non_default_workspace()
    client = Client()
    with console.status("Listing workspaces...\n"):
        workspaces = client.list_workspaces(**kwargs)
        if workspaces:
            cli_utils.print_pydantic_models(
                workspaces,
                exclude_columns=["id", "created", "updated"],
                active_models=[Client().active_workspace],
                show_active=not is_sorted_or_filtered(ctx),
            )
        else:
            cli_utils.declare("No workspaces found for the given filter.")

lock_authorized_device(id)

Lock an authorized device.

Parameters:

    id (str): The ID of the authorized device to lock. Required.
Source code in src/zenml/cli/authorized_device.py
@authorized_device.command("lock")
@click.argument("id", type=str, required=True)
def lock_authorized_device(id: str) -> None:
    """Lock an authorized device.

    Args:
        id: The ID of the authorized device to lock.
    """
    try:
        Client().update_authorized_device(
            id_or_prefix=id,
            locked=True,
        )
    except KeyError as e:
        cli_utils.error(str(e))
    else:
        cli_utils.declare(f"Locked authorized device `{id}`.")

logging()

Configuration of logging for ZenML pipelines.

Source code in src/zenml/cli/config.py
@cli.group(cls=TagGroup, tag=CliCategories.MANAGEMENT_TOOLS)
def logging() -> None:
    """Configuration of logging for ZenML pipelines."""

login_service_connector(name_id_or_prefix, resource_type=None, resource_id=None)

Authenticate the local client/SDK with connector credentials.

Parameters:

    name_id_or_prefix (str): The name or id of the service connector to use. Required.
    resource_type (Optional[str]): The type of resource to connect to. Default: None
    resource_id (Optional[str]): Explicit resource ID to connect to. Default: None
Source code in src/zenml/cli/service_connectors.py
@service_connector.command(
    "login",
    help="""Configure the local client/SDK with credentials.

Some service connectors have the ability to configure clients or SDKs installed
on your local machine with credentials extracted from or generated by the
service connector. This command can be used to do that.

For connectors that are configured to access multiple types of resources or 
multiple resource instances, the resource type and resource ID must be
specified to indicate which resource is targeted by this command.

Examples:

- configure the local Kubernetes (kubectl) CLI with credentials generated from
a generic, multi-type, multi-instance AWS service connector:

    $ zenml service-connector login my-generic-aws-connector \\             
--resource-type kubernetes-cluster --resource-id my-eks-cluster

- configure the local Docker CLI with credentials configured in a Docker
service connector:

    $ zenml service-connector login my-docker-connector
""",
)
@click.option(
    "--resource-type",
    "-r",
    "resource_type",
    help="The type of resource to connect to.",
    required=False,
    type=str,
)
@click.option(
    "--resource-id",
    "-ri",
    "resource_id",
    help="Explicit resource ID to connect to.",
    required=False,
    type=str,
)
@click.argument("name_id_or_prefix", type=str, required=True)
def login_service_connector(
    name_id_or_prefix: str,
    resource_type: Optional[str] = None,
    resource_id: Optional[str] = None,
) -> None:
    """Authenticate the local client/SDK with connector credentials.

    Args:
        name_id_or_prefix: The name or id of the service connector to use.
        resource_type: The type of resource to connect to.
        resource_id: Explicit resource ID to connect to.
    """
    client = Client()

    with console.status(
        "Attempting to configure local client using service connector "
        f"'{name_id_or_prefix}'...\n"
    ):
        try:
            connector = client.login_service_connector(
                name_id_or_prefix=name_id_or_prefix,
                resource_type=resource_type,
                resource_id=resource_id,
            )
        except (
            KeyError,
            ValueError,
            IllegalOperationError,
            NotImplementedError,
            AuthorizationException,
        ) as e:
            cli_utils.error(
                f"Service connector '{name_id_or_prefix}' could not configure "
                f"the local client/SDK: {e}"
            )

        spec = connector.get_type()
        resource_type = resource_type or connector.resource_type
        assert resource_type is not None
        resource_name = spec.resource_type_dict[resource_type].name
        cli_utils.declare(
            f"The '{name_id_or_prefix}' {spec.name} connector was used to "
            f"successfully configure the local {resource_name} client/SDK."
        )
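
A hedged programmatic equivalent of the command above, reusing the Client.login_service_connector call from its body; "my-docker-connector" is a hypothetical connector name:

from zenml.client import Client

connector = Client().login_service_connector(
    name_id_or_prefix="my-docker-connector",
)
# get_type() returns the connector type spec, as used above to report success.
print(f"Local client configured via a '{connector.get_type().name}' connector")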

logs(follow=False, raw=False, tail=None)

Display the logs for a ZenML server.

Parameters:

    follow (bool): Continue to output new log data as it becomes available. Default: False
    tail (Optional[int]): Only show the last NUM lines of log output. Default: None
    raw (bool): Show raw log contents (don't pretty-print logs). Default: False
Source code in src/zenml/cli/server.py
@cli.command("logs", help="Show the logs for the local ZenML server.")
@click.option(
    "--follow",
    "-f",
    is_flag=True,
    help="Continue to output new log data as it becomes available.",
)
@click.option(
    "--tail",
    "-t",
    type=click.INT,
    default=None,
    help="Only show the last NUM lines of log output.",
)
@click.option(
    "--raw",
    "-r",
    is_flag=True,
    help="Show raw log contents (don't pretty-print logs).",
)
def logs(
    follow: bool = False,
    raw: bool = False,
    tail: Optional[int] = None,
) -> None:
    """Display the logs for a ZenML server.

    Args:
        follow: Continue to output new log data as it becomes available.
        tail: Only show the last NUM lines of log output.
        raw: Show raw log contents (don't pretty-print logs).
    """
    server = get_local_server()
    if server is None:
        cli_utils.error(
            "The local ZenML dashboard is not running. Please call `zenml "
            "login --local` first to start the ZenML dashboard locally."
        )

    from zenml.zen_server.deploy.deployer import LocalServerDeployer

    deployer = LocalServerDeployer()

    cli_utils.declare(
        f"Showing logs for the local {server.config.provider} server"
    )

    from zenml.zen_server.deploy.exceptions import (
        ServerDeploymentNotFoundError,
    )

    try:
        logs = deployer.get_server_logs(follow=follow, tail=tail)
    except ServerDeploymentNotFoundError as e:
        cli_utils.error(f"Server not found: {e}")

    for line in logs:
        # don't pretty-print log lines that are already pretty-printed
        if raw or line.startswith("\x1b["):
            console.print(line, markup=False)
        else:
            try:
                console.print(line)
            except MarkupError:
                console.print(line, markup=False)
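
The MarkupError fallback exists because rich interprets square-bracket sequences in log lines as console markup; a minimal sketch of the same defensive pattern, assuming only the rich package (the sample log line is made up):

from rich.console import Console
from rich.errors import MarkupError

console = Console()
line = "ERROR [/bold] stray closing tag copied from another tool"
try:
    console.print(line)                  # rich tries to parse [...] as markup
except MarkupError:
    console.print(line, markup=False)    # print the line verbatim instead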

migrate_database(skip_default_registrations=False)

Migrate the ZenML database.

Parameters:

    skip_default_registrations (bool): If True, registration of default components will be skipped. Default: False
Source code in src/zenml/cli/base.py
@cli.command(
    "migrate-database", help="Migrate the ZenML database.", hidden=True
)
@click.option(
    "--skip_default_registrations",
    is_flag=True,
    default=False,
    help="Skip registering default workspace, user and stack.",
    type=bool,
)
def migrate_database(skip_default_registrations: bool = False) -> None:
    """Migrate the ZenML database.

    Args:
        skip_default_registrations: If `True`, registration of default
            components will be skipped.
    """
    from zenml.zen_stores.base_zen_store import BaseZenStore

    store_config = GlobalConfiguration().store_configuration
    if store_config.type == StoreType.SQL:
        BaseZenStore.create_store(
            store_config, skip_default_registrations=skip_default_registrations
        )
        cli_utils.declare("Database migration finished.")
    else:
        cli_utils.warning(
            "Unable to migrate database while connected to a ZenML server."
        )

model()

Interact with models and model versions in the Model Control Plane.

Source code in src/zenml/cli/model.py
@cli.group(cls=TagGroup, tag=CliCategories.MODEL_CONTROL_PLANE)
def model() -> None:
    """Interact with models and model versions in the Model Control Plane."""

opt_in()

Opt-in to analytics.

Source code in src/zenml/cli/config.py
@analytics.command(
    "opt-in", context_settings=dict(ignore_unknown_options=True)
)
@track_decorator(AnalyticsEvent.OPT_IN_ANALYTICS)
def opt_in() -> None:
    """Opt-in to analytics."""
    gc = GlobalConfiguration()
    gc.analytics_opt_in = True
    cli_utils.declare("Opted in to analytics.")

opt_out()

Opt-out of analytics.

Source code in src/zenml/cli/config.py
@analytics.command(
    "opt-out", context_settings=dict(ignore_unknown_options=True)
)
@track_decorator(AnalyticsEvent.OPT_OUT_ANALYTICS)
def opt_out() -> None:
    """Opt-out of analytics."""
    gc = GlobalConfiguration()
    gc.analytics_opt_in = False
    cli_utils.declare("Opted out of analytics.")

parse_name_and_extra_arguments(args, expand_args=False, name_mandatory=True)

Parse a name and extra arguments from the CLI.

This is a utility function used to parse a variable list of optional CLI arguments of the form --key=value that must also include one mandatory free-form name argument. There is no restriction as to the order of the arguments.

Examples:

>>> parse_name_and_extra_arguments(['foo'])
('foo', {})
>>> parse_name_and_extra_arguments(['foo', '--bar=1'])
('foo', {'bar': '1'})
>>> parse_name_and_extra_arguments(['--bar=1', 'foo', '--baz=2'])
('foo', {'bar': '1', 'baz': '2'})
>>> parse_name_and_extra_arguments(['--bar=1'])
Traceback (most recent call last):
    ...
    ValueError: Missing required argument: name

Parameters:

    args (List[str]): A list of command line arguments from the CLI. Required.
    expand_args (bool): Whether to expand argument values into the contents of the files they may be pointing at using the special @ character. Default: False
    name_mandatory (bool): Whether the name argument is mandatory. Default: True

Returns:

    Tuple[Optional[str], Dict[str, str]]: The name and a dict of parsed args.

Source code in src/zenml/cli/utils.py
def parse_name_and_extra_arguments(
    args: List[str],
    expand_args: bool = False,
    name_mandatory: bool = True,
) -> Tuple[Optional[str], Dict[str, str]]:
    """Parse a name and extra arguments from the CLI.

    This is a utility function used to parse a variable list of optional CLI
    arguments of the form `--key=value` that must also include one mandatory
    free-form name argument. There is no restriction as to the order of the
    arguments.

    Examples:
        >>> parse_name_and_extra_arguments(['foo'])
        ('foo', {})
        >>> parse_name_and_extra_arguments(['foo', '--bar=1'])
        ('foo', {'bar': '1'})
        >>> parse_name_and_extra_arguments(['--bar=1', 'foo', '--baz=2'])
        ('foo', {'bar': '1', 'baz': '2'})
        >>> parse_name_and_extra_arguments(['--bar=1'])
        Traceback (most recent call last):
            ...
            ValueError: Missing required argument: name

    Args:
        args: A list of command line arguments from the CLI.
        expand_args: Whether to expand argument values into the contents of the
            files they may be pointing at using the special `@` character.
        name_mandatory: Whether the name argument is mandatory.

    Returns:
        The name and a dict of parsed args.
    """
    name: Optional[str] = None
    # The name was not supplied as the first argument, we have to
    # search the other arguments for the name.
    for i, arg in enumerate(args):
        if not arg:
            # Skip empty arguments.
            continue
        if arg.startswith("--"):
            continue
        name = args.pop(i)
        break
    else:
        if name_mandatory:
            error(
                "A name must be supplied. Please see the command help for more "
                "information."
            )

    message = (
        "Please provide args with a proper "
        "identifier as the key and the following structure: "
        '--custom_argument="value"'
    )
    args_dict: Dict[str, str] = {}
    for a in args:
        if not a:
            # Skip empty arguments.
            continue
        if not a.startswith("--") or "=" not in a:
            error(f"Invalid argument: '{a}'. {message}")
        key, value = a[2:].split("=", maxsplit=1)
        if not key.isidentifier():
            error(f"Invalid argument: '{a}'. {message}")
        args_dict[key] = value

    if expand_args:
        args_dict = {
            k: expand_argument_value_from_file(k, v)
            for k, v in args_dict.items()
        }

    return name, args_dict
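
A small usage example consistent with the implementation above (the argument values are hypothetical):

from zenml.cli.utils import parse_name_and_extra_arguments

# The free-form name may appear anywhere among the --key=value pairs.
name, extra = parse_name_and_extra_arguments(
    ["--owner=alice", "my-component", "--uri=s3://bucket/path"]
)
assert name == "my-component"
assert extra == {"owner": "alice", "uri": "s3://bucket/path"}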

pipeline()

Interact with pipelines, runs and schedules.

Source code in src/zenml/cli/pipeline.py
@cli.group(cls=TagGroup, tag=CliCategories.MANAGEMENT_TOOLS)
def pipeline() -> None:
    """Interact with pipelines, runs and schedules."""

pretty_print_model_deployer(model_services, model_deployer)

Given a list of served_models, print all associated key-value pairs.

Parameters:

    model_services (List[BaseService]): List of model deployment services. Required.
    model_deployer (BaseModelDeployer): Active model deployer. Required.
Source code in src/zenml/cli/utils.py
def pretty_print_model_deployer(
    model_services: List["BaseService"], model_deployer: "BaseModelDeployer"
) -> None:
    """Given a list of served_models, print all associated key-value pairs.

    Args:
        model_services: list of model deployment services
        model_deployer: Active model deployer
    """
    model_service_dicts = []
    for model_service in model_services:
        dict_uuid = str(model_service.uuid)
        dict_pl_name = model_service.config.pipeline_name
        dict_pl_stp_name = model_service.config.pipeline_step_name
        dict_model_name = model_service.config.model_name
        type = model_service.SERVICE_TYPE.type
        flavor = model_service.SERVICE_TYPE.flavor
        model_service_dicts.append(
            {
                "STATUS": get_service_state_emoji(model_service.status.state),
                "UUID": dict_uuid,
                "TYPE": type,
                "FLAVOR": flavor,
                "PIPELINE_NAME": dict_pl_name,
                "PIPELINE_STEP_NAME": dict_pl_stp_name,
                "MODEL_NAME": dict_model_name,
            }
        )
    print_table(
        model_service_dicts, UUID=table.Column(header="UUID", min_width=36)
    )

pretty_print_secret(secret, hide_secret=True)

Print all key-value pairs associated with a secret.

Parameters:

    secret (Dict[str, str]): Secret values to print. Required.
    hide_secret (bool): Boolean that configures if the secret values are shown on the CLI. Default: True
Source code in src/zenml/cli/utils.py
def pretty_print_secret(
    secret: Dict[str, str],
    hide_secret: bool = True,
) -> None:
    """Print all key-value pairs associated with a secret.

    Args:
        secret: Secret values to print.
        hide_secret: boolean that configures if the secret values are shown
            on the CLI
    """
    title: Optional[str] = None

    def get_secret_value(value: Any) -> str:
        if value is None:
            return ""
        return "***" if hide_secret else str(value)

    stack_dicts = [
        {
            "SECRET_KEY": key,
            "SECRET_VALUE": get_secret_value(value),
        }
        for key, value in secret.items()
    ]

    print_table(stack_dicts, title=title)
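
A short usage example with hypothetical secret values; with hide_secret=True (the default), every value is rendered as "***":

from zenml.cli.utils import pretty_print_secret

pretty_print_secret(
    {"username": "svc-account", "password": "hunter2"},
    hide_secret=True,
)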

print_model_url(url)

Pretty prints a given URL on the CLI.

Parameters:

    url (Optional[str]): Optional str, the URL to display. Required.
Source code in src/zenml/cli/utils.py
def print_model_url(url: Optional[str]) -> None:
    """Pretty prints a given URL on the CLI.

    Args:
        url: optional str, the URL to display.
    """
    if url:
        declare(f"Dashboard URL: {url}")
    else:
        warning(
            "You can display various ZenML entities including pipelines, "
            "runs, stacks and much more on the ZenML Dashboard. "
            "You can try it locally, by running `zenml login --local`, or "
            "remotely, by deploying ZenML on the infrastructure of your choice."
        )

print_page_info(page)

Print all page information showing the number of items and pages.

Parameters:

    page (Page[T]): The page to print the information for. Required.
Source code in src/zenml/cli/utils.py
def print_page_info(page: Page[T]) -> None:
    """Print all page information showing the number of items and pages.

    Args:
        page: The page to print the information for.
    """
    declare(
        f"Page `({page.index}/{page.total_pages})`, `{page.total}` items "
        f"found for the applied filters."
    )

print_served_model_configuration(model_service, model_deployer)

Prints the configuration of a model_service.

Parameters:

    model_service (BaseService): Specific service instance to. Required.
    model_deployer (BaseModelDeployer): Active model deployer. Required.
Source code in src/zenml/cli/utils.py
def print_served_model_configuration(
    model_service: "BaseService", model_deployer: "BaseModelDeployer"
) -> None:
    """Prints the configuration of a model_service.

    Args:
        model_service: Specific service instance to
        model_deployer: Active model deployer
    """
    title_ = f"Properties of Served Model {model_service.uuid}"

    rich_table = table.Table(
        box=box.HEAVY_EDGE,
        title=title_,
        show_lines=True,
    )
    rich_table.add_column("MODEL SERVICE PROPERTY", overflow="fold")
    rich_table.add_column("VALUE", overflow="fold")

    # Get implementation specific info
    served_model_info = model_deployer.get_model_server_info(model_service)

    served_model_info = {
        **served_model_info,
        "UUID": str(model_service.uuid),
        "STATUS": get_service_state_emoji(model_service.status.state),
        "TYPE": model_service.SERVICE_TYPE.type,
        "FLAVOR": model_service.SERVICE_TYPE.flavor,
        "STATUS_MESSAGE": model_service.status.last_error,
        "PIPELINE_NAME": model_service.config.pipeline_name,
        "PIPELINE_STEP_NAME": model_service.config.pipeline_step_name,
    }

    # Sort fields alphabetically
    sorted_items = {k: v for k, v in sorted(served_model_info.items())}

    for item in sorted_items.items():
        rich_table.add_row(*[str(elem) for elem in item])

    # capitalize entries in first column
    rich_table.columns[0]._cells = [
        component.upper()  # type: ignore[union-attr]
        for component in rich_table.columns[0]._cells
    ]
    console.print(rich_table)

print_stacks_table(client, stacks, show_active=False)

Print a prettified list of all stacks supplied to this method.

Parameters:

    client (Client): Repository instance. Required.
    stacks (Sequence[StackResponse]): List of stacks. Required.
    show_active (bool): Flag to decide whether to append the active stack on the top of the list. Default: False
Source code in src/zenml/cli/utils.py
def print_stacks_table(
    client: "Client",
    stacks: Sequence["StackResponse"],
    show_active: bool = False,
) -> None:
    """Print a prettified list of all stacks supplied to this method.

    Args:
        client: Repository instance
        stacks: List of stacks
        show_active: Flag to decide whether to append the active stack on the
            top of the list.
    """
    stack_dicts = []

    stacks = list(stacks)
    active_stack = client.active_stack_model
    if show_active:
        if active_stack.id not in [s.id for s in stacks]:
            stacks.append(active_stack)

        stacks = [s for s in stacks if s.id == active_stack.id] + [
            s for s in stacks if s.id != active_stack.id
        ]

    active_stack_model_id = client.active_stack_model.id
    for stack in stacks:
        is_active = stack.id == active_stack_model_id

        if stack.user:
            user_name = stack.user.name
        else:
            user_name = "-"

        stack_config = {
            "ACTIVE": ":point_right:" if is_active else "",
            "STACK NAME": stack.name,
            "STACK ID": stack.id,
            "OWNER": user_name,
            **{
                component_type.upper(): components[0].name
                for component_type, components in stack.components.items()
            },
        }
        stack_dicts.append(stack_config)

    print_table(stack_dicts)

print_table(obj, title=None, caption=None, **columns)

Prints the list of dicts in a table format.

The input object should be a List of Dicts. Each item in that list represent a line in the Table. Each dict should have the same keys. The keys of the dict will be used as headers of the resulting table.

Parameters:

    obj (List[Dict[str, Any]]): A List containing dictionaries. Required.
    title (Optional[str]): Title of the table. Default: None
    caption (Optional[str]): Caption of the table. Default: None
    **columns (Column): Optional column configurations to be used in the table. Default: {}
Source code in src/zenml/cli/utils.py
def print_table(
    obj: List[Dict[str, Any]],
    title: Optional[str] = None,
    caption: Optional[str] = None,
    **columns: table.Column,
) -> None:
    """Prints the list of dicts in a table format.

    The input object should be a List of Dicts. Each item in that list represent
    a line in the Table. Each dict should have the same keys. The keys of the
    dict will be used as headers of the resulting table.

    Args:
        obj: A List containing dictionaries.
        title: Title of the table.
        caption: Caption of the table.
        columns: Optional column configurations to be used in the table.
    """
    from rich.text import Text

    column_keys = {key: None for dict_ in obj for key in dict_}
    column_names = [columns.get(key, key.upper()) for key in column_keys]
    rich_table = table.Table(
        box=box.HEAVY_EDGE, show_lines=True, title=title, caption=caption
    )
    for col_name in column_names:
        if isinstance(col_name, str):
            rich_table.add_column(str(col_name), overflow="fold")
        else:
            rich_table.add_column(
                str(col_name.header).upper(), overflow="fold"
            )
    for dict_ in obj:
        values = []
        for key in column_keys:
            if key is None:
                values.append(None)
            else:
                v = dict_.get(key) or " "
                if isinstance(v, str) and (
                    v.startswith("http://") or v.startswith("https://")
                ):
                    # Display the URL as a hyperlink in a way that doesn't break
                    # the URL when it needs to be wrapped over multiple lines
                    value: Union[str, Text] = Text(v, style=f"link {v}")
                else:
                    value = str(v)
                    # Escape text when square brackets are used, but allow
                    # links to be decorated as rich style links
                    if "[" in value and "[link=" not in value:
                        value = escape(value)
                values.append(value)
        rich_table.add_row(*values)
    if len(rich_table.columns) > 1:
        rich_table.columns[0].justify = "center"
    console.print(rich_table)
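
A short usage example: each dict is one row, the dict keys become the upper-cased column headers, and values that look like URLs are rendered as clickable links (the rows here are made up):

from zenml.cli.utils import print_table

print_table(
    [
        {"name": "default", "dashboard": "https://example.com/dashboard"},
        {"name": "staging", "dashboard": "https://staging.example.com"},
    ],
    title="Example stacks",
)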

prompt_connector_name(default_name=None, connector=None)

Prompt the user for a service connector name.

Parameters:

    default_name (Optional[str]): The default name to use if the user doesn't provide one. Default: None
    connector (Optional[UUID]): The UUID of a service connector being renamed. Default: None

Returns:

    str: The name provided by the user.

Source code in src/zenml/cli/service_connectors.py
def prompt_connector_name(
    default_name: Optional[str] = None, connector: Optional[UUID] = None
) -> str:
    """Prompt the user for a service connector name.

    Args:
        default_name: The default name to use if the user doesn't provide one.
        connector: The UUID of a service connector being renamed.

    Returns:
        The name provided by the user.
    """
    client = Client()

    while True:
        # Ask for a name
        title = "Please enter a name for the service connector"
        if connector:
            title += " or press Enter to keep the current name"

        name = click.prompt(
            title,
            type=str,
            default=default_name,
        )
        if not name:
            cli_utils.warning("The name cannot be empty")
            continue
        assert isinstance(name, str)

        # Check if the name is taken
        try:
            existing_connector = client.get_service_connector(
                name_id_or_prefix=name, allow_name_prefix_match=False
            )
        except KeyError:
            break
        else:
            if existing_connector.id == connector:
                break
            cli_utils.warning(
                f"A service connector with the name '{name}' already "
                "exists. Please choose a different name."
            )

    return name

prompt_expiration_time(min=None, max=None, default=None)

Prompt the user for an expiration time.

Parameters:

    min (Optional[int]): The minimum allowed expiration time. Default: None
    max (Optional[int]): The maximum allowed expiration time. Default: None
    default (Optional[int]): The default expiration time. Default: None

Returns:

    int: The expiration time provided by the user.

Source code in src/zenml/cli/service_connectors.py
def prompt_expiration_time(
    min: Optional[int] = None,
    max: Optional[int] = None,
    default: Optional[int] = None,
) -> int:
    """Prompt the user for an expiration time.

    Args:
        min: The minimum allowed expiration time.
        max: The maximum allowed expiration time.
        default: The default expiration time.

    Returns:
        The expiration time provided by the user.
    """
    if min is None:
        min = 0
    min_str = f"min: {min} = {seconds_to_human_readable(min)}; "
    if max is not None:
        max_str = str(max)
        max_str = f"max: {max} = {seconds_to_human_readable(max)}"
    else:
        max = -1
        max_str = "max: unlimited"
    if default:
        default_str = (
            f"; default: {default} = {seconds_to_human_readable(default)}"
        )
    else:
        default_str = ""

    while True:
        expiration_seconds = click.prompt(
            "The authentication method involves generating "
            "temporary credentials. Please enter the time that "
            "the credentials should be valid for, in seconds "
            f"({min_str}{max_str}{default_str})",
            type=int,
            default=default,
        )

        assert expiration_seconds is not None
        assert isinstance(expiration_seconds, int)
        if expiration_seconds < min:
            cli_utils.warning(
                f"The expiration time must be at least "
                f"{min} seconds. Please enter a larger value."
            )
            continue
        if max > 0 and expiration_seconds > max:
            cli_utils.warning(
                f"The expiration time must not exceed "
                f"{max} seconds. Please enter a smaller value."
            )
            continue

        confirm = click.confirm(
            f"Credentials will be valid for "
            f"{seconds_to_human_readable(expiration_seconds)}. Keep this "
            "value?",
            default=True,
        )
        if confirm:
            break

    return expiration_seconds

prompt_expires_at(default=None)

Prompt the user for an expiration timestamp.

Parameters:

    default (Optional[datetime]): The default expiration time. Default: None

Returns:

    Optional[datetime]: The expiration time provided by the user.

Source code in src/zenml/cli/service_connectors.py
def prompt_expires_at(
    default: Optional[datetime] = None,
) -> Optional[datetime]:
    """Prompt the user for an expiration timestamp.

    Args:
        default: The default expiration time.

    Returns:
        The expiration time provided by the user.
    """
    if default is None:
        confirm = click.confirm(
            "Are the credentials you configured temporary? If so, you'll be asked "
            "to provide an expiration time in the next step.",
            default=False,
        )
        if not confirm:
            return None

    while True:
        default_str = ""
        if default is not None:
            seconds = int(
                (default - utc_now(tz_aware=default)).total_seconds()
            )
            default_str = (
                f" [{str(default)} i.e. in "
                f"{seconds_to_human_readable(seconds)}]"
            )

        expires_at = click.prompt(
            "Please enter the exact UTC date and time when the credentials "
            f"will expire e.g. '2023-12-31 23:59:59'{default_str}",
            type=click.DateTime(),
            default=default,
            show_default=False,
        )

        assert expires_at is not None
        assert isinstance(expires_at, datetime)
        if expires_at < utc_now(tz_aware=expires_at):
            cli_utils.warning(
                "The expiration time must be in the future. Please enter a "
                "later date and time."
            )
            continue

        seconds = int(
            (expires_at - utc_now(tz_aware=expires_at)).total_seconds()
        )

        confirm = click.confirm(
            f"Credentials will be valid until {str(expires_at)} UTC (i.e. "
            f"in {seconds_to_human_readable(seconds)}. Keep this value?",
            default=True,
        )
        if confirm:
            break

    return expires_at

prompt_resource_id(resource_name, resource_ids)

Prompt the user for a resource ID.

Parameters:

    resource_name (str): The name of the resource. Required.
    resource_ids (List[str]): The list of available resource IDs. Required.

Returns:

    Optional[str]: The resource ID provided by the user.

Source code in src/zenml/cli/service_connectors.py
def prompt_resource_id(
    resource_name: str, resource_ids: List[str]
) -> Optional[str]:
    """Prompt the user for a resource ID.

    Args:
        resource_name: The name of the resource.
        resource_ids: The list of available resource IDs.

    Returns:
        The resource ID provided by the user.
    """
    resource_id: Optional[str] = None
    if resource_ids:
        resource_ids_list = "\n - " + "\n - ".join(resource_ids)
        prompt = (
            f"The following {resource_name} instances "
            "are reachable through this connector:"
            f"{resource_ids_list}\n"
            "Please select one or leave it empty to create a "
            "connector that can be used to access any of them"
        )
        while True:
            # Ask the user to enter an optional resource ID
            resource_id = click.prompt(
                prompt,
                default="",
                type=str,
            )
            if (
                not resource_ids
                or not resource_id
                or resource_id in resource_ids
            ):
                break

            cli_utils.warning(
                f"The selected '{resource_id}' value is not one of "
                "the listed values. Please try again."
            )
    else:
        prompt = (
            "The connector configuration can be used to access "
            f"multiple {resource_name} instances. If you "
            "would like to limit the scope of the connector to one "
            "instance, please enter the name of a particular "
            f"{resource_name} instance. Or leave it "
            "empty to create a multi-instance connector that can "
            f"be used to access any {resource_name}"
        )
        resource_id = click.prompt(
            prompt,
            default="",
            type=str,
        )

    if resource_id == "":
        resource_id = None

    return resource_id

prompt_resource_type(available_resource_types)

Prompt the user for a resource type.

Parameters:

    available_resource_types (List[str]): The list of available resource types. Required.

Returns:

    Optional[str]: The resource type provided by the user.

Source code in src/zenml/cli/service_connectors.py
def prompt_resource_type(available_resource_types: List[str]) -> Optional[str]:
    """Prompt the user for a resource type.

    Args:
        available_resource_types: The list of available resource types.

    Returns:
        The resource type provided by the user.
    """
    resource_type = None
    if len(available_resource_types) == 1:
        # Default to the first resource type if only one type is available
        click.echo(
            "Only one resource type is available for this connector"
            f" ({available_resource_types[0]})."
        )
        resource_type = available_resource_types[0]
    else:
        # Ask the user to select a resource type
        while True:
            resource_type = click.prompt(
                "Please select a resource type or leave it empty to create "
                "a connector that can be used to access any of the "
                "supported resource types "
                f"({', '.join(available_resource_types)}).",
                type=str,
                default="",
            )
            if resource_type and resource_type not in available_resource_types:
                cli_utils.warning(
                    f"The entered resource type '{resource_type}' is not "
                    "one of the listed values. Please try again."
                )
                continue
            break

        if resource_type == "":
            resource_type = None

    return resource_type

prompt_select_resource(resource_list)

Prompts the user to select a resource ID from a list of resources.

Parameters:

    resource_list (List[ServiceConnectorResourcesModel]): List of resources to select from. Required.

Returns:

    Tuple[UUID, str]: The ID of the selected connector and the ID of the selected resource instance.

Source code in src/zenml/cli/stack_components.py
def prompt_select_resource(
    resource_list: List[ServiceConnectorResourcesModel],
) -> Tuple[UUID, str]:
    """Prompts the user to select a resource ID from a list of resources.

    Args:
        resource_list: List of resources to select from.

    Returns:
        The ID of a selected connector and the ID of the selected resource
        instance.
    """
    if len(resource_list) == 1:
        click.echo("Only one connector has compatible resources:")
    else:
        click.echo("The following connectors have compatible resources:")

    cli_utils.print_service_connector_resource_table(resource_list)

    if len(resource_list) == 1:
        connect = click.confirm(
            "Would you like to use this connector?",
            default=True,
        )
        if not connect:
            cli_utils.error("Aborting.")
        resources = resource_list[0]
    else:
        # Prompt the user to select a connector by its name or ID
        while True:
            connector_id = click.prompt(
                "Please enter the name or ID of the connector you want to use",
                type=click.Choice(
                    [
                        str(connector.id)
                        for connector in resource_list
                        if connector.id is not None
                    ]
                    + [
                        connector.name
                        for connector in resource_list
                        if connector.name is not None
                    ]
                ),
                show_choices=False,
            )
            matches = [
                c
                for c in resource_list
                if str(c.id) == connector_id or c.name == connector_id
            ]
            if len(matches) > 1:
                cli_utils.declare(
                    f"Multiple connectors with name '{connector_id}' "
                    "were found. Please try again."
                )
            else:
                resources = matches[0]
                break

    connector_uuid = resources.id
    assert connector_uuid is not None

    assert len(resources.resources) == 1
    resource_name = resources.resources[0].resource_type
    if not isinstance(resources.connector_type, str):
        resource_type_spec = resources.connector_type.resource_type_dict[
            resource_name
        ]
        resource_name = resource_type_spec.name

    resource_id = prompt_select_resource_id(
        resources.resources[0].resource_ids or [], resource_name=resource_name
    )

    return connector_uuid, resource_id

prompt_select_resource_id(resource_ids, resource_name, interactive=True)

Prompts the user to select a resource ID from a list of available IDs.

Parameters:

    resource_ids (List[str], required): A list of available resource IDs.
    resource_name (str, required): The name of the resource type to select.
    interactive (bool, default: True): Whether to prompt the user for input or error out if user input is required.

Returns:

    str: The selected resource ID.

Source code in src/zenml/cli/stack_components.py
def prompt_select_resource_id(
    resource_ids: List[str],
    resource_name: str,
    interactive: bool = True,
) -> str:
    """Prompts the user to select a resource ID from a list of available IDs.

    Args:
        resource_ids: A list of available resource IDs.
        resource_name: The name of the resource type to select.
        interactive: Whether to prompt the user for input or error out if
            user input is required.

    Returns:
        The selected resource ID.
    """
    if len(resource_ids) == 1:
        # Only one resource ID is available, so we can select it
        # without prompting the user
        return resource_ids[0]

    if len(resource_ids) > 1:
        resource_ids_list = "\n - " + "\n - ".join(resource_ids)
        msg = (
            f"Multiple {resource_name} resources are available for the "
            f"selected connector:\n{resource_ids_list}\n"
        )
        # User needs to select a resource ID from the list
        if not interactive:
            cli_utils.error(
                f"{msg}Please use the `--resource-id` command line "  # nosec
                f"argument to select a {resource_name} resource from the "
                "list."
            )
        resource_id = click.prompt(
            f"{msg}Please select the {resource_name} that you want to use",
            type=click.Choice(resource_ids),
            show_choices=False,
        )

        return cast(str, resource_id)

    # We should never get here, but just in case...
    cli_utils.error(
        "Could not determine which resource to use. Please select a "
        "different connector."
    )

prune_artifacts(only_artifact=False, only_metadata=False, yes=False, ignore_errors=False)

Delete all unused artifacts and artifact versions.

Unused artifact versions are those that are no longer referenced by any pipeline runs. Similarly, unused artifacts are those that no longer have any used artifact versions.

Parameters:

    only_artifact (bool, default: False): If set, only delete the actual artifact object from the artifact store but keep the metadata.
    only_metadata (bool, default: False): If set, only delete metadata and not the actual artifact objects stored in the artifact store.
    yes (bool, default: False): If set, don't ask for confirmation.
    ignore_errors (bool, default: False): If set, ignore errors and continue with the next artifact version.
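
This command is registered as prune on the artifact group, so it is typically invoked as zenml artifact prune. As an illustrative invocation (the flag combination is just an example), cleaning up only the metadata without a confirmation prompt would look like:

zenml artifact prune --only-metadata --yes

Adding --ignore-errors lets the command continue past artifact versions that fail to delete.
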
Source code in src/zenml/cli/artifact.py
@artifact.command(
    "prune",
    help=(
        "Delete all unused artifacts and artifact versions that are no longer "
        "referenced by any pipeline runs."
    ),
)
@click.option(
    "--only-artifact",
    "-a",
    is_flag=True,
    help=(
        "Only delete the actual artifact object from the artifact store but "
        "keep the metadata."
    ),
)
@click.option(
    "--only-metadata",
    "-m",
    is_flag=True,
    help=(
        "Only delete metadata and not the actual artifact object stored in "
        "the artifact store."
    ),
)
@click.option(
    "--yes",
    "-y",
    is_flag=True,
    help="Don't ask for confirmation.",
)
@click.option(
    "--ignore-errors",
    "-i",
    is_flag=True,
    help="Ignore errors and continue with the next artifact version.",
)
def prune_artifacts(
    only_artifact: bool = False,
    only_metadata: bool = False,
    yes: bool = False,
    ignore_errors: bool = False,
) -> None:
    """Delete all unused artifacts and artifact versions.

    Unused artifact versions are those that are no longer referenced by any
    pipeline runs. Similarly, unused artifacts are those that no longer have
    any used artifact versions.

    Args:
        only_artifact: If set, only delete the actual artifact object from the
            artifact store but keep the metadata.
        only_metadata: If set, only delete metadata and not the actual artifact
            objects stored in the artifact store.
        yes: If set, don't ask for confirmation.
        ignore_errors: If set, ignore errors and continue with the next
            artifact version.
    """
    client = Client()
    unused_artifact_versions = depaginate(
        client.list_artifact_versions, only_unused=True
    )

    if not unused_artifact_versions:
        cli_utils.declare("No unused artifact versions found.")
        return

    if not yes:
        confirmation = cli_utils.confirmation(
            f"Found {len(unused_artifact_versions)} unused artifact versions. "
            f"Do you want to delete them?"
        )
        if not confirmation:
            cli_utils.declare("Artifact deletion canceled.")
            return

    for unused_artifact_version in unused_artifact_versions:
        try:
            Client().delete_artifact_version(
                name_id_or_prefix=unused_artifact_version.id,
                delete_metadata=not only_artifact,
                delete_from_artifact_store=not only_metadata,
            )
            unused_artifact = unused_artifact_version.artifact
            if not unused_artifact.versions and not only_artifact:
                Client().delete_artifact(unused_artifact.id)

        except Exception as e:
            if ignore_errors:
                cli_utils.warning(
                    f"Failed to delete artifact version {unused_artifact_version.id}: {str(e)}"
                )
            else:
                cli_utils.error(
                    f"Failed to delete artifact version {unused_artifact_version.id}: {str(e)}"
                )
    cli_utils.declare("All unused artifacts and artifact versions deleted.")

read_yaml(file_path)

Read the YAML file at the given path and return its contents as a dict.

Parameters:

    file_path (str, required): Path to the YAML file.

Returns:

    Any: Contents of the file in a dict.

Raises:

    FileNotFoundError: If the file does not exist.

Source code in src/zenml/utils/yaml_utils.py
def read_yaml(file_path: str) -> Any:
    """Read YAML on file path and returns contents as dict.

    Args:
        file_path: Path to YAML file.

    Returns:
        Contents of the file in a dict.

    Raises:
        FileNotFoundError: if file does not exist.
    """
    if fileio.exists(file_path):
        contents = io_utils.read_file_contents_as_string(file_path)
        # TODO: [LOW] consider adding a default empty dict to be returned
        #   instead of None
        return yaml.safe_load(contents)
    else:
        raise FileNotFoundError(f"{file_path} does not exist.")

refresh_pipeline_run(run_name_or_id)

Refresh the status of a pipeline run.

Parameters:

    run_name_or_id (str, required): The name or ID of the pipeline run to refresh.
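
Assuming the runs group is mounted under zenml pipeline runs (the usual layout of the pipeline CLI), refreshing the status of a run would look like:

zenml pipeline runs refresh <run_name_or_id>
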
Source code in src/zenml/cli/pipeline.py
@runs.command("refresh")
@click.argument("run_name_or_id", type=str, required=True)
def refresh_pipeline_run(run_name_or_id: str) -> None:
    """Refresh the status of a pipeline run.

    Args:
        run_name_or_id: The name or ID of the pipeline run to refresh.
    """
    try:
        # Fetch and update the run
        run = Client().get_pipeline_run(name_id_or_prefix=run_name_or_id)
        run.refresh_run_status()

    except KeyError as e:
        cli_utils.error(str(e))
    else:
        cli_utils.declare(
            f"Refreshed the status of pipeline run '{run.name}'."
        )

register_all_stack_component_cli_commands()

Registers CLI commands for all stack components.

Source code in src/zenml/cli/stack_components.py
def register_all_stack_component_cli_commands() -> None:
    """Registers CLI commands for all stack components."""
    for component_type in StackComponentType:
        register_single_stack_component_cli_commands(
            component_type, parent_group=cli
        )

register_annotator_subcommands()

Registers CLI subcommands for the annotator.
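
Once an annotator is part of the active stack, the subcommands registered here live under zenml annotator dataset. As a quick sketch (the dataset name is a placeholder):

zenml annotator dataset list
zenml annotator dataset stats <dataset_name>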

Source code in src/zenml/cli/annotator.py
def register_annotator_subcommands() -> None:
    """Registers CLI subcommands for the annotator."""
    annotator_group = cast(TagGroup, cli.commands.get("annotator"))
    if not annotator_group:
        return

    @annotator_group.group(
        cls=TagGroup,
        help="Commands for interacting with annotation datasets.",
    )
    @click.pass_context
    def dataset(ctx: click.Context) -> None:
        """Interact with ZenML annotator datasets.

        Args:
            ctx: The click Context object.
        """
        from zenml.client import Client

        annotator_models = Client().active_stack_model.components.get(
            StackComponentType.ANNOTATOR
        )
        if annotator_models is None:
            cli_utils.error(
                "No active annotator found. Please register an annotator "
                "first and add it to your stack."
            )
            return

        from zenml.stack.stack_component import StackComponent

        ctx.obj = StackComponent.from_model(annotator_models[0])

    @dataset.command(
        "list",
        help="List the available datasets.",
    )
    @click.pass_obj
    def dataset_list(annotator: "BaseAnnotator") -> None:
        """List the available datasets.

        Args:
            annotator: The annotator stack component.
        """
        dataset_names = annotator.get_dataset_names()
        if not dataset_names:
            cli_utils.warning("No datasets found.")
            return
        cli_utils.print_list_items(
            list_items=dataset_names,
            column_title="DATASETS",
        )

    @dataset.command("stats")
    @click.argument("dataset_name", type=click.STRING)
    @click.pass_obj
    def dataset_stats(annotator: "BaseAnnotator", dataset_name: str) -> None:
        """Display statistics about a dataset.

        Args:
            annotator: The annotator stack component.
            dataset_name: The name of the dataset.
        """
        try:
            stats = annotator.get_dataset_stats(dataset_name)
            labeled_task_count, unlabeled_task_count = stats
        except IndexError:
            cli_utils.error(
                f"Dataset {dataset_name} does not exist. Please use `zenml "
                f"annotator dataset list` to list the available datasets."
            )
            return

        total_task_count = unlabeled_task_count + labeled_task_count
        cli_utils.declare(
            f"Annotation stats for '{dataset_name}' dataset:", bold=True
        )
        cli_utils.declare(f"Total annotation tasks: {total_task_count}")
        cli_utils.declare(f"Labeled annotation tasks: {labeled_task_count}")
        if annotator.flavor != "prodigy":
            # Prodigy doesn't allow you to get the unlabeled task count
            cli_utils.declare(
                f"Unlabeled annotation tasks: {unlabeled_task_count}"
            )

    @dataset.command("delete")
    @click.argument("dataset_name", type=click.STRING)
    @click.option(
        "--all",
        "-a",
        "all_",
        is_flag=True,
        help="Use this flag to delete all datasets.",
        type=click.BOOL,
    )
    @click.pass_obj
    def dataset_delete(
        annotator: "BaseAnnotator", dataset_name: str, all_: bool
    ) -> None:
        """Delete a dataset.

        If the --all flag is used, all datasets will be deleted.

        Args:
            annotator: The annotator stack component.
            dataset_name: Name of the dataset to delete.
            all_: Whether to delete all datasets.
        """
        if not cli_utils.confirmation(
            f"Are you sure you want to delete dataset '{dataset_name}'?"
        ):
            return
        cli_utils.declare(f"Deleting your dataset '{dataset_name}'")
        dataset_names = (
            annotator.get_dataset_names() if all_ else [dataset_name]
        )
        for dataset_name in dataset_names:
            try:
                annotator.delete_dataset(dataset_name=dataset_name)
                cli_utils.declare(
                    f"Dataset '{dataset_name}' has now been deleted."
                )
            except ValueError as e:
                cli_utils.error(
                    f"Failed to delete dataset '{dataset_name}': {e}"
                )

    @dataset.command(
        "annotate", context_settings={"ignore_unknown_options": True}
    )
    @click.argument("dataset_name", type=click.STRING)
    @click.argument("kwargs", nargs=-1, type=click.UNPROCESSED)
    @click.pass_obj
    def dataset_annotate(
        annotator: "BaseAnnotator",
        dataset_name: str,
        kwargs: Tuple[str, ...],
    ) -> None:
        """Command to launch the annotation interface for a dataset.

        Args:
            annotator: The annotator stack component.
            dataset_name: Name of the dataset
            kwargs: Additional keyword arguments to pass to the
                annotation client.

        Raises:
            ValueError: If the dataset does not exist.
        """
        cli_utils.declare(
            f"Launching the annotation interface for dataset '{dataset_name}'."
        )

        # Process the arbitrary keyword arguments
        kwargs_dict = {}
        for arg in kwargs:
            if arg.startswith("--"):
                key, value = arg.removeprefix("--").split("=", 1)
                kwargs_dict[key] = value

        if annotator.flavor == "prodigy":
            command = kwargs_dict.get("command")
            if not command:
                raise ValueError(
                    "The 'command' keyword argument is required for launching the Prodigy interface."
                )
            annotator.launch(**kwargs_dict)
        else:
            try:
                annotator.get_dataset(dataset_name=dataset_name)
                annotator.launch(
                    url=annotator.get_url_for_dataset(dataset_name)
                )
            except ValueError as e:
                raise ValueError("Dataset does not exist.") from e

register_code_repository(name, type_, source_path, description, logo_url, args)

Register a code repository.

Register a code repository with ZenML. This will allow ZenML to pull code from a remote repository and use it when running pipelines remotely. The configuration of the code repository can be different depending on the type of code repository. For more information, please refer to the documentation.

Parameters:

    name (str, required): Name of the code repository.
    type_ (str, required): Type of the code repository.
    source_path (Optional[str], required): Path to the source module if type is custom.
    description (Optional[str], required): The code repository description.
    logo_url (Optional[str], required): URL of a logo (png, jpg or svg) for the code repository.
    args (List[str], required): Additional arguments to be passed to the code repository.
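
As a usage sketch: a custom repository type requires the --source option, while built-in types take their configuration as additional key=value arguments (the keys below are placeholders, not an exhaustive list):

zenml code-repository register <name> --type=custom --source=<path.to.RepositoryClass>
zenml code-repository register <name> --type=github --description="<description>" <config_key>=<config_value>
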
Source code in src/zenml/cli/code_repository.py
@code_repository.command(
    "register",
    context_settings={"ignore_unknown_options": True},
    help="Register a code repository.",
)
@click.argument("name", type=str)
@click.option(
    "--type",
    "-t",
    "type_",
    type=click.Choice(["github", "gitlab", "custom"]),
    required=True,
    help="Type of the code repository.",
)
@click.option(
    "--source",
    "-s",
    "source_path",
    type=str,
    required=False,
    help="Module containing the code repository implementation if type is custom.",
)
@click.option(
    "--description",
    "-d",
    type=str,
    required=False,
    help="The code repository description.",
)
@click.option(
    "--logo-url",
    "-l",
    type=str,
    required=False,
    help="URL of a logo (png, jpg or svg) for the code repository.",
)
@click.argument(
    "args",
    nargs=-1,
    type=click.UNPROCESSED,
)
def register_code_repository(
    name: str,
    type_: str,
    source_path: Optional[str],
    description: Optional[str],
    logo_url: Optional[str],
    args: List[str],
) -> None:
    """Register a code repository.

    Register a code repository with ZenML. This will allow ZenML to pull
    code from a remote repository and use it when running pipelines remotely.
    The configuration of the code repository can be different depending on the
    type of code repository. For more information, please refer to the
    documentation.

    Args:
        name: Name of the code repository
        type_: Type of the code repository
        source_path: Path to the source module if type is custom
        description: The code repository description.
        logo_url: URL of a logo (png, jpg or svg) for the code repository.
        args: Additional arguments to be passed to the code repository
    """
    parsed_name, parsed_args = cli_utils.parse_name_and_extra_arguments(
        list(args) + [name], expand_args=True
    )
    assert parsed_name
    name = parsed_name

    if type_ == "github":
        try:
            from zenml.integrations.github.code_repositories import (
                GitHubCodeRepository,
            )
        except ImportError:
            cli_utils.error(
                "You need to install the GitHub integration to use a GitHub "
                "code repository. Please run `zenml integration install "
                "github` and try again."
            )
        source = source_utils.resolve(GitHubCodeRepository)
    elif type_ == "gitlab":
        try:
            from zenml.integrations.gitlab.code_repositories import (
                GitLabCodeRepository,
            )
        except ImportError:
            cli_utils.error(
                "You need to install the GitLab integration to use a GitLab "
                "code repository. Please run `zenml integration install "
                "gitlab` and try again."
            )
        source = source_utils.resolve(GitLabCodeRepository)
    elif type_ == "custom":
        if not source_path:
            cli_utils.error(
                "When using a custom code repository type, you need to provide "
                "a path to the implementation class using the --source option: "
                "`zenml code-repository register --type=custom --source=<...>"
            )
        if not source_utils.validate_source_class(
            source_path, expected_class=BaseCodeRepository
        ):
            cli_utils.error(
                f"Your source {source_path} does not point to a "
                f"`{BaseCodeRepository.__name__}` subclass and can't be used "
                "to register a code repository."
            )

        source = Source.from_import_path(source_path)

    with console.status(f"Registering code repository '{name}'...\n"):
        Client().create_code_repository(
            name=name,
            config=parsed_args,
            source=source,
            description=description,
            logo_url=logo_url,
        )

        cli_utils.declare(f"Successfully registered code repository `{name}`.")

register_feature_store_subcommands()

Registers CLI subcommands for the Feature Store.
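
These subcommands are registered on a feature group under the feature-store command, so with a feature store in the active stack the invocations would look like:

zenml feature-store feature get-entities
zenml feature-store feature get-feature-views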

Source code in src/zenml/cli/feature.py
def register_feature_store_subcommands() -> None:
    """Registers CLI subcommands for the Feature Store."""
    feature_store_group = cast(TagGroup, cli.commands.get("feature-store"))
    if not feature_store_group:
        return

    @feature_store_group.group(
        cls=TagGroup,
        help="Commands for interacting with your features.",
    )
    @click.pass_context
    def feature(ctx: click.Context) -> None:
        """Features as obtained from a feature store.

        Args:
            ctx: The click context.
        """
        from zenml.client import Client
        from zenml.stack.stack_component import StackComponent

        client = Client()
        feature_store_models = client.active_stack_model.components[
            StackComponentType.FEATURE_STORE
        ]
        if feature_store_models is None:
            error(
                "No active feature store found. Please create a feature store "
                "first and add it to your stack."
            )
            return
        ctx.obj = StackComponent.from_model(feature_store_models[0])

    @feature.command("get-data-sources")
    @click.pass_obj
    def get_data_sources(feature_store: "BaseFeatureStore") -> None:
        """Get all data sources from the feature store.

        Args:
            feature_store: The feature store.
        """
        data_sources = feature_store.get_data_sources()  # type: ignore[attr-defined]
        declare(f"Data sources: {data_sources}")

    @feature.command("get-entities")
    @click.pass_obj
    def get_entities(feature_store: "BaseFeatureStore") -> None:
        """Get all entities from the feature store.

        Args:
            feature_store: The feature store.
        """
        entities = feature_store.get_entities()  # type: ignore[attr-defined]
        declare(f"Entities: {entities}")

    @feature.command("get-feature-services")
    @click.pass_obj
    def get_feature_services(feature_store: "BaseFeatureStore") -> None:
        """Get all feature services from the feature store.

        Args:
            feature_store: The feature store.
        """
        feature_services = feature_store.get_feature_services()  # type: ignore[attr-defined]
        declare(f"Feature services: {feature_services}")

    @feature.command("get-feature-views")
    @click.pass_obj
    def get_feature_views(feature_store: "BaseFeatureStore") -> None:
        """Get all feature views from the feature store.

        Args:
            feature_store: The feature store.
        """
        feature_views = feature_store.get_feature_views()  # type: ignore[attr-defined]
        declare(f"Feature views: {feature_views}")

    @feature.command("get-project")
    @click.pass_obj
    def get_project(feature_store: "BaseFeatureStore") -> None:
        """Get the current project name from the feature store.

        Args:
            feature_store: The feature store.
        """
        project = feature_store.get_project()  # type: ignore[attr-defined]
        declare(f"Project name: {project}")

    @feature.command("get-feast-version")
    @click.pass_obj
    def get_feast_version(feature_store: "BaseFeatureStore") -> None:
        """Get the current Feast version being used.

        Args:
            feature_store: The feature store.
        """
        version = feature_store.get_feast_version()  # type: ignore[attr-defined]
        declare(f"Feast version: {version}")

register_model(name, license, description, audience, use_cases, tradeoffs, ethical, limitations, tag, save_models_to_registry)

Register a new model in the Model Control Plane.

Parameters:

    name (str, required): The name of the model.
    license (Optional[str], required): The license under which the model was created.
    description (Optional[str], required): The description of the model.
    audience (Optional[str], required): The target audience of the model.
    use_cases (Optional[str], required): The use cases of the model.
    tradeoffs (Optional[str], required): The tradeoffs of the model.
    ethical (Optional[str], required): The ethical implications of the model.
    limitations (Optional[str], required): The known limitations of the model.
    tag (Optional[List[str]], required): Tags associated with the model.
    save_models_to_registry (Optional[bool], required): Whether to save the model to the registry.
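
This is the zenml model register command. A minimal invocation with a few of the optional fields (all values are placeholders) might look like:

zenml model register --name <model_name> --license <license> --tag <tag> --tag <another_tag>
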
Source code in src/zenml/cli/model.py
@model.command("register", help="Register a new model.")
@click.option(
    "--name",
    "-n",
    help="The name of the model.",
    type=str,
    required=True,
)
@click.option(
    "--license",
    "-l",
    help="The license under which the model is created.",
    type=str,
    required=False,
)
@click.option(
    "--description",
    "-d",
    help="The description of the model.",
    type=str,
    required=False,
)
@click.option(
    "--audience",
    "-a",
    help="The target audience for the model.",
    type=str,
    required=False,
)
@click.option(
    "--use-cases",
    "-u",
    help="The use cases of the model.",
    type=str,
    required=False,
)
@click.option(
    "--tradeoffs",
    help="The tradeoffs of the model.",
    type=str,
    required=False,
)
@click.option(
    "--ethical",
    "-e",
    help="The ethical implications of the model.",
    type=str,
    required=False,
)
@click.option(
    "--limitations",
    help="The known limitations of the model.",
    type=str,
    required=False,
)
@click.option(
    "--tag",
    "-t",
    help="Tags associated with the model.",
    type=str,
    required=False,
    multiple=True,
)
@click.option(
    "--save-models-to-registry",
    "-s",
    help="Whether to automatically save model artifacts to the model registry.",
    type=click.BOOL,
    required=False,
    default=True,
)
def register_model(
    name: str,
    license: Optional[str],
    description: Optional[str],
    audience: Optional[str],
    use_cases: Optional[str],
    tradeoffs: Optional[str],
    ethical: Optional[str],
    limitations: Optional[str],
    tag: Optional[List[str]],
    save_models_to_registry: Optional[bool],
) -> None:
    """Register a new model in the Model Control Plane.

    Args:
        name: The name of the model.
        license: The license under which the model was created.
        description: The description of the model.
        audience: The target audience of the model.
        use_cases: The use cases of the model.
        tradeoffs: The tradeoffs of the model.
        ethical: The ethical implications of the model.
        limitations: The known limitations of the model.
        tag: Tags associated with the model.
        save_models_to_registry: Whether to save the model to the
            registry.
    """
    try:
        model = Client().create_model(
            **remove_none_values(
                dict(
                    name=name,
                    license=license,
                    description=description,
                    audience=audience,
                    use_cases=use_cases,
                    trade_offs=tradeoffs,
                    ethics=ethical,
                    limitations=limitations,
                    tags=tag,
                    save_models_to_registry=save_models_to_registry,
                )
            )
        )
    except (EntityExistsError, ValueError) as e:
        cli_utils.error(str(e))

    cli_utils.print_table([_model_to_print(model)])

register_model_deployer_subcommands()

Registers CLI subcommands for the Model Deployer.
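
With a model deployer in the active stack, the models group registered here is reachable as zenml model-deployer models. For example, to list running model servers and then follow the logs of one of them (the UUID is a placeholder):

zenml model-deployer models list --running
zenml model-deployer models logs <served_model_uuid> --follow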

Source code in src/zenml/cli/served_model.py
def register_model_deployer_subcommands() -> None:  # noqa: C901
    """Registers CLI subcommands for the Model Deployer."""
    model_deployer_group = cast(TagGroup, cli.commands.get("model-deployer"))
    if not model_deployer_group:
        return

    @model_deployer_group.group(
        cls=TagGroup,
        help="Commands for interacting with served models.",
    )
    @click.pass_context
    def models(ctx: click.Context) -> None:
        """List and manage served models with the active model deployer.

        Args:
            ctx: The click context.
        """
        from zenml.client import Client
        from zenml.stack.stack_component import StackComponent

        client = Client()
        model_deployer_models = client.active_stack_model.components.get(
            StackComponentType.MODEL_DEPLOYER
        )
        if model_deployer_models is None:
            error(
                "No active model deployer found. Please add a model_deployer "
                "to your stack."
            )
            return
        ctx.obj = StackComponent.from_model(model_deployer_models[0])

    @models.command(
        "list",
        help="Get a list of all served models within the model-deployer stack "
        "component.",
    )
    @click.option(
        "--step",
        "-s",
        type=click.STRING,
        default=None,
        help="Show only served models that were deployed by the indicated "
        "pipeline step.",
    )
    @click.option(
        "--pipeline-run-id",
        "-r",
        type=click.STRING,
        default=None,
        help="Show only served models that were deployed by the indicated "
        "pipeline run.",
    )
    @click.option(
        "--pipeline-name",
        "-p",
        type=click.STRING,
        default=None,
        help="Show only served models that were deployed by the indicated "
        "pipeline.",
    )
    @click.option(
        "--model",
        "-m",
        type=click.STRING,
        default=None,
        help="Show only served model versions for the given model name.",
    )
    @click.option(
        "--model-version",
        "-v",
        type=click.STRING,
        default=None,
        help="Show only served model versions for the given model version.",
    )
    @click.option(
        "--flavor",
        "-f",
        type=click.STRING,
        default=None,
        help="Show only served model versions for the given model flavor.",
    )
    @click.option(
        "--running",
        is_flag=True,
        help="Show only model servers that are currently running.",
    )
    @click.pass_obj
    def list_models(
        model_deployer: "BaseModelDeployer",
        step: Optional[str],
        pipeline_name: Optional[str],
        pipeline_run_id: Optional[str],
        model: Optional[str],
        model_version: Optional[str],
        flavor: Optional[str],
        running: bool,
    ) -> None:
        """List of all served models within the model-deployer stack component.

        Args:
            model_deployer: The model-deployer stack component.
            step: Show only served models that were deployed by the indicated
                pipeline step.
            pipeline_run_id: Show only served models that were deployed by the
                indicated pipeline run.
            pipeline_name: Show only served models that were deployed by the
                indicated pipeline.
            model: Show only served model versions for the given model name.
            running: Show only model servers that are currently running.
            model_version: Show only served model versions for the given model
                version.
            flavor: Show only served model versions for the given model flavor.
        """
        services = model_deployer.find_model_server(
            running=running,
            pipeline_name=pipeline_name,
            pipeline_run_id=pipeline_run_id if pipeline_run_id else None,
            pipeline_step_name=step,
            model_name=model,
            model_version=model_version,
            flavor=flavor,
        )
        if services:
            pretty_print_model_deployer(
                services,
                model_deployer,
            )
        else:
            warning("No served models found.")

    @models.command("describe", help="Describe a specified served model.")
    @click.argument("served_model_uuid", type=click.STRING)
    @click.pass_obj
    def describe_model(
        model_deployer: "BaseModelDeployer", served_model_uuid: str
    ) -> None:
        """Describe a specified served model.

        Args:
            model_deployer: The model-deployer stack component.
            served_model_uuid: The UUID of the served model.
        """
        served_models = model_deployer.find_model_server(
            service_uuid=uuid.UUID(served_model_uuid)
        )
        if served_models:
            print_served_model_configuration(served_models[0], model_deployer)
            return
        warning(f"No model with uuid: '{served_model_uuid}' could be found.")
        return

    @models.command(
        "get-url",
        help="Return the prediction URL to a specified model server.",
    )
    @click.argument("served_model_uuid", type=click.STRING)
    @click.pass_obj
    def get_url(
        model_deployer: "BaseModelDeployer", served_model_uuid: str
    ) -> None:
        """Return the prediction URL to a specified model server.

        Args:
            model_deployer: The model-deployer stack component.
            served_model_uuid: The UUID of the served model.
        """
        served_models = model_deployer.find_model_server(
            service_uuid=uuid.UUID(served_model_uuid)
        )
        if served_models:
            try:
                prediction_url = model_deployer.get_model_server_info(
                    served_models[0]
                ).get("PREDICTION_URL")
                prediction_hostname = (
                    model_deployer.get_model_server_info(served_models[0]).get(
                        "PREDICTION_HOSTNAME"
                    )
                    or "No hostname specified for this service"
                )
                prediction_apis_urls = (
                    model_deployer.get_model_server_info(served_models[0]).get(
                        "PREDICTION_APIS_URLS"
                    )
                    or "No prediction APIs URLs specified for this service"
                )
                declare(
                    f"  Prediction URL of Served Model {served_model_uuid} "
                    f"is:\n"
                    f"  {prediction_url}\n"
                    f"  and the hostname is: {prediction_hostname}\n"
                    f"  and the prediction APIs URLs are: {prediction_apis_urls}\n"
                )
            except KeyError:
                warning("The deployed model instance has no 'prediction_url'.")
            return
        warning(f"No model with uuid: '{served_model_uuid}' could be found.")
        return

    @models.command("start", help="Start a specified model server.")
    @click.argument("served_model_uuid", type=click.STRING)
    @click.option(
        "--timeout",
        "-t",
        type=click.INT,
        default=300,
        help="Time in seconds to wait for the model to start. Set to 0 to "
        "return immediately after telling the server to start, without "
        "waiting for it to become fully active (default: 300s).",
    )
    @click.pass_obj
    def start_model_service(
        model_deployer: "BaseModelDeployer",
        served_model_uuid: str,
        timeout: int,
    ) -> None:
        """Start a specified model server.

        Args:
            model_deployer: The model-deployer stack component.
            served_model_uuid: The UUID of the served model.
            timeout: Time in seconds to wait for the model to start.
        """
        served_models = model_deployer.find_model_server(
            service_uuid=uuid.UUID(served_model_uuid)
        )
        if served_models:
            model_deployer.start_model_server(
                served_models[0].uuid, timeout=timeout
            )
            declare(f"Model server {served_models[0]} was started.")
            return

        warning(f"No model with uuid: '{served_model_uuid}' could be found.")
        return

    @models.command("stop", help="Stop a specified model server.")
    @click.argument("served_model_uuid", type=click.STRING)
    @click.option(
        "--timeout",
        "-t",
        type=click.INT,
        default=300,
        help="Time in seconds to wait for the model to stop. Set to 0 to "
        "return immediately after telling the server to stop, without "
        "waiting for it to become inactive (default: 300s).",
    )
    @click.option(
        "--yes",
        "-y",
        "force",
        is_flag=True,
        help="Force the model server to stop. This will bypass any graceful "
        "shutdown processes and try to force the model server to stop "
        "immediately, if possible.",
    )
    @click.pass_obj
    def stop_model_service(
        model_deployer: "BaseModelDeployer",
        served_model_uuid: str,
        timeout: int,
        force: bool,
    ) -> None:
        """Stop a specified model server.

        Args:
            model_deployer: The model-deployer stack component.
            served_model_uuid: The UUID of the served model.
            timeout: Time in seconds to wait for the model to stop.
            force: Force the model server to stop.
        """
        served_models = model_deployer.find_model_server(
            service_uuid=uuid.UUID(served_model_uuid)
        )
        if served_models:
            model_deployer.stop_model_server(
                served_models[0].uuid, timeout=timeout, force=force
            )
            declare(f"Model server {served_models[0]} was stopped.")
            return

        warning(f"No model with uuid: '{served_model_uuid}' could be found.")
        return

    @models.command("delete", help="Delete a specified model server.")
    @click.argument("served_model_uuid", type=click.STRING)
    @click.option(
        "--timeout",
        "-t",
        type=click.INT,
        default=300,
        help="Time in seconds to wait for the model to be deleted. Set to 0 to "
        "return immediately after stopping and deleting the model server, "
        "without waiting for it to release all allocated resources.",
    )
    @click.option(
        "--yes",
        "-y",
        "force",
        is_flag=True,
        help="Force the model server to stop and delete. This will bypass any "
        "graceful shutdown processes and try to force the model server to "
        "stop and delete immediately, if possible.",
    )
    @click.pass_obj
    def delete_model_service(
        model_deployer: "BaseModelDeployer",
        served_model_uuid: str,
        timeout: int,
        force: bool,
    ) -> None:
        """Delete a specified model server.

        Args:
            model_deployer: The model-deployer stack component.
            served_model_uuid: The UUID of the served model.
            timeout: Time in seconds to wait for the model to be deleted.
            force: Force the model server to stop and delete.
        """
        served_models = model_deployer.find_model_server(
            service_uuid=uuid.UUID(served_model_uuid)
        )
        if served_models:
            model_deployer.delete_model_server(
                served_models[0].uuid, timeout=timeout, force=force
            )
            declare(f"Model server {served_models[0]} was deleted.")
            return

        warning(f"No model with uuid: '{served_model_uuid}' could be found.")
        return

    @models.command("logs", help="Show the logs for a model server.")
    @click.argument("served_model_uuid", type=click.STRING)
    @click.option(
        "--follow",
        "-f",
        is_flag=True,
        help="Continue to output new log data as it becomes available.",
    )
    @click.option(
        "--tail",
        "-t",
        type=click.INT,
        default=None,
        help="Only show the last NUM lines of log output.",
    )
    @click.option(
        "--raw",
        "-r",
        is_flag=True,
        help="Show raw log contents (don't pretty-print logs).",
    )
    @click.pass_obj
    def get_model_service_logs(
        model_deployer: "BaseModelDeployer",
        served_model_uuid: str,
        follow: bool,
        tail: Optional[int],
        raw: bool,
    ) -> None:
        """Display the logs for a model server.

        Args:
            model_deployer: The model-deployer stack component.
            served_model_uuid: The UUID of the served model.
            follow: Continue to output new log data as it becomes available.
            tail: Only show the last NUM lines of log output.
            raw: Show raw log contents (don't pretty-print logs).
        """
        served_models = model_deployer.find_model_server(
            service_uuid=uuid.UUID(served_model_uuid)
        )
        if not served_models:
            warning(
                f"No model with uuid: '{served_model_uuid}' could be found."
            )
            return

        model_logs = model_deployer.get_model_server_logs(
            served_models[0].uuid, follow=follow, tail=tail
        )
        if model_logs:
            for line in model_logs:
                # don't pretty-print log lines that are already pretty-printed
                if raw or line.startswith("\x1b["):
                    console.print(line, markup=False)
                else:
                    try:
                        console.print(line)
                    except MarkupError:
                        console.print(line, markup=False)

register_model_registry_subcommands()

Registers CLI subcommands for the Model Registry.
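
With a model registry in the active stack, these commands are available under zenml model-registry models. For example, registering a model with a description and a metadata tag, then fetching a specific version (all values are placeholders):

zenml model-registry models register <model_name> -d "<description>" -t <key> <value>
zenml model-registry models get-version <model_name> -v <version>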

Source code in src/zenml/cli/model_registry.py
def register_model_registry_subcommands() -> None:  # noqa: C901
    """Registers CLI subcommands for the Model Registry."""
    model_registry_group = cast(TagGroup, cli.commands.get("model-registry"))
    if not model_registry_group:
        return

    @model_registry_group.group(
        cls=TagGroup,
        help="Commands for interacting with registered models.",
    )
    @click.pass_context
    def models(ctx: click.Context) -> None:
        """List and manage models with the active model registry.

        Args:
            ctx: The click context.
        """
        from zenml.client import Client
        from zenml.stack.stack_component import StackComponent

        client = Client()
        model_registry_models = client.active_stack_model.components.get(
            StackComponentType.MODEL_REGISTRY
        )
        if model_registry_models is None:
            cli_utils.error(
                "No active model registry found. Please add a model_registry "
                "to your stack."
            )
            return
        ctx.obj = StackComponent.from_model(model_registry_models[0])

    @models.command(
        "list",
        help="Get a list of all registered models within the model registry.",
    )
    @click.option(
        "--metadata",
        "-m",
        type=(str, str),
        default=None,
        help="Filter models by metadata. can be used like: -m key1 value1 -m key2 value",
        multiple=True,
    )
    @click.pass_obj
    def list_registered_models(
        model_registry: "BaseModelRegistry",
        metadata: Optional[Dict[str, str]],
    ) -> None:
        """List of all registered models within the model registry.

        The list can be filtered by metadata (tags) using the --metadata flag.
        Example: zenml model-registry models list -m key1 value1 -m key2 value2

        Args:
            model_registry: The model registry stack component.
            metadata: Filter models by Metadata (Tags).
        """
        metadata = dict(metadata) if metadata else None
        registered_models = model_registry.list_models(metadata=metadata)
        # Print registered models if any
        if registered_models:
            cli_utils.pretty_print_registered_model_table(registered_models)
        else:
            cli_utils.declare("No models found.")

    @models.command(
        "register",
        help="Register a model with the active model registry.",
    )
    @click.argument(
        "name",
        type=click.STRING,
        required=True,
    )
    @click.option(
        "--description",
        "-d",
        type=str,
        default=None,
        help="Description of the model to register.",
    )
    @click.option(
        "--metadata",
        "-t",
        type=(str, str),
        default=None,
        help="Metadata or Tags to add to the model. Can be used like: -t key1 value1 -t key2 value2",
        multiple=True,
    )
    @click.pass_obj
    def register_model(
        model_registry: "BaseModelRegistry",
        name: str,
        description: Optional[str],
        metadata: Optional[Dict[str, str]],
    ) -> None:
        """Register a model with the active model registry.

        Args:
            model_registry: The model registry stack component.
            name: Name of the model to register.
            description: Description of the model to register.
            metadata: Metadata or Tags to add to the registered model.
        """
        try:
            model_registry.get_model(name)
            cli_utils.error(f"Model with name {name} already exists.")
        except KeyError:
            pass
        metadata = dict(metadata) if metadata else None
        model_registry.register_model(
            name=name,
            description=description,
            metadata=metadata,
        )
        cli_utils.declare(f"Model {name} registered successfully.")

    @models.command(
        "delete",
        help="Delete a model from the active model registry.",
    )
    @click.argument(
        "name",
        type=click.STRING,
        required=True,
    )
    @click.option(
        "--yes",
        "-y",
        is_flag=True,
        help="Don't ask for confirmation.",
    )
    @click.pass_obj
    def delete_model(
        model_registry: "BaseModelRegistry",
        name: str,
        yes: bool = False,
    ) -> None:
        """Delete a model from the active model registry.

        Args:
            model_registry: The model registry stack component.
            name: Name of the model to delete.
            yes: If set, don't ask for confirmation.
        """
        try:
            model_registry.get_model(name)
        except KeyError:
            cli_utils.error(f"Model with name {name} does not exist.")
            return
        if not yes:
            confirmation = cli_utils.confirmation(
                f"Found Model with name {name}. Do you want to delete it?"
            )
            if not confirmation:
                cli_utils.declare("Model deletion canceled.")
                return
        model_registry.delete_model(name)
        cli_utils.declare(f"Model {name} deleted successfully.")

    @models.command(
        "update",
        help="Update a model in the active model registry.",
    )
    @click.argument(
        "name",
        type=click.STRING,
        required=True,
    )
    @click.option(
        "--description",
        "-d",
        type=str,
        default=None,
        help="Description of the model to update.",
    )
    @click.option(
        "--metadata",
        "-t",
        type=(str, str),
        default=None,
        help="Metadata or Tags to add to the model. Can be used like: -t key1 value1 -t key2 value2",
        multiple=True,
    )
    @click.pass_obj
    def update_model(
        model_registry: "BaseModelRegistry",
        name: str,
        description: Optional[str],
        metadata: Optional[Dict[str, str]],
    ) -> None:
        """Update a model in the active model registry.

        Args:
            model_registry: The model registry stack component.
            name: Name of the model to update.
            description: Description of the model to update.
            metadata: Metadata or Tags to add to the model.
        """
        try:
            model_registry.get_model(name)
        except KeyError:
            cli_utils.error(f"Model with name {name} does not exist.")
            return
        metadata = dict(metadata) if metadata else None
        model_registry.update_model(
            name=name,
            description=description,
            metadata=metadata,
        )
        cli_utils.declare(f"Model {name} updated successfully.")

    @models.command(
        "get",
        help="Get a model from the active model registry.",
    )
    @click.argument(
        "name",
        type=click.STRING,
        required=True,
    )
    @click.pass_obj
    def get_model(
        model_registry: "BaseModelRegistry",
        name: str,
    ) -> None:
        """Get a model from the active model registry.

        Args:
            model_registry: The model registry stack component.
            name: Name of the model to get.
        """
        try:
            model = model_registry.get_model(name)
        except KeyError:
            cli_utils.error(f"Model with name {name} does not exist.")
            return
        cli_utils.pretty_print_registered_model_table([model])

    @models.command(
        "get-version",
        help="Get a model version from the active model registry.",
    )
    @click.argument(
        "name",
        type=click.STRING,
        required=True,
    )
    @click.option(
        "--version",
        "-v",
        type=str,
        default=None,
        help="Version of the model to get.",
        required=True,
    )
    @click.pass_obj
    def get_model_version(
        model_registry: "BaseModelRegistry",
        name: str,
        version: str,
    ) -> None:
        """Get a model version from the active model registry.

        Args:
            model_registry: The model registry stack component.
            name: Name of the model to get.
            version: Version of the model to get.
        """
        try:
            model_version = model_registry.get_model_version(name, version)
        except KeyError:
            cli_utils.error(
                f"Model with name {name} and version {version} does not exist."
            )
            return
        cli_utils.pretty_print_model_version_details(model_version)

    @models.command(
        "delete-version",
        help="Delete a model version from the active model registry.",
    )
    @click.argument(
        "name",
        type=click.STRING,
        required=True,
    )
    @click.option(
        "--version",
        "-v",
        type=str,
        default=None,
        help="Version of the model to delete.",
        required=True,
    )
    @click.option(
        "--yes",
        "-y",
        is_flag=True,
        help="Don't ask for confirmation.",
    )
    @click.pass_obj
    def delete_model_version(
        model_registry: "BaseModelRegistry",
        name: str,
        version: str,
        yes: bool = False,
    ) -> None:
        """Delete a model version from the active model registry.

        Args:
            model_registry: The model registry stack component.
            name: Name of the model to delete.
            version: Version of the model to delete.
            yes: If set, don't ask for confirmation.
        """
        try:
            model_registry.get_model_version(name, version)
        except KeyError:
            cli_utils.error(
                f"Model with name {name} and version {version} does not exist."
            )
            return
        if not yes:
            confirmation = cli_utils.confirmation(
                f"Found Model with the name `{name}` and the version `{version}`."
                f"Do you want to delete it?"
            )
            if not confirmation:
                cli_utils.declare("Model version deletion canceled.")
                return
        model_registry.delete_model_version(name, version)
        cli_utils.declare(
            f"Model {name} version {version} deleted successfully."
        )

    @models.command(
        "update-version",
        help="Update a model version in the active model registry.",
    )
    @click.argument(
        "name",
        type=click.STRING,
        required=True,
    )
    @click.option(
        "--version",
        "-v",
        type=str,
        default=None,
        help="Version of the model to update.",
        required=True,
    )
    @click.option(
        "--description",
        "-d",
        type=str,
        default=None,
        help="Description of the model to update.",
    )
    @click.option(
        "--metadata",
        "-m",
        type=(str, str),
        default=None,
        help="Metadata or Tags to add to the model. can be used like: --m key1 value1 -m key2 value",
        multiple=True,
    )
    @click.option(
        "--stage",
        "-s",
        type=click.Choice(["None", "Staging", "Production", "Archived"]),
        default=None,
        help="Stage of the model to update.",
    )
    @click.option(
        "--remove_metadata",
        "-rm",
        default=None,
        help="Metadata or Tags to remove from the model. Can be used like: -rm key1 -rm key2",
        multiple=True,
    )
    @click.pass_obj
    def update_model_version(
        model_registry: "BaseModelRegistry",
        name: str,
        version: str,
        description: Optional[str],
        metadata: Optional[Dict[str, str]],
        stage: Optional[str],
        remove_metadata: Optional[List[str]],
    ) -> None:
        """Update a model version in the active model registry.

        Args:
            model_registry: The model registry stack component.
            name: Name of the model to update.
            version: Version of the model to update.
            description: Description of the model to update.
            metadata: Metadata to add to the model version.
            stage: Stage of the model to update.
            remove_metadata: Metadata to remove from the model version.
        """
        try:
            model_registry.get_model_version(name, version)
        except KeyError:
            cli_utils.error(
                f"Model with name {name} and version {version} does not exist."
            )
            return
        metadata = dict(metadata) if metadata else {}
        remove_metadata = list(remove_metadata) if remove_metadata else []
        updated_version = model_registry.update_model_version(
            name=name,
            version=version,
            description=description,
            metadata=ModelRegistryModelMetadata(**metadata),
            stage=ModelVersionStage(stage) if stage else None,
            remove_metadata=remove_metadata,
        )
        cli_utils.declare(
            f"Model {name} version {version} updated successfully."
        )
        cli_utils.pretty_print_model_version_details(updated_version)

    @models.command(
        "list-versions",
        help="List all model versions in the active model registry.",
    )
    @click.argument(
        "name",
        type=click.STRING,
        required=True,
    )
    @click.option(
        "--model-uri",
        "-m",
        type=str,
        default=None,
        help="Model URI of the model to list versions for.",
    )
    @click.option(
        "--metadata",
        "-m",
        type=(str, str),
        default=None,
        help="Metadata or Tags to filter the model versions by. Can be used like: -m key1 value1 -m key2 value",
        multiple=True,
    )
    @click.option(
        "--count",
        "-c",
        type=int,
        help="Number of model versions to list.",
    )
    @click.option(
        "--order-by-date",
        type=click.Choice(["asc", "desc"]),
        default="desc",
        help="Order by date.",
    )
    @click.option(
        "--created-after",
        type=click.DateTime(formats=["%Y-%m-%d"]),
        default=None,
        help="List model versions created after this date.",
    )
    @click.option(
        "--created-before",
        type=click.DateTime(formats=["%Y-%m-%d"]),
        default=None,
        help="List model versions created before this date.",
    )
    @click.pass_obj
    def list_model_versions(
        model_registry: "BaseModelRegistry",
        name: str,
        model_uri: Optional[str],
        count: Optional[int],
        metadata: Optional[Dict[str, str]],
        order_by_date: str,
        created_after: Optional[datetime],
        created_before: Optional[datetime],
    ) -> None:
        """List all model versions in the active model registry.

        Args:
            model_registry: The model registry stack component.
            name: Name of the model to list versions for.
            model_uri: Model URI of the model to list versions for.
            metadata: Metadata or Tags to filter the model versions by.
            count: Number of model versions to list.
            order_by_date: Order by date.
            created_after: List model versions created after this date.
            created_before: List model versions created before this date.
        """
        metadata = dict(metadata) if metadata else {}
        model_versions = model_registry.list_model_versions(
            name=name,
            model_source_uri=model_uri,
            metadata=ModelRegistryModelMetadata(**metadata),
            count=count,
            order_by_date=order_by_date,
            created_after=created_after,
            created_before=created_before,
        )
        if not model_versions:
            cli_utils.declare("No model versions found.")
            return
        cli_utils.pretty_print_model_version_table(model_versions)

    @models.command(
        "register-version",
        help="Register a model version in the active model registry.",
    )
    @click.argument(
        "name",
        type=click.STRING,
        required=True,
    )
    @click.option(
        "--description",
        "-d",
        type=str,
        default=None,
        help="Description of the model version.",
    )
    @click.option(
        "--metadata",
        "-m",
        type=(str, str),
        default=None,
        help="Metadata or Tags to add to the model version. Can be used like: -m key1 value1 -m key2 value",
        multiple=True,
    )
    @click.option(
        "--version",
        "-v",
        type=str,
        required=True,
        default=None,
        help="Version of the model to register.",
    )
    @click.option(
        "--model-uri",
        "-u",
        type=str,
        default=None,
        help="Model URI of the model to register.",
        required=True,
    )
    @click.option(
        "--zenml-version",
        type=str,
        default=None,
        help="ZenML version of the model to register.",
    )
    @click.option(
        "--zenml-run-name",
        type=str,
        default=None,
        help="ZenML run name of the model to register.",
    )
    @click.option(
        "--zenml-pipeline-run-id",
        type=str,
        default=None,
        help="ZenML pipeline run ID of the model to register.",
    )
    @click.option(
        "--zenml-pipeline-name",
        type=str,
        default=None,
        help="ZenML pipeline name of the model to register.",
    )
    @click.option(
        "--zenml-step-name",
        type=str,
        default=None,
        help="ZenML step name of the model to register.",
    )
    @click.pass_obj
    def register_model_version(
        model_registry: "BaseModelRegistry",
        name: str,
        version: str,
        model_uri: str,
        description: Optional[str],
        metadata: Optional[Dict[str, str]],
        zenml_version: Optional[str],
        zenml_run_name: Optional[str],
        zenml_pipeline_name: Optional[str],
        zenml_step_name: Optional[str],
    ) -> None:
        """Register a model version in the active model registry.

        Args:
            model_registry: The model registry stack component.
            name: Name of the model to register.
            version: Version of the model to register.
            model_uri: Model URI of the model to register.
            description: Description of the model to register.
            metadata: Model version metadata.
            zenml_version: ZenML version of the model to register.
            zenml_run_name: ZenML pipeline run name of the model to register.
            zenml_pipeline_name: ZenML pipeline name of the model to register.
            zenml_step_name: ZenML step name of the model to register.
        """
        # Parse metadata
        metadata = dict(metadata) if metadata else {}
        registered_metadata = ModelRegistryModelMetadata(**dict(metadata))
        registered_metadata.zenml_version = zenml_version
        registered_metadata.zenml_run_name = zenml_run_name
        registered_metadata.zenml_pipeline_name = zenml_pipeline_name
        registered_metadata.zenml_step_name = zenml_step_name
        model_version = model_registry.register_model_version(
            name=name,
            version=version,
            model_source_uri=model_uri,
            description=description,
            metadata=registered_metadata,
        )
        cli_utils.declare(
            f"Model {name} version {version} registered successfully."
        )
        cli_utils.pretty_print_model_version_details(model_version)
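
The commands above operate on the model registry component of the active stack. For orientation, a few illustrative invocations (a sketch assuming the group is exposed as `zenml model-registry models`, as in current ZenML releases; adjust the prefix if your installation differs, and note that `my-model` is a placeholder name):

   zenml model-registry models register my-model
   zenml model-registry models get-version my-model -v 1
   zenml model-registry models list-versions my-model -c 5 --order-by-date desc
   zenml model-registry models update-version my-model -v 1 -s Production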

register_pipeline(source, parameters_path=None)

Register a pipeline.

Parameters:

  • source (str, required): Importable source resolving to a pipeline instance.
  • parameters_path (Optional[str], default None): Path to pipeline parameters file.
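
A usage sketch of the corresponding CLI command (the module path and parameter values are placeholders, not part of ZenML):

   zenml pipeline register my_module.my_pipeline_instance --parameters params.json

where `params.json` is a JSON file with keyword arguments for the pipeline function, e.g. `{"epochs": 10, "learning_rate": 0.001}`. Run `zenml init` at your source root first so that `my_module` resolves relative to the repository root.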
Source code in src/zenml/cli/pipeline.py
@pipeline.command(
    "register",
    help="Register a pipeline instance. The SOURCE argument needs to be an "
    "importable source path resolving to a ZenML pipeline instance, e.g. "
    "`my_module.my_pipeline_instance`.",
)
@click.argument("source")
@click.option(
    "--parameters",
    "-p",
    "parameters_path",
    type=click.Path(exists=True, dir_okay=False),
    required=False,
    help="Path to JSON file containing parameters for the pipeline function.",
)
def register_pipeline(
    source: str, parameters_path: Optional[str] = None
) -> None:
    """Register a pipeline.

    Args:
        source: Importable source resolving to a pipeline instance.
        parameters_path: Path to pipeline parameters file.
    """
    if "." not in source:
        cli_utils.error(
            f"The given source path `{source}` is invalid. Make sure it looks "
            "like `some.module.name_of_pipeline_instance_variable` and "
            "resolves to a pipeline object."
        )

    if not Client().root:
        cli_utils.warning(
            "You're running the `zenml pipeline register` command without a "
            "ZenML repository. Your current working directory will be used "
            "as the source root relative to which the `source` argument is "
            "expected. To silence this warning, run `zenml init` at your "
            "source code root."
        )

    pipeline_instance = _import_pipeline(source=source)

    parameters: Dict[str, Any] = {}
    if parameters_path:
        with open(parameters_path, "r") as f:
            parameters = json.load(f)

    try:
        pipeline_instance.prepare(**parameters)
    except ValueError:
        cli_utils.error(
            "Pipeline preparation failed. This is most likely due to your "
            "pipeline entrypoint function requiring arguments that were not "
            "provided. Please provide a JSON file with the parameters for "
            f"your pipeline like this: `zenml pipeline register {source} "
            "--parameters=<PATH_TO_JSON>`."
        )

    pipeline_instance.register()

register_secrets(skip_existing, stack_name_or_id=None)

Interactively registers all required secrets for a stack.

Parameters:

  • skip_existing (bool, required): If True, skip asking for secret values that already exist.
  • stack_name_or_id (Optional[str], default None): Name of the stack for which to register secrets. If empty, the active stack will be used.
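
For example, to be prompted only for secret values that are still missing on a stack named `prod` (a placeholder name):

   zenml stack register-secrets prod --skip-existing

Omitting the stack name targets the active stack.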
Source code in src/zenml/cli/stack.py
@stack.command(
    "register-secrets",
    help="Interactively register all required secrets for a stack.",
)
@click.argument("stack_name_or_id", type=str, required=False)
@click.option(
    "--skip-existing",
    "skip_existing",
    is_flag=True,
    default=False,
    help="Skip secrets with existing values.",
    type=bool,
)
def register_secrets(
    skip_existing: bool,
    stack_name_or_id: Optional[str] = None,
) -> None:
    """Interactively registers all required secrets for a stack.

    Args:
        skip_existing: If `True`, skip asking for secret values that already
            exist.
        stack_name_or_id: Name of the stack for which to register secrets.
                          If empty, the active stack will be used.
    """
    from zenml.stack.stack import Stack

    client = Client()

    stack_model = client.get_stack(name_id_or_prefix=stack_name_or_id)

    stack_ = Stack.from_model(stack_model)
    required_secrets = stack_.required_secrets

    if not required_secrets:
        cli_utils.declare("No secrets required for this stack.")
        return

    secret_names = {s.name for s in required_secrets}

    secrets_to_register = []
    secrets_to_update = []
    for name in secret_names:
        try:
            secret_content = client.get_secret(name).secret_values.copy()
            secret_exists = True
        except KeyError:
            secret_content = {}
            secret_exists = False

        required_keys = {s.key for s in required_secrets if s.name == name}
        needs_update = False

        for key in required_keys:
            existing_value = secret_content.get(key, None)

            if existing_value:
                if skip_existing:
                    continue

                value = getpass.getpass(
                    f"Value for secret `{name}.{key}` "
                    "(Leave empty to use existing value):"
                )
                if value:
                    value = cli_utils.expand_argument_value_from_file(
                        name=key, value=value
                    )
                else:
                    value = existing_value

                # only need to update if the value changed
                needs_update = needs_update or value != existing_value
            else:
                value = None
                while not value:
                    value = getpass.getpass(
                        f"Value for secret `{name}.{key}`:"
                    )
                value = cli_utils.expand_argument_value_from_file(
                    name=key, value=value
                )
                needs_update = True

            secret_content[key] = value

        if not secret_exists:
            secrets_to_register.append(
                (
                    name,
                    secret_content,
                )
            )
        elif needs_update:
            secrets_to_update.append(
                (
                    name,
                    secret_content,
                )
            )

    for secret_name, secret_values in secrets_to_register:
        cli_utils.declare(f"Registering secret `{secret_name}`:")
        cli_utils.pretty_print_secret(secret_values, hide_secret=True)
        client.create_secret(secret_name, values=secret_values)
    for secret_name, secret_values in secrets_to_update:
        cli_utils.declare(f"Updating secret `{secret_name}`:")
        cli_utils.pretty_print_secret(secret_values, hide_secret=True)
        client.update_secret(secret_name, add_or_update_values=secret_values)

register_service_connector(name, args, description=None, connector_type=None, resource_type=None, resource_id=None, auth_method=None, expires_at=None, expires_skew_tolerance=None, expiration_seconds=None, no_verify=False, labels=None, interactive=False, no_docs=False, show_secrets=False, auto_configure=False)

Registers a service connector.

Parameters:

  • name (Optional[str], required): The name to use for the service connector.
  • args (List[str], required): Configuration arguments for the service connector.
  • description (Optional[str], default None): Short description for the service connector.
  • connector_type (Optional[str], default None): The service connector type.
  • resource_type (Optional[str], default None): The type of resource to connect to.
  • resource_id (Optional[str], default None): The ID of the resource to connect to.
  • auth_method (Optional[str], default None): The authentication method to use.
  • expires_at (Optional[datetime], default None): The exact UTC date and time when the credentials configured for this connector will expire.
  • expires_skew_tolerance (Optional[int], default None): The tolerance, in seconds, allowed when determining when the credentials configured for or generated by this connector will expire.
  • expiration_seconds (Optional[int], default None): The duration, in seconds, that the temporary credentials generated by this connector should remain valid.
  • no_verify (bool, default False): Do not verify the service connector before registering.
  • labels (Optional[List[str]], default None): Labels to be associated with the service connector.
  • interactive (bool, default False): Register a new service connector interactively.
  • no_docs (bool, default False): Don't show documentation details during the interactive configuration.
  • show_secrets (bool, default False): Show security sensitive configuration attributes in the terminal.
  • auto_configure (bool, default False): Auto configure the service connector.

Raises:

  • TypeError: If the connector_model does not have the correct type.
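
As a small addition to the examples embedded in the command help below, labels and verification can be controlled independently. A hypothetical non-interactive registration that attaches labels and skips verification (the connector name and credentials are placeholders):

   zenml service-connector register dockerhub-dev --type docker \
       --username=me --password=secret -l env=dev -l team=ml --no-verify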

Source code in src/zenml/cli/service_connectors.py
@service_connector.command(
    "register",
    context_settings={"ignore_unknown_options": True},
    help="""Configure, validate and register a service connector.

This command can be used to configure and register a ZenML service connector and
to optionally verify that the service connector configuration and credentials
are valid and can be used to access the specified resource(s).

If the `-i|--interactive` flag is set, it will prompt the user for all the
information required to configure a service connector in a wizard-like fashion:

    $ zenml service-connector register -i

To trim down the amount of information displayed in interactive mode, pass the
`-n|--no-docs` flag:

    $ zenml service-connector register -ni

Secret configuration attributes are not shown by default. Use the
`-x|--show-secrets` flag to show them:

    $ zenml service-connector register -ix

Non-interactive examples:

- register a multi-purpose AWS service connector capable of accessing
any of the resource types that it supports (e.g. S3 buckets, EKS Kubernetes
clusters) using auto-configured credentials (i.e. extracted from the environment
variables or AWS CLI configuration files):

    $ zenml service-connector register aws-auto-multi --description \\
"Multi-purpose AWS connector" --type aws --auto-configure \\
--label auto=true --label purpose=multi

- register a Docker service connector providing access to a single DockerHub
repository named `dockerhub-hyppo` using explicit credentials:

    $ zenml service-connector register dockerhub-hyppo --description \\
"Hyppo's DockerHub repo" --type docker --resource-id dockerhub-hyppo \\
--username=hyppo --password=mypassword

- register an AWS service connector providing access to all the S3 buckets
that it's authorized to access using IAM role credentials:

    $ zenml service-connector register aws-s3-multi --description \\   
"Multi-bucket S3 connector" --type aws --resource-type s3-bucket \\    
--auth_method iam-role --role_arn=arn:aws:iam::<account>:role/<role> \\
--aws_region=us-east-1 --aws-access-key-id=<aws-key-id> \\            
--aws_secret_access_key=<aws-secret-key> --expiration-seconds 3600

All registered service connectors are validated before being registered. To
skip validation, pass the `--no-verify` flag.
""",
)
@click.argument(
    "name",
    type=str,
    required=False,
)
@click.option(
    "--description",
    "description",
    help="Short description for the connector instance.",
    required=False,
    type=str,
)
@click.option(
    "--type",
    "-t",
    "connector_type",
    help="The service connector type.",
    required=False,
    type=str,
)
@click.option(
    "--resource-type",
    "-r",
    "resource_type",
    help="The type of resource to connect to.",
    required=False,
    type=str,
)
@click.option(
    "--resource-id",
    "-ri",
    "resource_id",
    help="The ID of the resource to connect to.",
    required=False,
    type=str,
)
@click.option(
    "--auth-method",
    "-a",
    "auth_method",
    help="The authentication method to use.",
    required=False,
    type=str,
)
@click.option(
    "--expires-at",
    "expires_at",
    help="The exact UTC date and time when the credentials configured for this "
    "connector will expire. Takes the form 'YYYY-MM-DD HH:MM:SS'. This is only "
    "required if you are configuring a service connector with expiring "
    "credentials.",
    required=False,
    type=click.DateTime(),
)
@click.option(
    "--expires-skew-tolerance",
    "expires_skew_tolerance",
    help="The tolerance, in seconds, allowed when determining when the "
    "credentials configured for or generated by this connector will expire.",
    required=False,
    type=int,
)
@click.option(
    "--expiration-seconds",
    "expiration_seconds",
    help="The duration, in seconds, that the temporary credentials "
    "generated by this connector should remain valid.",
    required=False,
    type=int,
)
@click.option(
    "--label",
    "-l",
    "labels",
    help="Labels to be associated with the service connector. Takes the form "
    "-l key1=value1 and can be used multiple times.",
    multiple=True,
)
@click.option(
    "--no-verify",
    "no_verify",
    is_flag=True,
    default=False,
    help="Do not verify the service connector before registering.",
    type=click.BOOL,
)
@click.option(
    "--interactive",
    "-i",
    "interactive",
    is_flag=True,
    default=False,
    help="Register a new service connector interactively.",
    type=click.BOOL,
)
@click.option(
    "--no-docs",
    "-n",
    "no_docs",
    is_flag=True,
    default=False,
    help="Don't show documentation details during the interactive "
    "configuration.",
    type=click.BOOL,
)
@click.option(
    "--show-secrets",
    "-x",
    "show_secrets",
    is_flag=True,
    default=False,
    help="Show security sensitive configuration attributes in the terminal.",
    type=click.BOOL,
)
@click.option(
    "--auto-configure",
    "auto_configure",
    is_flag=True,
    default=False,
    help="Auto configure the service connector.",
    type=click.BOOL,
)
@click.argument("args", nargs=-1, type=click.UNPROCESSED)
def register_service_connector(
    name: Optional[str],
    args: List[str],
    description: Optional[str] = None,
    connector_type: Optional[str] = None,
    resource_type: Optional[str] = None,
    resource_id: Optional[str] = None,
    auth_method: Optional[str] = None,
    expires_at: Optional[datetime] = None,
    expires_skew_tolerance: Optional[int] = None,
    expiration_seconds: Optional[int] = None,
    no_verify: bool = False,
    labels: Optional[List[str]] = None,
    interactive: bool = False,
    no_docs: bool = False,
    show_secrets: bool = False,
    auto_configure: bool = False,
) -> None:
    """Registers a service connector.

    Args:
        name: The name to use for the service connector.
        args: Configuration arguments for the service connector.
        description: Short description for the service connector.
        connector_type: The service connector type.
        resource_type: The type of resource to connect to.
        resource_id: The ID of the resource to connect to.
        auth_method: The authentication method to use.
        expires_at: The exact UTC date and time when the credentials configured
            for this connector will expire.
        expires_skew_tolerance: The tolerance, in seconds, allowed when
            determining when the credentials configured for or generated by
            this connector will expire.
        expiration_seconds: The duration, in seconds, that the temporary
            credentials generated by this connector should remain valid.
        no_verify: Do not verify the service connector before
            registering.
        labels: Labels to be associated with the service connector.
        interactive: Register a new service connector interactively.
        no_docs: Don't show documentation details during the interactive
            configuration.
        show_secrets: Show security sensitive configuration attributes in
            the terminal.
        auto_configure: Auto configure the service connector.

    Raises:
        TypeError: If the connector_model does not have the correct type.
    """
    from rich.markdown import Markdown

    client = Client()

    # Parse the given args
    name, parsed_args = cli_utils.parse_name_and_extra_arguments(
        list(args) + [name or ""],
        expand_args=True,
        name_mandatory=not interactive,
    )

    # Parse the given labels
    parsed_labels = cast(Dict[str, str], cli_utils.get_parsed_labels(labels))

    if interactive:
        # Get the list of available service connector types
        connector_types = client.list_service_connector_types(
            connector_type=connector_type,
            resource_type=resource_type,
            auth_method=auth_method,
        )
        if not connector_types:
            cli_utils.error(
                "No service connectors found with the given parameters: "
                + f"type={connector_type} "
                if connector_type
                else "" + f"resource_type={resource_type} "
                if resource_type
                else "" + f"auth_method={auth_method} "
                if auth_method
                else "",
            )

        # Ask for a connector name
        name = prompt_connector_name(name)

        # Ask for a description
        description = click.prompt(
            "Please enter a description for the service connector",
            type=str,
            default="",
        )

        available_types = {c.connector_type: c for c in connector_types}
        if len(available_types) == 1:
            # Default to the first connector type if not supplied and if
            # only one type is available
            connector_type = connector_type or list(available_types.keys())[0]

        # Print the name, type and description of all available service
        # connectors
        if not no_docs:
            message = "# Available service connector types\n"
            for spec in connector_types:
                message += cli_utils.print_service_connector_type(
                    connector_type=spec,
                    heading="##",
                    footer="",
                    include_auth_methods=False,
                    include_resource_types=False,
                    print=False,
                )
            console.print(Markdown(f"{message}---"), justify="left", width=80)

        # Ask the user to select a service connector type
        connector_type = click.prompt(
            "Please select a service connector type",
            type=click.Choice(list(available_types.keys())),
            default=connector_type,
        )

        assert connector_type is not None
        connector_type_spec = available_types[connector_type]

        available_resource_types = [
            t.resource_type for t in connector_type_spec.resource_types
        ]

        if not no_docs:
            # Print the name, resource type identifiers and description of all
            # available resource types
            message = "# Available resource types\n"
            for r in connector_type_spec.resource_types:
                message += cli_utils.print_service_connector_resource_type(
                    resource_type=r,
                    heading="##",
                    footer="",
                    print=False,
                )
            console.print(Markdown(f"{message}---"), justify="left", width=80)

        # Ask the user to select a resource type
        resource_type = prompt_resource_type(
            available_resource_types=available_resource_types
        )

        # Ask the user whether to use autoconfiguration, if the connector
        # implementation is locally available and if autoconfiguration is
        # supported
        if (
            connector_type_spec.supports_auto_configuration
            and connector_type_spec.local
        ):
            auto_configure = click.confirm(
                "Would you like to attempt auto-configuration to extract the "
                "authentication configuration from your local environment ?",
                default=False,
            )
        else:
            auto_configure = False

        connector_model: Optional[
            Union[ServiceConnectorRequest, ServiceConnectorResponse]
        ] = None
        connector_resources: Optional[ServiceConnectorResourcesModel] = None
        if auto_configure:
            # Try to autoconfigure the service connector
            try:
                with console.status("Auto-configuring service connector...\n"):
                    (
                        connector_model,
                        connector_resources,
                    ) = client.create_service_connector(
                        name=name,
                        description=description or "",
                        connector_type=connector_type,
                        resource_type=resource_type,
                        auth_method=auth_method,
                        expires_skew_tolerance=expires_skew_tolerance,
                        auto_configure=True,
                        verify=True,
                        register=False,
                    )

                assert connector_model is not None
                assert connector_resources is not None
            except (
                KeyError,
                ValueError,
                IllegalOperationError,
                NotImplementedError,
                AuthorizationException,
            ) as e:
                cli_utils.warning(
                    f"Auto-configuration was not successful: {e} "
                )
                # Ask the user whether to continue with manual configuration
                manual = click.confirm(
                    "Would you like to continue with manual configuration ?",
                    default=True,
                )
                if not manual:
                    return
            else:
                auth_method = connector_model.auth_method
                expiration_seconds = connector_model.expiration_seconds
                expires_at = connector_model.expires_at
                cli_utils.declare(
                    "Service connector auto-configured successfully with the "
                    "following configuration:"
                )

                # Print the configuration detected by the autoconfiguration
                # process
                # TODO: Normally, this could have been handled with setter
                #   functions over the connector type property in the response
                #   model. However, pydantic breaks property setter functions.
                #   We can find a more elegant solution here.
                if isinstance(connector_model, ServiceConnectorResponse):
                    connector_model.set_connector_type(connector_type_spec)
                elif isinstance(connector_model, ServiceConnectorRequest):
                    connector_model.connector_type = connector_type_spec
                else:
                    raise TypeError(
                        "The service connector must be an instance of either"
                        "`ServiceConnectorResponse` or "
                        "`ServiceConnectorRequest`."
                    )

                cli_utils.print_service_connector_configuration(
                    connector_model,
                    active_status=False,
                    show_secrets=show_secrets,
                )
                cli_utils.declare(
                    "The service connector configuration has access to the "
                    "following resources:"
                )
                cli_utils.print_service_connector_resource_table(
                    [connector_resources],
                    show_resources_only=True,
                )

                # Ask the user whether to continue with the autoconfiguration
                choice = click.prompt(
                    "Would you like to continue with the auto-discovered "
                    "configuration or switch to manual ?",
                    type=click.Choice(["auto", "manual"]),
                    default="auto",
                )
                if choice == "manual":
                    # Reset the connector configuration to default to let the
                    # manual configuration kick in the next step
                    connector_model = None
                    connector_resources = None
                    expires_at = None

        if connector_model is not None and connector_resources is not None:
            assert auth_method is not None
            auth_method_spec = connector_type_spec.auth_method_dict[
                auth_method
            ]
        else:
            # In this branch, we are either not using autoconfiguration or the
            # autoconfiguration failed or was dismissed. In all cases, we need
            # to ask the user for the authentication method to use and then
            # prompt for the configuration

            auth_methods = list(connector_type_spec.auth_method_dict.keys())

            if not no_docs:
                # Print the name, identifier and description of all available
                # auth methods
                message = "# Available authentication methods\n"
                for a in auth_methods:
                    message += cli_utils.print_service_connector_auth_method(
                        auth_method=connector_type_spec.auth_method_dict[a],
                        heading="##",
                        footer="",
                        print=False,
                    )
                console.print(
                    Markdown(f"{message}---"), justify="left", width=80
                )

            if len(auth_methods) == 1:
                # Default to the first auth method if only one method is
                # available
                confirm = click.confirm(
                    "Only one authentication method is available for this "
                    f"connector ({auth_methods[0]}). Would you like to use it?",
                    default=True,
                )
                if not confirm:
                    return

                auth_method = auth_methods[0]
            else:
                # Ask the user to select an authentication method
                auth_method = click.prompt(
                    "Please select an authentication method",
                    type=click.Choice(auth_methods),
                    default=auth_method,
                )

            assert auth_method is not None
            auth_method_spec = connector_type_spec.auth_method_dict[
                auth_method
            ]

            cli_utils.declare(
                f"Please enter the configuration for the {auth_method_spec.name} "
                "authentication method."
            )

            # Prompt for the configuration of the selected authentication method
            # field by field
            config_schema = auth_method_spec.config_schema or {}
            config_dict = cli_utils.prompt_configuration(
                config_schema=config_schema,
                show_secrets=show_secrets,
            )

            # Prompt for an expiration time if the auth method supports it
            if auth_method_spec.supports_temporary_credentials():
                expiration_seconds = prompt_expiration_time(
                    min=auth_method_spec.min_expiration_seconds,
                    max=auth_method_spec.max_expiration_seconds,
                    default=auth_method_spec.default_expiration_seconds,
                )

            # Prompt for the time when the credentials will expire
            expires_at = prompt_expires_at(expires_at)

            try:
                # Validate the connector configuration and fetch all available
                # resources that are accessible with the provided configuration
                # in the process
                with console.status(
                    "Validating service connector configuration...\n"
                ):
                    (
                        connector_model,
                        connector_resources,
                    ) = client.create_service_connector(
                        name=name,
                        description=description or "",
                        connector_type=connector_type,
                        auth_method=auth_method,
                        resource_type=resource_type,
                        configuration=config_dict,
                        expires_at=expires_at,
                        expires_skew_tolerance=expires_skew_tolerance,
                        expiration_seconds=expiration_seconds,
                        auto_configure=False,
                        verify=True,
                        register=False,
                    )
                assert connector_model is not None
                assert connector_resources is not None
            except (
                KeyError,
                ValueError,
                IllegalOperationError,
                NotImplementedError,
                AuthorizationException,
            ) as e:
                cli_utils.error(f"Failed to configure service connector: {e}")

        if resource_type:
            # Finally, for connectors that are configured with a particular
            # resource type, prompt the user to select one of the available
            # resources that can be accessed with the connector. We don't
            # need to do this for resource types that don't support instances.
            resource_type_spec = connector_type_spec.resource_type_dict[
                resource_type
            ]
            if resource_type_spec.supports_instances:
                assert len(connector_resources.resources) == 1
                resource_ids = connector_resources.resources[0].resource_ids
                assert resource_ids is not None
                resource_id = prompt_resource_id(
                    resource_name=resource_type_spec.name,
                    resource_ids=resource_ids,
                )
            else:
                resource_id = None
        else:
            resource_id = None

        # Prepare the rest of the variables to fall through to the
        # non-interactive configuration case
        parsed_args = connector_model.configuration
        parsed_args.update(
            {
                k: s.get_secret_value()
                for k, s in connector_model.secrets.items()
                if s is not None
            }
        )
        auto_configure = False
        no_verify = False
        expiration_seconds = connector_model.expiration_seconds

    if not connector_type:
        cli_utils.error(
            "The connector type must be specified when using non-interactive "
            "configuration."
        )

    with console.status(f"Registering service connector '{name}'...\n"):
        try:
            # Create a new service connector
            assert name is not None
            (
                connector_model,
                connector_resources,
            ) = client.create_service_connector(
                name=name,
                connector_type=connector_type,
                auth_method=auth_method,
                resource_type=resource_type,
                configuration=parsed_args,
                resource_id=resource_id,
                description=description or "",
                expires_skew_tolerance=expires_skew_tolerance,
                expiration_seconds=expiration_seconds,
                expires_at=expires_at,
                labels=parsed_labels,
                verify=not no_verify,
                auto_configure=auto_configure,
                register=True,
            )
        except (
            KeyError,
            ValueError,
            IllegalOperationError,
            NotImplementedError,
            AuthorizationException,
        ) as e:
            cli_utils.error(f"Failed to register service connector: {e}")

    if connector_resources is not None:
        cli_utils.declare(
            f"Successfully registered service connector `{name}` with access "
            "to the following resources:"
        )

        cli_utils.print_service_connector_resource_table(
            [connector_resources],
            show_resources_only=True,
        )

    else:
        cli_utils.declare(
            f"Successfully registered service connector `{name}`."
        )

register_single_stack_component_cli_commands(component_type, parent_group)

Registers all basic stack component CLI commands.

Parameters:

  • component_type (StackComponentType, required): Type of the component to generate the command for.
  • parent_group (Group, required): The parent group to register the commands to.
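
Every stack component type registered through this function ends up with the same sub-command layout. Using the orchestrator group as an illustration:

   zenml orchestrator list
   zenml orchestrator describe
   zenml orchestrator flavor list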
Source code in src/zenml/cli/stack_components.py
def register_single_stack_component_cli_commands(
    component_type: StackComponentType, parent_group: click.Group
) -> None:
    """Registers all basic stack component CLI commands.

    Args:
        component_type: Type of the component to generate the command for.
        parent_group: The parent group to register the commands to.
    """
    command_name = component_type.value.replace("_", "-")
    singular_display_name = _component_display_name(component_type)
    plural_display_name = _component_display_name(component_type, plural=True)

    @parent_group.group(
        command_name,
        cls=TagGroup,
        help=f"Commands to interact with {plural_display_name}.",
        tag=CliCategories.STACK_COMPONENTS,
    )
    def command_group() -> None:
        """Group commands for a single stack component type."""

    # zenml stack-component get
    get_command = generate_stack_component_get_command(component_type)
    command_group.command(
        "get", help=f"Get the name of the active {singular_display_name}."
    )(get_command)

    # zenml stack-component describe
    describe_command = generate_stack_component_describe_command(
        component_type
    )
    command_group.command(
        "describe",
        help=f"Show details about the (active) {singular_display_name}.",
    )(describe_command)

    # zenml stack-component list
    list_command = generate_stack_component_list_command(component_type)
    command_group.command(
        "list", help=f"List all registered {plural_display_name}."
    )(list_command)

    # zenml stack-component register
    register_command = generate_stack_component_register_command(
        component_type
    )
    context_settings = {"ignore_unknown_options": True}
    command_group.command(
        "register",
        context_settings=context_settings,
        help=f"Register a new {singular_display_name}.",
    )(register_command)

    # zenml stack-component update
    update_command = generate_stack_component_update_command(component_type)
    context_settings = {"ignore_unknown_options": True}
    command_group.command(
        "update",
        context_settings=context_settings,
        help=f"Update a registered {singular_display_name}.",
    )(update_command)

    # zenml stack-component remove-attribute
    remove_attribute_command = (
        generate_stack_component_remove_attribute_command(component_type)
    )
    context_settings = {"ignore_unknown_options": True}
    command_group.command(
        "remove-attribute",
        context_settings=context_settings,
        help=f"Remove attributes from a registered {singular_display_name}.",
    )(remove_attribute_command)

    # zenml stack-component rename
    rename_command = generate_stack_component_rename_command(component_type)
    command_group.command(
        "rename", help=f"Rename a registered {singular_display_name}."
    )(rename_command)

    # zenml stack-component delete
    delete_command = generate_stack_component_delete_command(component_type)
    command_group.command(
        "delete", help=f"Delete a registered {singular_display_name}."
    )(delete_command)

    # zenml stack-component copy
    copy_command = generate_stack_component_copy_command(component_type)
    command_group.command(
        "copy", help=f"Copy a registered {singular_display_name}."
    )(copy_command)

    # zenml stack-component logs
    logs_command = generate_stack_component_logs_command(component_type)
    command_group.command(
        "logs", help=f"Display {singular_display_name} logs."
    )(logs_command)

    # zenml stack-component connect
    connect_command = generate_stack_component_connect_command(component_type)
    command_group.command(
        "connect",
        help=f"Connect {singular_display_name} to a service connector.",
    )(connect_command)

    # zenml stack-component disconnect
    disconnect_command = generate_stack_component_disconnect_command(
        component_type
    )
    command_group.command(
        "disconnect",
        help=f"Disconnect {singular_display_name} from a service connector.",
    )(disconnect_command)

    # zenml stack-component explain
    explain_command = generate_stack_component_explain_command(component_type)
    command_group.command(
        "explain", help=f"Explaining the {plural_display_name}."
    )(explain_command)

    # zenml stack-component flavor
    @command_group.group(
        "flavor", help=f"Commands to interact with {plural_display_name}."
    )
    def flavor_group() -> None:
        """Group commands to handle flavors for a stack component type."""

    # zenml stack-component flavor register
    register_flavor_command = generate_stack_component_flavor_register_command(
        component_type=component_type
    )
    flavor_group.command(
        "register",
        help=f"Register a new {singular_display_name} flavor.",
    )(register_flavor_command)

    # zenml stack-component flavor list
    list_flavor_command = generate_stack_component_flavor_list_command(
        component_type=component_type
    )
    flavor_group.command(
        "list",
        help=f"List all registered flavors for {plural_display_name}.",
    )(list_flavor_command)

    # zenml stack-component flavor describe
    describe_flavor_command = generate_stack_component_flavor_describe_command(
        component_type=component_type
    )
    flavor_group.command(
        "describe",
        help=f"Describe a {singular_display_name} flavor.",
    )(describe_flavor_command)

    # zenml stack-component flavor delete
    delete_flavor_command = generate_stack_component_flavor_delete_command(
        component_type=component_type
    )
    flavor_group.command(
        "delete",
        help=f"Delete a {plural_display_name} flavor.",
    )(delete_flavor_command)

register_stack(stack_name, artifact_store=None, orchestrator=None, container_registry=None, model_registry=None, step_operator=None, feature_store=None, model_deployer=None, experiment_tracker=None, alerter=None, annotator=None, data_validator=None, image_builder=None, set_stack=False, provider=None, connector=None)

Register a stack.

Parameters:

  • stack_name (str, required): Unique name of the stack.
  • artifact_store (Optional[str], default None): Name of the artifact store for this stack.
  • orchestrator (Optional[str], default None): Name of the orchestrator for this stack.
  • container_registry (Optional[str], default None): Name of the container registry for this stack.
  • model_registry (Optional[str], default None): Name of the model registry for this stack.
  • step_operator (Optional[str], default None): Name of the step operator for this stack.
  • feature_store (Optional[str], default None): Name of the feature store for this stack.
  • model_deployer (Optional[str], default None): Name of the model deployer for this stack.
  • experiment_tracker (Optional[str], default None): Name of the experiment tracker for this stack.
  • alerter (Optional[str], default None): Name of the alerter for this stack.
  • annotator (Optional[str], default None): Name of the annotator for this stack.
  • data_validator (Optional[str], default None): Name of the data validator for this stack.
  • image_builder (Optional[str], default None): Name of the new image builder for this stack.
  • set_stack (bool, default False): Immediately set this stack as active.
  • provider (Optional[str], default None): Name of the cloud provider for this stack.
  • connector (Optional[str], default None): Name of the service connector for this stack.
Source code in src/zenml/cli/stack.py
@stack.command(
    "register",
    context_settings=dict(ignore_unknown_options=True),
    help="Register a stack with components.",
)
@click.argument("stack_name", type=str, required=True)
@click.option(
    "-a",
    "--artifact-store",
    "artifact_store",
    help="Name of the artifact store for this stack.",
    type=str,
    required=False,
)
@click.option(
    "-o",
    "--orchestrator",
    "orchestrator",
    help="Name of the orchestrator for this stack.",
    type=str,
    required=False,
)
@click.option(
    "-c",
    "--container_registry",
    "container_registry",
    help="Name of the container registry for this stack.",
    type=str,
    required=False,
)
@click.option(
    "-r",
    "--model_registry",
    "model_registry",
    help="Name of the model registry for this stack.",
    type=str,
    required=False,
)
@click.option(
    "-s",
    "--step_operator",
    "step_operator",
    help="Name of the step operator for this stack.",
    type=str,
    required=False,
)
@click.option(
    "-f",
    "--feature_store",
    "feature_store",
    help="Name of the feature store for this stack.",
    type=str,
    required=False,
)
@click.option(
    "-d",
    "--model_deployer",
    "model_deployer",
    help="Name of the model deployer for this stack.",
    type=str,
    required=False,
)
@click.option(
    "-e",
    "--experiment_tracker",
    "experiment_tracker",
    help="Name of the experiment tracker for this stack.",
    type=str,
    required=False,
)
@click.option(
    "-al",
    "--alerter",
    "alerter",
    help="Name of the alerter for this stack.",
    type=str,
    required=False,
)
@click.option(
    "-an",
    "--annotator",
    "annotator",
    help="Name of the annotator for this stack.",
    type=str,
    required=False,
)
@click.option(
    "-dv",
    "--data_validator",
    "data_validator",
    help="Name of the data validator for this stack.",
    type=str,
    required=False,
)
@click.option(
    "-i",
    "--image_builder",
    "image_builder",
    help="Name of the image builder for this stack.",
    type=str,
    required=False,
)
@click.option(
    "--set",
    "set_stack",
    is_flag=True,
    help="Immediately set this stack as active.",
    type=click.BOOL,
)
@click.option(
    "-p",
    "--provider",
    help="Name of the cloud provider for this stack.",
    type=click.Choice(["aws", "azure", "gcp"]),
    required=False,
)
@click.option(
    "-sc",
    "--connector",
    help="Name of the service connector for this stack.",
    type=str,
    required=False,
)
def register_stack(
    stack_name: str,
    artifact_store: Optional[str] = None,
    orchestrator: Optional[str] = None,
    container_registry: Optional[str] = None,
    model_registry: Optional[str] = None,
    step_operator: Optional[str] = None,
    feature_store: Optional[str] = None,
    model_deployer: Optional[str] = None,
    experiment_tracker: Optional[str] = None,
    alerter: Optional[str] = None,
    annotator: Optional[str] = None,
    data_validator: Optional[str] = None,
    image_builder: Optional[str] = None,
    set_stack: bool = False,
    provider: Optional[str] = None,
    connector: Optional[str] = None,
) -> None:
    """Register a stack.

    Args:
        stack_name: Unique name of the stack
        artifact_store: Name of the artifact store for this stack.
        orchestrator: Name of the orchestrator for this stack.
        container_registry: Name of the container registry for this stack.
        model_registry: Name of the model registry for this stack.
        step_operator: Name of the step operator for this stack.
        feature_store: Name of the feature store for this stack.
        model_deployer: Name of the model deployer for this stack.
        experiment_tracker: Name of the experiment tracker for this stack.
        alerter: Name of the alerter for this stack.
        annotator: Name of the annotator for this stack.
        data_validator: Name of the data validator for this stack.
        image_builder: Name of the image builder for this stack.
        set_stack: Immediately set this stack as active.
        provider: Name of the cloud provider for this stack.
        connector: Name of the service connector for this stack.
    """
    if (provider is None and connector is None) and (
        artifact_store is None or orchestrator is None
    ):
        cli_utils.error(
            "The only way to register a stack without specifying an "
            "orchestrator and an artifact store is by using either a provider"
            "(-p/--provider) or an existing service connector "
            "(-sc/--connector). Please specify the artifact store and "
            "the orchestrator or the service connector or cloud type settings."
        )

    client = Client()

    if provider is not None or connector is not None:
        if client.zen_store.is_local_store():
            cli_utils.error(
                "You are registering a stack using a service connector, but "
                "this feature cannot be used with a local ZenML deployment. "
                "ZenML needs to be accessible from the cloud provider to allow "
                "the stack and its components to be registered automatically. "
                "Please deploy ZenML in a remote environment as described in "
                "the documentation: https://docs.zenml.io/getting-started/deploying-zenml "
                "or use a managed ZenML Pro server instance for quick access "
                "to this feature and more: https://www.zenml.io/pro"
            )

    try:
        client.get_stack(
            name_id_or_prefix=stack_name,
            allow_name_prefix_match=False,
        )
        cli_utils.error(
            f"A stack with name `{stack_name}` already exists, "
            "please use a different name."
        )
    except KeyError:
        pass

    labels: Dict[str, str] = {}
    components: Dict[StackComponentType, List[Union[UUID, ComponentInfo]]] = {}

    # Cloud Flow
    created_objects: Set[str] = set()
    service_connector: Optional[Union[UUID, ServiceConnectorInfo]] = None
    if provider is not None and connector is None:
        service_connector_response = None
        use_auto_configure = False
        try:
            service_connector_response, _ = client.create_service_connector(
                name=stack_name,
                connector_type=provider,
                register=False,
                auto_configure=True,
                verify=False,
            )
        except NotImplementedError:
            cli_utils.warning(
                f"The {provider.upper()} service connector libraries are not "
                "installed properly. Please run `zenml integration install "
                f"{provider}` and try again to enable auto-discovery of the "
                "connection configuration."
            )
        except Exception:
            pass

        if service_connector_response:
            use_auto_configure = Confirm.ask(
                f"[bold]{provider.upper()} cloud service connector[/bold] "
                "has detected connection credentials in your environment.\n"
                "Would you like to use these credentials or create a new "
                "configuration by providing connection details?",
                default=True,
                show_choices=True,
                show_default=True,
            )

        connector_selected: Optional[int] = None
        if not use_auto_configure:
            service_connector_response = None
            existing_connectors = client.list_service_connectors(
                connector_type=provider, size=100
            )
            if existing_connectors.total:
                connector_selected = cli_utils.multi_choice_prompt(
                    object_type=f"{provider.upper()} service connectors",
                    choices=[
                        [connector.name]
                        for connector in existing_connectors.items
                    ],
                    headers=["Name"],
                    prompt_text=f"We found these {provider.upper()} service "
                    "connectors. Do you want to create a new one or use one "
                    "of the existing ones?",
                    default_choice="0",
                    allow_zero_be_a_new_object=True,
                )
        if use_auto_configure or connector_selected is None:
            service_connector = _get_service_connector_info(
                cloud_provider=provider,
                connector_details=service_connector_response,
            )
            created_objects.add("service_connector")
        else:
            selected_connector = existing_connectors.items[connector_selected]
            service_connector = selected_connector.id
            connector = selected_connector.name
            if isinstance(selected_connector.connector_type, str):
                provider = selected_connector.connector_type
            else:
                provider = selected_connector.connector_type.connector_type
    elif connector is not None:
        service_connector_response = client.get_service_connector(connector)
        service_connector = service_connector_response.id
        if provider:
            if service_connector_response.type != provider:
                cli_utils.warning(
                    f"The service connector `{connector}` is not of type `{provider}`."
                )
        else:
            provider = service_connector_response.type

    if service_connector:
        labels["zenml:wizard"] = "true"
        if provider:
            labels["zenml:provider"] = provider
        resources_info = None
        # explore the service connector
        with console.status(
            "Exploring resources available to the service connector...\n"
        ):
            resources_info = (
                get_resources_options_from_resource_model_for_full_stack(
                    connector_details=service_connector
                )
            )
        if resources_info is None:
            cli_utils.error(
                f"Failed to fetch service connector resources information for {service_connector}..."
            )

        # create components
        needed_components = (
            (StackComponentType.ARTIFACT_STORE, artifact_store),
            (StackComponentType.ORCHESTRATOR, orchestrator),
            (StackComponentType.CONTAINER_REGISTRY, container_registry),
        )
        for component_type, preset_name in needed_components:
            component_info: Optional[Union[UUID, ComponentInfo]] = None
            if preset_name is not None:
                component_response = client.get_stack_component(
                    component_type, preset_name
                )
                component_info = component_response.id
                component_name = component_response.name
            else:
                if isinstance(service_connector, UUID):
                    # find existing components under same connector
                    if (
                        component_type
                        in resources_info.components_resources_info
                    ):
                        existing_components = [
                            existing_response
                            for res_info in resources_info.components_resources_info[
                                component_type
                            ]
                            for existing_response in res_info.connected_through_service_connector
                        ]

                        # if some existing components are found - prompt user what to do
                        component_selected: Optional[int] = None
                        component_selected = cli_utils.multi_choice_prompt(
                            object_type=component_type.value.replace("_", " "),
                            choices=[
                                [
                                    component.flavor_name,
                                    component.name,
                                    component.configuration or "",
                                    component.connector_resource_id,
                                ]
                                for component in existing_components
                            ],
                            headers=[
                                "Type",
                                "Name",
                                "Configuration",
                                "Connected as",
                            ],
                            prompt_text=f"We found these {component_type.value.replace('_', ' ')} "
                            "connected using the current service connector. Do you "
                            "want to create a new one or use existing one?",
                            default_choice="0",
                            allow_zero_be_a_new_object=True,
                        )
                else:
                    component_selected = None

                if component_selected is None:
                    component_info = _get_stack_component_info(
                        component_type=component_type.value,
                        cloud_provider=provider
                        or resources_info.connector_type,
                        resources_info=resources_info,
                        service_connector_index=0,
                    )
                    component_name = stack_name
                    created_objects.add(component_type.value)
                else:
                    selected_component = existing_components[
                        component_selected
                    ]
                    component_info = selected_component.id
                    component_name = selected_component.name

            components[component_type] = [component_info]
            if component_type == StackComponentType.ARTIFACT_STORE:
                artifact_store = component_name
            if component_type == StackComponentType.ORCHESTRATOR:
                orchestrator = component_name
            if component_type == StackComponentType.CONTAINER_REGISTRY:
                container_registry = component_name

    # normal flow once all components are defined
    with console.status(f"Registering stack '{stack_name}'...\n"):
        for component_type_, component_name_ in [
            (StackComponentType.ARTIFACT_STORE, artifact_store),
            (StackComponentType.ORCHESTRATOR, orchestrator),
            (StackComponentType.ALERTER, alerter),
            (StackComponentType.ANNOTATOR, annotator),
            (StackComponentType.DATA_VALIDATOR, data_validator),
            (StackComponentType.FEATURE_STORE, feature_store),
            (StackComponentType.IMAGE_BUILDER, image_builder),
            (StackComponentType.MODEL_DEPLOYER, model_deployer),
            (StackComponentType.MODEL_REGISTRY, model_registry),
            (StackComponentType.STEP_OPERATOR, step_operator),
            (StackComponentType.EXPERIMENT_TRACKER, experiment_tracker),
            (StackComponentType.CONTAINER_REGISTRY, container_registry),
        ]:
            if component_name_ and component_type_ not in components:
                components[component_type_] = [
                    client.get_stack_component(
                        component_type_, component_name_
                    ).id
                ]

        try:
            created_stack = client.zen_store.create_stack(
                stack=StackRequest(
                    user=client.active_user.id,
                    workspace=client.active_workspace.id,
                    name=stack_name,
                    components=components,
                    service_connectors=[service_connector]
                    if service_connector
                    else [],
                    labels=labels,
                )
            )
        except (KeyError, IllegalOperationError) as err:
            cli_utils.error(str(err))

        cli_utils.declare(
            f"Stack '{created_stack.name}' successfully registered!"
        )
        cli_utils.print_stack_configuration(
            stack=created_stack,
            active=created_stack.id == client.active_stack_model.id,
        )

    if set_stack:
        client.activate_stack(created_stack.id)

        scope = "repository" if client.uses_local_configuration else "global"
        cli_utils.declare(
            f"Active {scope} stack set to:'{created_stack.name}'"
        )

    delete_commands = []
    if "service_connector" in created_objects:
        created_objects.remove("service_connector")
        connectors = set()
        for each in created_objects:
            if comps_ := created_stack.components[StackComponentType(each)]:
                if conn_ := comps_[0].connector:
                    connectors.add(conn_.name)
        for connector in connectors:
            delete_commands.append(
                "zenml service-connector delete " + connector
            )
    for each in created_objects:
        if comps_ := created_stack.components[StackComponentType(each)]:
            delete_commands.append(
                f"zenml {each.replace('_', '-')} delete {comps_[0].name}"
            )
    delete_commands.append("zenml stack delete -y " + created_stack.name)

    Console().print(
        "To delete the objects created by this command run, please run in a sequence:\n"
    )
    Console().print(Syntax("\n".join(delete_commands[::-1]), "bash"))

    print_model_url(get_stack_url(created_stack))
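
For illustration, a minimal usage sketch; the stack and component names below are placeholders for objects you have already registered:

    zenml stack register my_stack -a my_artifact_store -o my_orchestrator --set

Alternatively, passing only a cloud provider (for example `-p aws`) triggers the interactive wizard flow shown above, which requires a remote ZenML deployment.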

register_tag(name, color)

Register a new tag.

Parameters:

    name (str, required): The name of the tag.
    color (Optional[ColorVariants], required): The color variant for UI.

Source code in src/zenml/cli/tag.py
@tag.command("register", help="Register a new tag.")
@click.option(
    "--name",
    "-n",
    help="The name of the tag.",
    type=str,
    required=True,
)
@click.option(
    "--color",
    "-c",
    help="The color variant for UI.",
    type=click.Choice(choices=ColorVariants.values()),
    required=False,
)
def register_tag(name: str, color: Optional[ColorVariants]) -> None:
    """Register a new model in the Model Control Plane.

    Args:
        name: The name of the tag.
        color: The color variant for UI.
    """
    request_dict = remove_none_values(dict(name=name, color=color))
    try:
        tag = Client().create_tag(TagRequest(**request_dict))
    except (EntityExistsError, ValueError) as e:
        cli_utils.error(str(e))

    cli_utils.print_pydantic_models(
        [tag],
        exclude_columns=["created"],
    )
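
For illustration, a minimal usage sketch; the tag name is a placeholder, and `--color` may optionally be added with one of the supported color variants:

    zenml tag register --name releases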

remove_none_values(dict_, recursive=False)

Removes all key-value pairs with None value.

Parameters:

    dict_ (Dict[str, Any], required): The dict from which the key-value pairs should be removed.
    recursive (bool, default False): If True, will recursively remove None values in all child dicts.

Returns:

    Dict[str, Any]: The updated dictionary.

Source code in src/zenml/utils/dict_utils.py
def remove_none_values(
    dict_: Dict[str, Any], recursive: bool = False
) -> Dict[str, Any]:
    """Removes all key-value pairs with `None` value.

    Args:
        dict_: The dict from which the key-value pairs should be removed.
        recursive: If `True`, will recursively remove `None` values in all
            child dicts.

    Returns:
        The updated dictionary.
    """

    def _maybe_recurse(value: Any) -> Any:
        """Calls `remove_none_values` recursively if required.

        Args:
            value: A dictionary value.

        Returns:
            The updated dictionary value.
        """
        if recursive and isinstance(value, Dict):
            return remove_none_values(value, recursive=True)
        else:
            return value

    return {k: _maybe_recurse(v) for k, v in dict_.items() if v is not None}

remove_stack_component(stack_name_or_id=None, container_registry_flag=False, step_operator_flag=False, feature_store_flag=False, model_deployer_flag=False, experiment_tracker_flag=False, alerter_flag=False, annotator_flag=False, data_validator_flag=False, image_builder_flag=False, model_registry_flag=False)

Remove stack components from a stack.

Parameters:

    stack_name_or_id (Optional[str], default None): Name of the stack to remove components from.
    container_registry_flag (Optional[bool], default False): To remove the container registry from this stack.
    step_operator_flag (Optional[bool], default False): To remove the step operator from this stack.
    feature_store_flag (Optional[bool], default False): To remove the feature store from this stack.
    model_deployer_flag (Optional[bool], default False): To remove the model deployer from this stack.
    experiment_tracker_flag (Optional[bool], default False): To remove the experiment tracker from this stack.
    alerter_flag (Optional[bool], default False): To remove the alerter from this stack.
    annotator_flag (Optional[bool], default False): To remove the annotator from this stack.
    data_validator_flag (Optional[bool], default False): To remove the data validator from this stack.
    image_builder_flag (Optional[bool], default False): To remove the image builder from this stack.
    model_registry_flag (Optional[bool], default False): To remove the model registry from this stack.

Source code in src/zenml/cli/stack.py
@stack.command(
    "remove-component",
    context_settings=dict(ignore_unknown_options=True),
    help="Remove stack components from a stack.",
)
@click.argument("stack_name_or_id", type=str, required=False)
@click.option(
    "-c",
    "--container_registry",
    "container_registry_flag",
    help="Include this to remove the container registry from this stack.",
    is_flag=True,
    required=False,
)
@click.option(
    "-s",
    "--step_operator",
    "step_operator_flag",
    help="Include this to remove the step operator from this stack.",
    is_flag=True,
    required=False,
)
@click.option(
    "-r",
    "--model_registry",
    "model_registry_flag",
    help="Include this to remove the model registry from this stack.",
    is_flag=True,
    required=False,
)
@click.option(
    "-f",
    "--feature_store",
    "feature_store_flag",
    help="Include this to remove the feature store from this stack.",
    is_flag=True,
    required=False,
)
@click.option(
    "-d",
    "--model_deployer",
    "model_deployer_flag",
    help="Include this to remove the model deployer from this stack.",
    is_flag=True,
    required=False,
)
@click.option(
    "-e",
    "--experiment_tracker",
    "experiment_tracker_flag",
    help="Include this to remove the experiment tracker from this stack.",
    is_flag=True,
    required=False,
)
@click.option(
    "-al",
    "--alerter",
    "alerter_flag",
    help="Include this to remove the alerter from this stack.",
    is_flag=True,
    required=False,
)
@click.option(
    "-an",
    "--annotator",
    "annotator_flag",
    help="Include this to remove the annotator from this stack.",
    is_flag=True,
    required=False,
)
@click.option(
    "-dv",
    "--data_validator",
    "data_validator_flag",
    help="Include this to remove the data validator from this stack.",
    is_flag=True,
    required=False,
)
@click.option(
    "-i",
    "--image_builder",
    "image_builder_flag",
    help="Include this to remove the image builder from this stack.",
    is_flag=True,
    required=False,
)
def remove_stack_component(
    stack_name_or_id: Optional[str] = None,
    container_registry_flag: Optional[bool] = False,
    step_operator_flag: Optional[bool] = False,
    feature_store_flag: Optional[bool] = False,
    model_deployer_flag: Optional[bool] = False,
    experiment_tracker_flag: Optional[bool] = False,
    alerter_flag: Optional[bool] = False,
    annotator_flag: Optional[bool] = False,
    data_validator_flag: Optional[bool] = False,
    image_builder_flag: Optional[bool] = False,
    model_registry_flag: Optional[bool] = False,
) -> None:
    """Remove stack components from a stack.

    Args:
        stack_name_or_id: Name of the stack to remove components from.
        container_registry_flag: To remove the container registry from this
            stack.
        step_operator_flag: To remove the step operator from this stack.
        feature_store_flag: To remove the feature store from this stack.
        model_deployer_flag: To remove the model deployer from this stack.
        experiment_tracker_flag: To remove the experiment tracker from this
            stack.
        alerter_flag: To remove the alerter from this stack.
        annotator_flag: To remove the annotator from this stack.
        data_validator_flag: To remove the data validator from this stack.
        image_builder_flag: To remove the image builder from this stack.
        model_registry_flag: To remove the model registry from this stack.
    """
    client = Client()

    with console.status("Updating the stack...\n"):
        stack_component_update: Dict[StackComponentType, List[Any]] = dict()

        if container_registry_flag:
            stack_component_update[StackComponentType.CONTAINER_REGISTRY] = []

        if step_operator_flag:
            stack_component_update[StackComponentType.STEP_OPERATOR] = []

        if feature_store_flag:
            stack_component_update[StackComponentType.FEATURE_STORE] = []

        if model_deployer_flag:
            stack_component_update[StackComponentType.MODEL_DEPLOYER] = []

        if experiment_tracker_flag:
            stack_component_update[StackComponentType.EXPERIMENT_TRACKER] = []

        if alerter_flag:
            stack_component_update[StackComponentType.ALERTER] = []

        if model_registry_flag:
            stack_component_update[StackComponentType.MODEL_REGISTRY] = []

        if annotator_flag:
            stack_component_update[StackComponentType.ANNOTATOR] = []

        if data_validator_flag:
            stack_component_update[StackComponentType.DATA_VALIDATOR] = []

        if image_builder_flag:
            stack_component_update[StackComponentType.IMAGE_BUILDER] = []

        try:
            updated_stack = client.update_stack(
                name_id_or_prefix=stack_name_or_id,
                component_updates=stack_component_update,
            )
        except (KeyError, IllegalOperationError) as err:
            cli_utils.error(str(err))
        cli_utils.declare(
            f"Stack `{updated_stack.name}` successfully updated!"
        )
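
For illustration, a usage sketch that strips the experiment tracker and model deployer from a stack; the stack name is a placeholder:

    zenml stack remove-component my_stack -e -d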

rename_secret(name_or_id, new_name)

Update a secret for a given name or id.

Parameters:

    name_or_id (str, required): The name or id of the secret to update.
    new_name (str, required): The new name of the secret.

Source code in src/zenml/cli/secret.py
@secret.command(
    "rename",
    context_settings={"ignore_unknown_options": True},
    help="Rename a secret with a given name or id.",
)
@click.argument(
    "name_or_id",
    type=click.STRING,
)
@click.option(
    "--new-name",
    "-n",
    type=click.STRING,
)
def rename_secret(
    name_or_id: str,
    new_name: str,
) -> None:
    """Update a secret for a given name or id.

    Args:
        name_or_id: The name or id of the secret to update.
        new_name: The new name of the secret.
    """
    if new_name == "name":
        error("Your secret cannot be called 'name'.")

    client = Client()

    with console.status(f"Checking secret `{name_or_id}`..."):
        try:
            client.get_secret(name_id_or_prefix=name_or_id)
        except KeyError as e:
            error(
                f"Secret with name `{name_or_id}` does not exist or could not "
                f"be loaded: {str(e)}."
            )
        except NotImplementedError as e:
            error(f"Centralized secrets management is disabled: {str(e)}")

    client.update_secret(
        name_id_or_prefix=name_or_id,
        new_name=new_name,
    )
    declare(f"Secret '{name_or_id}' successfully renamed to '{new_name}'.")

rename_stack(stack_name_or_id, new_stack_name)

Rename a stack.

Parameters:

    stack_name_or_id (str, required): Name of the stack to rename.
    new_stack_name (str, required): New name of the stack.

Source code in src/zenml/cli/stack.py
@stack.command("rename", help="Rename a stack.")
@click.argument("stack_name_or_id", type=str, required=True)
@click.argument("new_stack_name", type=str, required=True)
def rename_stack(
    stack_name_or_id: str,
    new_stack_name: str,
) -> None:
    """Rename a stack.

    Args:
        stack_name_or_id: Name of the stack to rename.
        new_stack_name: New name of the stack.
    """
    client = Client()

    with console.status("Renaming stack...\n"):
        try:
            stack_ = client.update_stack(
                name_id_or_prefix=stack_name_or_id,
                name=new_stack_name,
            )
        except (KeyError, IllegalOperationError) as err:
            cli_utils.error(str(err))
        cli_utils.declare(
            f"Stack `{stack_name_or_id}` successfully renamed to `"
            f"{new_stack_name}`!"
        )

    print_model_url(get_stack_url(stack_))
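
For illustration, a usage sketch with placeholder stack names:

    zenml stack rename my_stack my_renamed_stack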

restore_database(strategy=None, location=None, cleanup=False)

Restore the ZenML database.

Parameters:

    strategy (Optional[str], default None): Custom backup strategy to use. Defaults to whatever is configured in the store config.
    location (Optional[str], default None): Custom location where the backup is stored. Defaults to whatever is configured in the store config. Depending on the strategy, this can be a local path or a database name.
    cleanup (bool, default False): Whether to cleanup the backup after restoring.

Source code in src/zenml/cli/base.py
@cli.command(
    "restore-database", help="Restore the database from a backup.", hidden=True
)
@click.option(
    "--strategy",
    "-s",
    help="Custom backup strategy to use. Defaults to whatever is configured "
    "in the store config.",
    type=click.Choice(choices=DatabaseBackupStrategy.values()),
    required=False,
    default=None,
)
@click.option(
    "--location",
    default=None,
    help="Custom location where the backup is stored. Defaults to whatever is "
    "configured in the store config. Depending on the strategy, this can be "
    "a local path or a database name.",
    type=str,
)
@click.option(
    "--cleanup",
    "-c",
    is_flag=True,
    default=False,
    help="Cleanup the backup after restoring.",
    type=bool,
)
def restore_database(
    strategy: Optional[str] = None,
    location: Optional[str] = None,
    cleanup: bool = False,
) -> None:
    """Restore the ZenML database.

    Args:
        strategy: Custom backup strategy to use. Defaults to whatever is
            configured in the store config.
        location: Custom location where the backup is stored. Defaults to
            whatever is configured in the store config. Depending on the
            strategy, this can be a local path or a database name.
        cleanup: Whether to cleanup the backup after restoring.
    """
    from zenml.zen_stores.base_zen_store import BaseZenStore
    from zenml.zen_stores.sql_zen_store import SqlZenStore

    store_config = GlobalConfiguration().store_configuration
    if store_config.type == StoreType.SQL:
        store = BaseZenStore.create_store(
            store_config, skip_default_registrations=True, skip_migrations=True
        )
        assert isinstance(store, SqlZenStore)
        store.restore_database(
            strategy=DatabaseBackupStrategy(strategy) if strategy else None,
            location=location,
            cleanup=cleanup,
        )
        cli_utils.declare("Database restore finished.")
    else:
        cli_utils.warning(
            "Cannot restore database while connected to a ZenML server."
        )
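
For illustration, a usage sketch assuming a direct SQL store and a placeholder backup location (a local path or database name, depending on the configured strategy):

    zenml restore-database --location /path/to/backup --cleanup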

restore_secrets(ignore_errors=False, delete_secrets=False)

Restore all secrets from the backup secrets store.

Parameters:

    ignore_errors (bool, default False): Whether to ignore individual errors when restoring secrets and continue with the restore operation until all secrets have been restored.
    delete_secrets (bool, default False): Whether to delete the secrets that have been successfully restored from the backup secrets store. Setting this flag effectively moves all secrets from the backup secrets store to the primary secrets store.

Source code in src/zenml/cli/secret.py
@secret.command(
    "restore", help="Restore all secrets from the backup secrets store."
)
@click.option(
    "--ignore-errors",
    "-i",
    type=click.BOOL,
    default=False,
    help="Whether to ignore individual errors when backing up secrets and "
    "continue with the backup operation until all secrets have been backed up.",
)
@click.option(
    "--delete-secrets",
    "-d",
    is_flag=True,
    default=False,
    help="Whether to delete the secrets that have been successfully restored "
    "from the backup secrets store. Setting this flag effectively moves all "
    "secrets from the backup secrets store to the primary secrets store.",
)
def restore_secrets(
    ignore_errors: bool = False, delete_secrets: bool = False
) -> None:
    """Backup all secrets to the backup secrets store.

    Args:
        ignore_errors: Whether to ignore individual errors when restoring
            secrets and continue with the restore operation until all secrets
            have been restored.
        delete_secrets: Whether to delete the secrets that have been
            successfully restored from the backup secrets store. Setting
            this flag effectively moves all secrets from the backup secrets
            store to the primary secrets store.
    """
    client = Client()

    with console.status("Restoring secrets from backup..."):
        try:
            client.restore_secrets(
                ignore_errors=ignore_errors, delete_secrets=delete_secrets
            )
            declare("Secrets successfully restored.")
        except NotImplementedError as e:
            error(f"Could not restore secrets: {str(e)}")

rotate_api_key(service_account_name_or_id, name_or_id, retain=0, set_key=False, output_file=None)

Rotate an API key.

Parameters:

    service_account_name_or_id (str, required): The name or ID of the service account to which the API key belongs.
    name_or_id (str, required): The name or ID of the API key to rotate.
    retain (int, default 0): Number of minutes for which the previous key is still valid after it has been rotated.
    set_key (bool, default False): Configure the local client with the newly generated key.
    output_file (Optional[str], default None): Output file to write the API key to.

Source code in src/zenml/cli/service_accounts.py
@api_key.command("rotate", help="Rotate an API key.")
@click.argument("name_or_id", type=str, required=True)
@click.option(
    "--retain",
    type=int,
    required=False,
    default=0,
    help="Number of minutes for which the previous key is still valid after it "
    "has been rotated.",
)
@click.option(
    "--set-key",
    is_flag=True,
    help="Configure the local client with the generated key.",
)
@click.option(
    "--output-file",
    type=str,
    required=False,
    help="File to write the API key to.",
)
@click.pass_obj
def rotate_api_key(
    service_account_name_or_id: str,
    name_or_id: str,
    retain: int = 0,
    set_key: bool = False,
    output_file: Optional[str] = None,
) -> None:
    """Rotate an API key.

    Args:
        service_account_name_or_id: The name or ID of the service account to
            which the API key belongs.
        name_or_id: The name or ID of the API key to rotate.
        retain: Number of minutes for which the previous key is still valid
            after it has been rotated.
        set_key: Configure the local client with the newly generated key.
        output_file: Output file to write the API key to.
    """
    client = Client()
    zen_store = client.zen_store

    try:
        api_key = client.rotate_api_key(
            service_account_name_id_or_prefix=service_account_name_or_id,
            name_id_or_prefix=name_or_id,
            retain_period_minutes=retain,
        )
    except KeyError as e:
        cli_utils.error(str(e))

    cli_utils.declare(f"Successfully rotated API key `{name_or_id}`.")
    if retain:
        cli_utils.declare(
            f"The previous API key will remain valid for {retain} minutes."
        )

    if set_key and api_key.key:
        if zen_store.TYPE != StoreType.REST:
            cli_utils.warning(
                "Could not configure the local ZenML client with the generated "
                "API key. This type of authentication is only supported if "
                "connected to a ZenML server."
            )
        else:
            client.set_api_key(api_key.key)
            cli_utils.declare(
                "The local client has been configured with the new API key."
            )
            return

    if output_file and api_key.key:
        with open(output_file, "w") as f:
            f.write(api_key.key)

        cli_utils.declare(f"Wrote API key value to {output_file}")
    else:
        cli_utils.declare(
            f"The new API key value is: '{api_key.key}'\nPlease store it "
            "safely as it will not be shown again.\nTo configure a ZenML "
            "client to use this API key, run:\n\n"
            f"zenml login {zen_store.config.url} --api-key \n\n"
            f"and enter the following API key when prompted: {api_key.key}\n"
        )
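
For illustration, a usage sketch with placeholder names, assuming the `api-key` group is invoked under its owning service account (the account name is supplied by the parent group via `click.pass_obj`, as shown above):

    zenml service-account api-key my_service_account rotate my_key --retain 30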

run_pipeline(source, config_path=None, stack_name_or_id=None, build_path_or_id=None, prevent_build_reuse=False)

Run a pipeline.

Parameters:

    source (str, required): Importable source resolving to a pipeline instance.
    config_path (Optional[str], default None): Path to pipeline configuration file.
    stack_name_or_id (Optional[str], default None): Name or ID of the stack on which the pipeline should run.
    build_path_or_id (Optional[str], default None): ID or file path of the build to use for the pipeline run.
    prevent_build_reuse (bool, default False): If True, prevents automatic reuse of previous builds.

Source code in src/zenml/cli/pipeline.py
@pipeline.command(
    "run",
    help="Run a pipeline. The SOURCE argument needs to be an "
    "importable source path resolving to a ZenML pipeline instance, e.g. "
    "`my_module.my_pipeline_instance`.",
)
@click.argument("source")
@click.option(
    "--config",
    "-c",
    "config_path",
    type=click.Path(exists=True, dir_okay=False),
    required=False,
    help="Path to configuration file for the run.",
)
@click.option(
    "--stack",
    "-s",
    "stack_name_or_id",
    type=str,
    required=False,
    help="Name or ID of the stack to run on.",
)
@click.option(
    "--build",
    "-b",
    "build_path_or_id",
    type=str,
    required=False,
    help="ID or path of the build to use.",
)
@click.option(
    "--prevent-build-reuse",
    is_flag=True,
    default=False,
    required=False,
    help="Prevent automatic build reusing.",
)
def run_pipeline(
    source: str,
    config_path: Optional[str] = None,
    stack_name_or_id: Optional[str] = None,
    build_path_or_id: Optional[str] = None,
    prevent_build_reuse: bool = False,
) -> None:
    """Run a pipeline.

    Args:
        source: Importable source resolving to a pipeline instance.
        config_path: Path to pipeline configuration file.
        stack_name_or_id: Name or ID of the stack on which the pipeline should
            run.
        build_path_or_id: ID or file path of the build to use for the pipeline
            run.
        prevent_build_reuse: If True, prevents automatic reusing of previous
            builds.
    """
    if not Client().root:
        cli_utils.warning(
            "You're running the `zenml pipeline run` command without a "
            "ZenML repository. Your current working directory will be used "
            "as the source root relative to which the registered step classes "
            "will be resolved. To silence this warning, run `zenml init` at "
            "your source code root."
        )

    with cli_utils.temporary_active_stack(stack_name_or_id=stack_name_or_id):
        pipeline_instance = _import_pipeline(source=source)

        build: Union[str, PipelineBuildBase, None] = None
        if build_path_or_id:
            if uuid_utils.is_valid_uuid(build_path_or_id):
                build = build_path_or_id
            elif os.path.exists(build_path_or_id):
                build = PipelineBuildBase.from_yaml(build_path_or_id)
            else:
                cli_utils.error(
                    f"The specified build {build_path_or_id} is not a valid UUID "
                    "or file path."
                )

        pipeline_instance = pipeline_instance.with_options(
            config_path=config_path,
            build=build,
            prevent_build_reuse=prevent_build_reuse,
        )
        pipeline_instance()
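
For illustration, a usage sketch with placeholder module, config, and stack names:

    zenml pipeline run my_module.my_pipeline_instance -c config.yaml -s my_stack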

runs()

Commands for pipeline runs.

Source code in src/zenml/cli/pipeline.py
@pipeline.group()
def runs() -> None:
    """Commands for pipeline runs."""

schedule()

Commands for pipeline run schedules.

Source code in src/zenml/cli/pipeline.py
@pipeline.group()
def schedule() -> None:
    """Commands for pipeline run schedules."""

seconds_to_human_readable(time_seconds)

Converts seconds to human-readable format.

Parameters:

    time_seconds (int, required): Seconds to convert.

Returns:

    str: Human readable string.

Source code in src/zenml/utils/time_utils.py
def seconds_to_human_readable(time_seconds: int) -> str:
    """Converts seconds to human-readable format.

    Args:
        time_seconds: Seconds to convert.

    Returns:
        Human readable string.
    """
    seconds = time_seconds % 60
    minutes = (time_seconds // 60) % 60
    hours = (time_seconds // 3600) % 24
    days = time_seconds // 86400
    tokens = []
    if days:
        tokens.append(f"{days}d")
    if hours:
        tokens.append(f"{hours}h")
    if minutes:
        tokens.append(f"{minutes}m")
    if seconds:
        tokens.append(f"{seconds}s")

    return "".join(tokens)

secret()

Create, list, update, or delete secrets.

Source code in src/zenml/cli/secret.py
@cli.group(cls=TagGroup, tag=CliCategories.IDENTITY_AND_SECURITY)
def secret() -> None:
    """Create, list, update, or delete secrets."""

server()

Commands for managing ZenML servers.

Source code in src/zenml/cli/server.py
@cli.group(cls=TagGroup, tag=CliCategories.MANAGEMENT_TOOLS)
def server() -> None:
    """Commands for managing ZenML servers."""

server_list(verbose=False, all=False, pro_api_url=None)

List all ZenML servers that this client is authorized to access.

Parameters:

    verbose (bool, default False): Whether to show verbose output.
    all (bool, default False): Whether to show all ZenML servers.
    pro_api_url (Optional[str], default None): Custom URL for the ZenML Pro API.

Source code in src/zenml/cli/server.py
@server.command(
    "list",
    help="""List all ZenML servers that this client is authenticated to.

    The CLI can be authenticated to multiple ZenML servers at the same time,
    even though it can only be connected to one server at a time. You can list
    all the ZenML servers that the client is currently authenticated to by
    using this command.

    When logged in to ZenML Pro, this list will also include all ZenML Pro
    servers that the authenticated user can access or could potentially access,
    including details such as their current state and the organization they
    belong to.

    The complete list of servers displayed by this command includes the
    following:

      * ZenML Pro servers that the authenticated ZenML Pro user can or could
        access. The client needs to be logged in to ZenML Pro via
        `zenml login --pro` to access these servers.

      * ZenML servers that the client has logged in to via
        `zenml login --url` in the past.

      * the local ZenML server started with `zenml login --local`, if one is
        running.

    By default, this command does not display ZenML servers that are not
    accessible: servers that are not running, are no longer accessible due to
    an expired authentication and ZenML Pro servers where the user is not a
    member. To include these servers in the list, use the `--all` flag.
    """,
)
@click.option(
    "--verbose",
    "-v",
    is_flag=True,
    help="Show verbose output.",
)
@click.option(
    "--all",
    "-a",
    is_flag=True,
    help="Show all ZenML servers, including those that are not running "
    "and those with an expired authentication.",
)
@click.option(
    "--pro-api-url",
    type=str,
    default=None,
    help="Custom URL for the ZenML Pro API. Useful when disconnecting "
    "from a self-hosted ZenML Pro deployment.",
)
def server_list(
    verbose: bool = False,
    all: bool = False,
    pro_api_url: Optional[str] = None,
) -> None:
    """List all ZenML servers that this client is authorized to access.

    Args:
        verbose: Whether to show verbose output.
        all: Whether to show all ZenML servers.
        pro_api_url: Custom URL for the ZenML Pro API.
    """
    from zenml.login.credentials_store import get_credentials_store
    from zenml.login.pro.client import ZenMLProClient
    from zenml.login.pro.constants import ZENML_PRO_API_URL
    from zenml.login.pro.tenant.models import TenantRead, TenantStatus

    pro_api_url = pro_api_url or ZENML_PRO_API_URL
    pro_api_url = pro_api_url.rstrip("/")

    credentials_store = get_credentials_store()
    pro_token = credentials_store.get_pro_token(
        allow_expired=True, pro_api_url=pro_api_url
    )
    current_store_config = GlobalConfiguration().store_configuration

    # The list of ZenML Pro servers kept in the credentials store
    pro_servers = credentials_store.list_credentials(type=ServerType.PRO)
    # The list of regular remote ZenML servers kept in the credentials store
    servers = list(credentials_store.list_credentials(type=ServerType.REMOTE))
    # The list of local ZenML servers kept in the credentials store
    local_servers = list(
        credentials_store.list_credentials(type=ServerType.LOCAL)
    )

    if pro_token and not pro_token.expired:
        # If the ZenML Pro authentication is still valid, we include all ZenML
        # Pro servers that the current ZenML Pro user can access, even those
        # that the user has never connected to (and are therefore not stored in
        # the credentials store).

        accessible_pro_servers: List[TenantRead] = []
        try:
            client = ZenMLProClient(pro_api_url)
            accessible_pro_servers = client.tenant.list(member_only=not all)
        except AuthorizationException as e:
            cli_utils.warning(f"ZenML Pro authorization error: {e}")

        # We update the list of stored ZenML Pro servers with the ones that the
        # client is a member of
        for accessible_server in accessible_pro_servers:
            for idx, stored_server in enumerate(pro_servers):
                if stored_server.server_id == accessible_server.id:
                    # All ZenML Pro servers accessible by the current ZenML Pro
                    # user have an authentication that is valid at least until
                    # the current ZenML Pro authentication token expires.
                    stored_server.update_server_info(
                        accessible_server,
                    )
                    updated_server = stored_server.model_copy()
                    # Replace the current server API token with the current
                    # ZenML Pro API token to reflect the current authentication
                    # status.
                    updated_server.api_token = pro_token
                    pro_servers[idx] = updated_server
                    break
            else:
                stored_server = ServerCredentials(
                    url=accessible_server.url or "",
                    api_token=pro_token,
                )
                stored_server.update_server_info(accessible_server)
                pro_servers.append(stored_server)

        if not all:
            accessible_pro_servers = [
                s
                for s in accessible_pro_servers
                if s.status == TenantStatus.AVAILABLE
            ]

        if not accessible_pro_servers:
            cli_utils.declare(
                "No ZenML Pro servers that are accessible to the current "
                "user could be found."
            )
            if not all:
                cli_utils.declare(
                    "Hint: use the `--all` flag to show all ZenML servers, "
                    "including those that the client is not currently "
                    "authorized to access or are not running."
                )

    elif pro_servers:
        cli_utils.warning(
            "The ZenML Pro authentication has expired. Please re-login "
            "to ZenML Pro using `zenml login` to include all ZenML Pro servers "
            "that you are a member of in the list."
        )

    # We add the local server to the list of servers, if it is running
    local_server = get_local_server()
    if local_server:
        url = (
            local_server.status.url if local_server.status else None
        ) or local_server.config.url
        status = local_server.status.status if local_server.status else ""
        local_servers.append(
            ServerCredentials(
                url=url or "",
                status=status,
                version=zenml.__version__,
                server_id=GlobalConfiguration().user_id,
                server_name=f"local {local_server.config.provider} server",
            )
        )

    all_servers = pro_servers + local_servers + servers

    if not all:
        # Filter out servers that are expired or not running
        all_servers = [s for s in all_servers if s.is_available]

    if verbose:
        columns = [
            "type",
            "server_id_hyperlink",
            "server_name_hyperlink",
            "organization_hyperlink" if pro_servers else "",
            "version",
            "status",
            "dashboard_url",
            "api_hyperlink",
            "auth_status",
        ]
    elif all:
        columns = [
            "type",
            "server_id_hyperlink",
            "server_name_hyperlink",
            "organization_hyperlink" if pro_servers else "",
            "version",
            "status",
            "api_hyperlink",
        ]
    else:
        columns = [
            "type",
            "server_id_hyperlink" if pro_servers else "",
            "server_name_hyperlink",
            "organization_hyperlink" if pro_servers else "",
            "version",
            "api_hyperlink" if servers else "",
        ]

    # Remove empty columns
    columns = [c for c in columns if c]

    # Figure out if the client is already connected to one of the
    # servers in the list
    current_server: List[ServerCredentials] = []
    if current_store_config.type == StoreType.REST:
        current_server = [
            s for s in all_servers if s.url == current_store_config.url
        ]

    cli_utils.print_pydantic_models(  # type: ignore[type-var]
        all_servers,
        columns=columns,
        rename_columns={
            "server_name_hyperlink": "name",
            "server_id_hyperlink": "ID",
            "organization_hyperlink": "organization",
            "dashboard_url": "dashboard URL",
            "api_hyperlink": "API URL",
            "auth_status": "auth status",
        },
        active_models=current_server,
        show_active=True,
    )

service_account()

Commands for service account management.

Source code in src/zenml/cli/service_accounts.py
@cli.group(cls=TagGroup, tag=CliCategories.IDENTITY_AND_SECURITY)
def service_account() -> None:
    """Commands for service account management."""

service_connector()

Configure and manage service connectors.

Source code in src/zenml/cli/service_connectors.py
@cli.group(
    cls=TagGroup,
    tag=CliCategories.IDENTITY_AND_SECURITY,
)
def service_connector() -> None:
    """Configure and manage service connectors."""

set_active_stack_command(stack_name_or_id)

Sets a stack as active.

Parameters:

- stack_name_or_id (str, required): Name of the stack to set as active.
Source code in src/zenml/cli/stack.py
@stack.command("set", help="Sets a stack as active.")
@click.argument("stack_name_or_id", type=str)
def set_active_stack_command(stack_name_or_id: str) -> None:
    """Sets a stack as active.

    Args:
        stack_name_or_id: Name of the stack to set as active.
    """
    client = Client()
    scope = "repository" if client.uses_local_configuration else "global"

    with console.status(
        f"Setting the {scope} active stack to '{stack_name_or_id}'..."
    ):
        try:
            client.activate_stack(stack_name_id_or_prefix=stack_name_or_id)
        except KeyError as err:
            cli_utils.error(str(err))

        cli_utils.declare(
            f"Active {scope} stack set to: '{client.active_stack_model.name}'"
        )
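
For example, to make a stack named `default` the active stack (the stack name here is just a placeholder):

   zenml stack set default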

set_logging_verbosity(verbosity)

Set logging level.

Parameters:

- verbosity (str, required): The logging level.

Raises:

- KeyError: If the logging level is not supported.

Source code in src/zenml/cli/config.py
@logging.command("set-verbosity")
@click.argument(
    "verbosity",
    type=click.Choice(
        list(map(lambda x: x.name, LoggingLevels)), case_sensitive=False
    ),
)
def set_logging_verbosity(verbosity: str) -> None:
    """Set logging level.

    Args:
        verbosity: The logging level.

    Raises:
        KeyError: If the logging level is not supported.
    """
    verbosity = verbosity.upper()
    if verbosity not in LoggingLevels.__members__:
        raise KeyError(
            f"Verbosity must be one of {list(LoggingLevels.__members__.keys())}"
        )
    cli_utils.declare(f"Set verbosity to: {verbosity}")

show(local=False, ngrok_token=None)

Show the ZenML dashboard.

Parameters:

- local (bool, default False): Whether to show the ZenML dashboard for the local server.
- ngrok_token (Optional[str], default None): An ngrok auth token to use for exposing the ZenML dashboard on a public domain. Primarily used for accessing the local dashboard in Colab.
Source code in src/zenml/cli/server.py
@server.command(
    "show",
    help="Show the ZenML dashboard for the server that the client is connected to.",
)
@click.option(
    "--local",
    is_flag=True,
    help="Show the ZenML dashboard for the local server.",
    default=False,
    type=click.BOOL,
)
@click.option(
    "--ngrok-token",
    type=str,
    default=None,
    help="Specify an ngrok auth token to use for exposing the local ZenML "
    "server. Only used when `--local` is set. Primarily used for accessing the "
    "local dashboard in Colab.",
)
def show(local: bool = False, ngrok_token: Optional[str] = None) -> None:
    """Show the ZenML dashboard.

    Args:
        local: Whether to show the ZenML dashboard for the local server.
        ngrok_token: An ngrok auth token to use for exposing the ZenML dashboard
            on a public domain. Primarily used for accessing the local dashboard
            in Colab.
    """
    try:
        zenml.show(ngrok_token=ngrok_token)
    except RuntimeError as e:
        cli_utils.error(str(e))
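
For example, assuming the command is exposed under the `zenml server` group:

   zenml server show

   zenml server show --local --ngrok-token <NGROK_TOKEN>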

show_dashboard(local=False, ngrok_token=None)

Show the ZenML dashboard.

Parameters:

- local (bool, default False): Whether to show the dashboard for the local server or the one for the active server.
- ngrok_token (Optional[str], default None): An ngrok auth token to use for exposing the ZenML dashboard on a public domain. Primarily used for accessing the dashboard in Colab.

Raises:

- RuntimeError: If no server is connected.

Source code in src/zenml/zen_server/utils.py
def show_dashboard(
    local: bool = False,
    ngrok_token: Optional[str] = None,
) -> None:
    """Show the ZenML dashboard.

    Args:
        local: Whether to show the dashboard for the local server or the
            one for the active server.
        ngrok_token: An ngrok auth token to use for exposing the ZenML
            dashboard on a public domain. Primarily used for accessing the
            dashboard in Colab.

    Raises:
        RuntimeError: If no server is connected.
    """
    from zenml.utils.dashboard_utils import show_dashboard
    from zenml.utils.networking_utils import get_or_create_ngrok_tunnel

    url: Optional[str] = None
    if not local:
        gc = GlobalConfiguration()
        if gc.store_configuration.type == StoreType.REST:
            url = gc.store_configuration.url

    if not url:
        # Else, check for local servers
        server = get_local_server()
        if server and server.status and server.status.url:
            url = server.status.url

    if not url:
        raise RuntimeError(
            "ZenML is not connected to any server right now. Please use "
            "`zenml login` to connect to a server or spin up a new local server "
            "via `zenml login --local`."
        )

    if ngrok_token:
        parsed_url = urlparse(url)

        ngrok_url = get_or_create_ngrok_tunnel(
            ngrok_token=ngrok_token, port=parsed_url.port or 80
        )
        logger.debug(f"Tunneling dashboard from {url} to {ngrok_url}.")
        url = ngrok_url

    show_dashboard(url)

stack()

Stacks to define various environments.

Source code in src/zenml/cli/stack.py
@cli.group(
    cls=TagGroup,
    tag=CliCategories.MANAGEMENT_TOOLS,
)
def stack() -> None:
    """Stacks to define various environments."""

start_local_server(docker=False, ip_address=None, port=None, blocking=False, image=None, ngrok_token=None, restart=False)

Start the ZenML dashboard locally and connect the client to it.

Parameters:

- docker (bool, default False): Use a docker deployment instead of the local process.
- ip_address (Union[IPv4Address, IPv6Address, None], default None): The IP address to bind the server to.
- port (Optional[int], default None): The port to bind the server to.
- blocking (bool, default False): Block the CLI while the server is running.
- image (Optional[str], default None): A custom Docker image to use for the server, when the --docker flag is set.
- ngrok_token (Optional[str], default None): An ngrok auth token to use for exposing the ZenML dashboard on a public domain. Primarily used for accessing the dashboard in Colab.
- restart (bool, default False): Restart the local ZenML server if it is already running.
Source code in src/zenml/cli/login.py
def start_local_server(
    docker: bool = False,
    ip_address: Union[
        ipaddress.IPv4Address, ipaddress.IPv6Address, None
    ] = None,
    port: Optional[int] = None,
    blocking: bool = False,
    image: Optional[str] = None,
    ngrok_token: Optional[str] = None,
    restart: bool = False,
) -> None:
    """Start the ZenML dashboard locally and connect the client to it.

    Args:
        docker: Use a docker deployment instead of the local process.
        ip_address: The IP address to bind the server to.
        port: The port to bind the server to.
        blocking: Block the CLI while the server is running.
        image: A custom Docker image to use for the server, when the
            `--docker` flag is set.
        ngrok_token: An ngrok auth token to use for exposing the ZenML dashboard
            on a public domain. Primarily used for accessing the dashboard in
            Colab.
        restart: Restart the local ZenML server if it is already running.
    """
    from zenml.zen_server.deploy.deployer import LocalServerDeployer

    if docker:
        from zenml.utils.docker_utils import check_docker

        if not check_docker():
            cli_utils.error(
                "Docker does not seem to be installed on your system. Please "
                "install Docker to use the Docker ZenML server local "
                "deployment or use one of the other deployment options."
            )
        provider = ServerProviderType.DOCKER
    else:
        if sys.platform == "win32" and not blocking:
            cli_utils.error(
                "Running the ZenML server locally as a background process is "
                "not supported on Windows. Please use the `--blocking` flag "
                "to run the server in blocking mode, or run the server in "
                "a Docker container by setting `--docker` instead."
            )
        provider = ServerProviderType.DAEMON
    if cli_utils.requires_mac_env_var_warning():
        cli_utils.error(
            "The `OBJC_DISABLE_INITIALIZE_FORK_SAFETY` environment variable "
            "is recommended to run the ZenML server locally on a Mac. "
            "Please set it to `YES` and try again."
        )

    deployer = LocalServerDeployer()

    config_attrs: Dict[str, Any] = dict(
        provider=provider,
    )
    if not docker:
        config_attrs["blocking"] = blocking
    elif image:
        config_attrs["image"] = image
    if port is not None:
        config_attrs["port"] = port
    if ip_address is not None:
        config_attrs["ip_address"] = ip_address

    from zenml.zen_server.deploy.deployment import LocalServerDeploymentConfig

    server_config = LocalServerDeploymentConfig(**config_attrs)
    if blocking:
        deployer.remove_server()
        cli_utils.declare(
            "The local ZenML dashboard is about to deploy in a "
            "blocking process."
        )

    server = deployer.deploy_server(server_config, restart=restart)

    if not blocking:
        deployer.connect_to_server()

        if server.status and server.status.url:
            cli_utils.declare(
                f"The local ZenML dashboard is available at "
                f"'{server.status.url}'."
            )
            show_dashboard(
                local=True,
                ngrok_token=ngrok_token,
            )
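
This helper backs the `zenml login --local` flow, so a typical way to trigger it from the CLI would be, for example, a blocking local deployment:

   zenml login --local --blocking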

status()

Show details about the current configuration.

Source code in src/zenml/cli/server.py
@cli.command(
    "status", help="Show information about the current configuration."
)
def status() -> None:
    """Show details about the current configuration."""
    from zenml.login.credentials_store import get_credentials_store
    from zenml.login.pro.client import ZenMLProClient
    from zenml.login.pro.constants import ZENML_PRO_API_URL

    gc = GlobalConfiguration()
    client = Client()

    store_cfg = gc.store_configuration

    # Write about the current ZenML client
    cli_utils.declare("-----ZenML Client Status-----")
    if gc.uses_default_store():
        cli_utils.declare(
            f"Connected to the local ZenML database: '{store_cfg.url}'"
        )
    elif connected_to_local_server():
        cli_utils.declare(
            f"Connected to the local ZenML server: {store_cfg.url}"
        )
    elif re.match(r"^mysql://", store_cfg.url):
        cli_utils.declare(
            f"Connected directly to a SQL database: '{store_cfg.url}'"
        )
    else:
        credentials_store = get_credentials_store()
        server = credentials_store.get_credentials(store_cfg.url)
        if server:
            if server.type == ServerType.PRO:
                # If connected to a ZenML Pro server, refresh the server info
                pro_credentials = credentials_store.get_pro_credentials(
                    pro_api_url=server.pro_api_url or ZENML_PRO_API_URL,
                    allow_expired=False,
                )
                if pro_credentials:
                    pro_client = ZenMLProClient(pro_credentials.url)
                    pro_servers = pro_client.tenant.list(
                        url=store_cfg.url, member_only=True
                    )
                    if pro_servers:
                        credentials_store.update_server_info(
                            server_url=store_cfg.url,
                            server_info=pro_servers[0],
                        )

                cli_utils.declare(
                    f"Connected to a ZenML Pro server: `{server.server_name_hyperlink}`"
                    f" [{server.server_id_hyperlink}]"
                )

                cli_utils.declare(
                    f"  ZenML Pro Organization: {server.organization_hyperlink}"
                )
                if pro_credentials:
                    cli_utils.declare(
                        f"  ZenML Pro authentication: {pro_credentials.auth_status}"
                    )
            else:
                cli_utils.declare(
                    f"Connected to a remote ZenML server: `{server.dashboard_hyperlink}`"
                )

            cli_utils.declare(f"  Dashboard: {server.dashboard_hyperlink}")
            cli_utils.declare(f"  API: {server.api_hyperlink}")
            cli_utils.declare(f"  Server status: '{server.status}'")
            cli_utils.declare(f"  Server authentication: {server.auth_status}")

        else:
            cli_utils.declare(
                f"Connected to a remote ZenML server: [link={store_cfg.url}]"
                f"{store_cfg.url}[/link]"
            )

    try:
        client.zen_store.get_store_info()
    except Exception as e:
        cli_utils.warning(f"Error while initializing client: {e}")
    else:
        # Write about the active entities
        scope = "repository" if client.uses_local_configuration else "global"
        cli_utils.declare(f"  The active user is: '{client.active_user.name}'")
        cli_utils.declare(
            f"  The active stack is: '{client.active_stack_model.name}' ({scope})"
        )

    if client.root:
        cli_utils.declare(f"Active repository root: {client.root}")

    # Write about the configuration files
    cli_utils.declare(f"Using configuration from: '{gc.config_directory}'")
    cli_utils.declare(
        f"Local store files are located at: '{gc.local_stores_path}'"
    )

    cli_utils.declare("\n-----Local ZenML Server Status-----")
    local_server = get_local_server()
    if local_server:
        if local_server.status:
            if local_server.status.status == ServiceState.ACTIVE:
                cli_utils.declare(
                    f"The local {local_server.config.provider} server is "
                    f"running at: {local_server.status.url}"
                )
            else:
                cli_utils.declare(
                    f"The local {local_server.config.provider} server is not "
                    "available."
                )
                cli_utils.declare(
                    f"  Server state: {local_server.status.status}"
                )
                if local_server.status.status_message:
                    cli_utils.declare(
                        f"  Status message: {local_server.status.status_message}"
                    )
        else:
            cli_utils.declare(
                f"The local {local_server.config.provider} server is not "
                "running."
            )
    else:
        cli_utils.declare("The local server has not been started.")

tag()

Interact with tags.

Source code in src/zenml/cli/tag.py
@cli.group(cls=TagGroup, tag=CliCategories.MANAGEMENT_TOOLS)
def tag() -> None:
    """Interact with tags."""

title(text)

Echo a title formatted string on the CLI.

Parameters:

- text (str, required): Input text string.
Source code in src/zenml/cli/utils.py
def title(text: str) -> None:
    """Echo a title formatted string on the CLI.

    Args:
        text: Input text string.
    """
    console.print(text.upper(), style=zenml_style_defaults["title"])

track_decorator(event)

Decorator to track event.

If the decorated function takes in a AnalyticsTrackedModelMixin object as an argument or returns one, it will be called to track the event. The return value takes precedence over the argument when determining which object is called to track the event.

If the decorated function is a method of a class that inherits from AnalyticsTrackerMixin, the parent object will be used to intermediate tracking analytics.

Parameters:

- event (AnalyticsEvent, required): Event string to stamp with.

Returns:

- Callable[[F], F]: A decorator that applies the analytics tracking to a function.

Source code in src/zenml/analytics/utils.py
def track_decorator(event: AnalyticsEvent) -> Callable[[F], F]:
    """Decorator to track event.

    If the decorated function takes in a `AnalyticsTrackedModelMixin` object as
    an argument or returns one, it will be called to track the event. The return
    value takes precedence over the argument when determining which object is
    called to track the event.

    If the decorated function is a method of a class that inherits from
    `AnalyticsTrackerMixin`, the parent object will be used to intermediate
    tracking analytics.

    Args:
        event: Event string to stamp with.

    Returns:
        A decorator that applies the analytics tracking to a function.
    """

    def inner_decorator(func: F) -> F:
        """Inner decorator function.

        Args:
            func: Function to decorate.

        Returns:
            Decorated function.
        """

        @wraps(func)
        def inner_func(*args: Any, **kwargs: Any) -> Any:
            """Inner function.

            Args:
                *args: Arguments to be passed to the function.
                **kwargs: Keyword arguments to be passed to the function.

            Returns:
                Result of the function.
            """
            with track_handler(event=event) as handler:
                try:
                    for obj in list(args) + list(kwargs.values()):
                        if isinstance(obj, AnalyticsTrackedModelMixin):
                            handler.metadata = obj.get_analytics_metadata()
                            break
                except Exception as e:
                    logger.debug(f"Analytics tracking failure for {func}: {e}")

                result = func(*args, **kwargs)

                try:
                    if isinstance(result, AnalyticsTrackedModelMixin):
                        handler.metadata = result.get_analytics_metadata()
                except Exception as e:
                    logger.debug(f"Analytics tracking failure for {func}: {e}")

                return result

        return cast(F, inner_func)

    return inner_decorator

uninstall(integrations, force=False, uv=False)

Uninstalls the required packages for a given integration.

If no integration is specified, all required packages for all integrations are uninstalled using pip or uv.

Parameters:

- integrations (Tuple[str], required): The name of the integration to uninstall the requirements for.
- force (bool, default False): Force the uninstallation of the required packages.
- uv (bool, default False): Use uv for package uninstallation (experimental).
Source code in src/zenml/cli/integration.py
@integration.command(
    help="Uninstall the required packages for the integration of choice."
)
@click.argument("integrations", nargs=-1, required=False)
@click.option(
    "--yes",
    "-y",
    "force",
    is_flag=True,
    help="Force the uninstallation of the required packages. This will skip "
    "the confirmation step",
)
@click.option(
    "--uv",
    "uv",
    is_flag=True,
    help="Experimental: Use uv for package uninstallation.",
    default=False,
)
def uninstall(
    integrations: Tuple[str], force: bool = False, uv: bool = False
) -> None:
    """Uninstalls the required packages for a given integration.

    If no integration is specified all required packages for all integrations
    are uninstalled using pip or uv.

    Args:
        integrations: The name of the integration to uninstall the requirements
            for.
        force: Force the uninstallation of the required packages.
        uv: Use uv for package uninstallation (experimental).
    """
    from zenml.cli.utils import is_pip_installed, is_uv_installed
    from zenml.integrations.registry import integration_registry

    if uv and not is_uv_installed():
        error("Package `uv` is not installed. Please install it and retry.")

    if not uv and not is_pip_installed():
        error(
            "Pip is not installed. Please install pip or use the uv flag "
            "(--uv) for package installation."
        )

    if not integrations:
        # no integrations specified, use all registered integrations
        integrations = tuple(integration_registry.integrations.keys())

    requirements = []
    for integration_name in integrations:
        try:
            if integration_registry.is_installed(integration_name):
                requirements += (
                    integration_registry.select_uninstall_requirements(
                        integration_name
                    )
                )
            else:
                warning(
                    f"Requirements for integration '{integration_name}' "
                    f"already not installed."
                )
        except KeyError:
            warning(f"Unable to find integration '{integration_name}'.")

    if requirements and (
        force
        or confirmation(
            "Are you sure you want to uninstall the following "
            "packages from the current environment?\n"
            f"{requirements}"
        )
    ):
        for n in track(
            range(len(requirements)),
            description="Uninstalling integrations...",
        ):
            uninstall_package(requirements[n], use_uv=uv)
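
For example, to remove the requirements of a single integration without a confirmation prompt (the integration name is a placeholder):

   zenml integration uninstall aws -y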

uninstall_package(package, use_uv=False)

Uninstalls a PyPI package from the current environment with pip or uv.

Parameters:

- package (str, required): The package to uninstall.
- use_uv (bool, default False): Whether to use uv for package uninstallation.
Source code in src/zenml/cli/utils.py
def uninstall_package(package: str, use_uv: bool = False) -> None:
    """Uninstalls pypi package from the current environment with pip or uv.

    Args:
        package: The package to uninstall.
        use_uv: Whether to use uv for package uninstallation.
    """
    if use_uv and not is_installed_in_python_environment("uv"):
        # If uv is installed globally, don't run as a python module
        command = []
    else:
        command = [sys.executable, "-m"]

    command += (
        ["uv", "pip", "uninstall", "-q"]
        if use_uv
        else ["pip", "uninstall", "-y", "-qqq"]
    )
    command += [package]

    subprocess.check_call(command)

unlock_authorized_device(id)

Unlock an authorized device.

Parameters:

- id (str, required): The ID of the authorized device to unlock.
Source code in src/zenml/cli/authorized_device.py
@authorized_device.command("unlock")
@click.argument("id", type=str, required=True)
def unlock_authorized_device(id: str) -> None:
    """Unlock an authorized device.

    Args:
        id: The ID of the authorized device to unlock.
    """
    try:
        Client().update_authorized_device(
            id_or_prefix=id,
            locked=False,
        )
    except KeyError as e:
        cli_utils.error(str(e))
    else:
        cli_utils.declare(f"Locked authorized device `{id}`.")

up(docker=False, ip_address=None, port=None, blocking=False, image=None, ngrok_token=None)

Start the ZenML dashboard locally and connect the client to it.

Parameters:

- docker (bool, default False): Use a docker deployment instead of the local process.
- ip_address (Union[IPv4Address, IPv6Address, None], default None): The IP address to bind the server to.
- port (Optional[int], default None): The port to bind the server to.
- blocking (bool, default False): Block the CLI while the server is running.
- image (Optional[str], default None): A custom Docker image to use for the server, when the --docker flag is set.
- ngrok_token (Optional[str], default None): An ngrok auth token to use for exposing the ZenML dashboard on a public domain. Primarily used for accessing the dashboard in Colab.
Source code in src/zenml/cli/server.py
@cli.command(
    "up",
    help="""Start the ZenML dashboard locally.

DEPRECATED: Please use `zenml login --local` instead.             
""",
)
@click.option(
    "--docker",
    is_flag=True,
    help="Start the ZenML dashboard as a Docker container instead of a local "
    "process.",
    default=False,
    type=click.BOOL,
)
@click.option(
    "--port",
    type=int,
    default=None,
    help="Use a custom TCP port value for the ZenML dashboard.",
)
@click.option(
    "--ip-address",
    type=ipaddress.ip_address,
    default=None,
    help="Have the ZenML dashboard listen on an IP address different than the "
    "localhost.",
)
@click.option(
    "--blocking",
    is_flag=True,
    help="Run the ZenML dashboard in blocking mode. The CLI will not return "
    "until the dashboard is stopped.",
    default=False,
    type=click.BOOL,
)
@click.option(
    "--image",
    type=str,
    default=None,
    help="Use a custom Docker image for the ZenML server. Only used when "
    "`--docker` is set.",
)
@click.option(
    "--ngrok-token",
    type=str,
    default=None,
    help="Specify an ngrok auth token to use for exposing the ZenML server.",
)
def up(
    docker: bool = False,
    ip_address: Union[
        ipaddress.IPv4Address, ipaddress.IPv6Address, None
    ] = None,
    port: Optional[int] = None,
    blocking: bool = False,
    image: Optional[str] = None,
    ngrok_token: Optional[str] = None,
) -> None:
    """Start the ZenML dashboard locally and connect the client to it.

    Args:
        docker: Use a docker deployment instead of the local process.
        ip_address: The IP address to bind the server to.
        port: The port to bind the server to.
        blocking: Block the CLI while the server is running.
        image: A custom Docker image to use for the server, when the
            `--docker` flag is set.
        ngrok_token: An ngrok auth token to use for exposing the ZenML dashboard
            on a public domain. Primarily used for accessing the dashboard in
            Colab.
    """
    cli_utils.warning(
        "The `zenml up` command is deprecated and will be removed in a "
        "future release. Please use the `zenml login --local` command instead."
    )

    # Calling the `zenml login` command
    cli_utils.declare("Calling `zenml login --local`...")
    login.callback(  # type: ignore[misc]
        local=True,
        docker=docker,
        ip_address=ip_address,
        port=port,
        blocking=blocking,
        image=image,
        ngrok_token=ngrok_token,
    )
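
In practice, the deprecated call and its replacement map directly onto each other, for example:

   zenml up --blocking

   zenml login --local --blocking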

update_api_key(service_account_name_or_id, name_or_id, name=None, description=None, active=None)

Update an API key.

Parameters:

- service_account_name_or_id (str, required): The name or ID of the service account to which the API key belongs.
- name_or_id (str, required): The name or ID of the API key to update.
- name (Optional[str], default None): The new name of the API key.
- description (Optional[str], default None): The new description of the API key.
- active (Optional[bool], default None): Set an API key to active/inactive.
Source code in src/zenml/cli/service_accounts.py
@api_key.command("update", help="Update an API key.")
@click.argument("name_or_id", type=str, required=True)
@click.option("--name", type=str, required=False, help="The new name.")
@click.option(
    "--description", type=str, required=False, help="The new description."
)
@click.option(
    "--active",
    type=bool,
    required=False,
    help="Activate or deactivate an API key.",
)
@click.pass_obj
def update_api_key(
    service_account_name_or_id: str,
    name_or_id: str,
    name: Optional[str] = None,
    description: Optional[str] = None,
    active: Optional[bool] = None,
) -> None:
    """Update an API key.

    Args:
        service_account_name_or_id: The name or ID of the service account to
            which the API key belongs.
        name_or_id: The name or ID of the API key to update.
        name: The new name of the API key.
        description: The new description of the API key.
        active: Set an API key to active/inactive.
    """
    try:
        Client().update_api_key(
            service_account_name_id_or_prefix=service_account_name_or_id,
            name_id_or_prefix=name_or_id,
            name=name,
            description=description,
            active=active,
        )
    except (KeyError, EntityExistsError) as e:
        cli_utils.error(str(e))

    cli_utils.declare(f"Successfully updated API key `{name_or_id}`.")

update_artifact(artifact_name_or_id, name=None, tag=None, remove_tag=None)

Update an artifact by ID or name.

Usage example:

zenml artifact update <NAME> -n <NEW_NAME> -t <TAG1> -t <TAG2> -r <TAG_TO_REMOVE>

Parameters:

- artifact_name_or_id (str, required): Name or ID of the artifact to update.
- name (Optional[str], default None): New name of the artifact.
- tag (Optional[List[str]], default None): New tags of the artifact.
- remove_tag (Optional[List[str]], default None): Tags to remove from the artifact.
Source code in src/zenml/cli/artifact.py
@artifact.command("update", help="Update an artifact.")
@click.argument("artifact_name_or_id")
@click.option(
    "--name",
    "-n",
    type=str,
    help="New name of the artifact.",
)
@click.option(
    "--tag",
    "-t",
    type=str,
    multiple=True,
    help="Tags to add to the artifact.",
)
@click.option(
    "--remove-tag",
    "-r",
    type=str,
    multiple=True,
    help="Tags to remove from the artifact.",
)
def update_artifact(
    artifact_name_or_id: str,
    name: Optional[str] = None,
    tag: Optional[List[str]] = None,
    remove_tag: Optional[List[str]] = None,
) -> None:
    """Update an artifact by ID or name.

    Usage example:
    ```
    zenml artifact update <NAME> -n <NEW_NAME> -t <TAG1> -t <TAG2> -r <TAG_TO_REMOVE>
    ```

    Args:
        artifact_name_or_id: Name or ID of the artifact to update.
        name: New name of the artifact.
        tag: New tags of the artifact.
        remove_tag: Tags to remove from the artifact.
    """
    try:
        artifact = Client().update_artifact(
            name_id_or_prefix=artifact_name_or_id,
            new_name=name,
            add_tags=tag,
            remove_tags=remove_tag,
        )
    except (KeyError, ValueError) as e:
        cli_utils.error(str(e))
    else:
        cli_utils.declare(f"Artifact '{artifact.id}' updated.")

update_artifact_version(name_id_or_prefix, version=None, tag=None, remove_tag=None)

Update an artifact version by ID or artifact name.

Usage example:

zenml artifact version update <NAME> -v <VERSION> -t <TAG1> -t <TAG2> -r <TAG_TO_REMOVE>

Parameters:

- name_id_or_prefix (str, required): Either the ID of the artifact version or the name of the artifact.
- version (Optional[str], default None): The version of the artifact to get. Only used if name_id_or_prefix is the name of the artifact. If not specified, the latest version is returned.
- tag (Optional[List[str]], default None): Tags to add to the artifact version.
- remove_tag (Optional[List[str]], default None): Tags to remove from the artifact version.
Source code in src/zenml/cli/artifact.py
@version.command("update", help="Update an artifact version.")
@click.argument("name_id_or_prefix")
@click.option(
    "--version",
    "-v",
    type=str,
    help=(
        "The version of the artifact to get. Only used if "
        "`name_id_or_prefix` is the name of the artifact. If not specified, "
        "the latest version is returned."
    ),
)
@click.option(
    "--tag",
    "-t",
    type=str,
    multiple=True,
    help="Tags to add to the artifact version.",
)
@click.option(
    "--remove-tag",
    "-r",
    type=str,
    multiple=True,
    help="Tags to remove from the artifact version.",
)
def update_artifact_version(
    name_id_or_prefix: str,
    version: Optional[str] = None,
    tag: Optional[List[str]] = None,
    remove_tag: Optional[List[str]] = None,
) -> None:
    """Update an artifact version by ID or artifact name.

    Usage example:
    ```
    zenml artifact version update <NAME> -v <VERSION> -t <TAG1> -t <TAG2> -r <TAG_TO_REMOVE>
    ```

    Args:
        name_id_or_prefix: Either the ID of the artifact version or the name of
            the artifact.
        version: The version of the artifact to get. Only used if
            `name_id_or_prefix` is the name of the artifact. If not specified,
            the latest version is returned.
        tag: Tags to add to the artifact version.
        remove_tag: Tags to remove from the artifact version.
    """
    try:
        artifact_version = Client().update_artifact_version(
            name_id_or_prefix=name_id_or_prefix,
            version=version,
            add_tags=tag,
            remove_tags=remove_tag,
        )
    except (KeyError, ValueError) as e:
        cli_utils.error(str(e))
    else:
        cli_utils.declare(f"Artifact version '{artifact_version.id}' updated.")

update_code_repository(name_or_id, name, description, logo_url, args)

Update a code repository.

Parameters:

- name_or_id (str, required): Name or ID of the code repository to update.
- name (Optional[str], required): New name of the code repository.
- description (Optional[str], required): New description of the code repository.
- logo_url (Optional[str], required): New logo URL of the code repository.
- args (List[str], required): Code repository configurations.
Source code in src/zenml/cli/code_repository.py
@code_repository.command(
    "update",
    help="Update a code repository.",
    context_settings={"ignore_unknown_options": True},
)
@click.argument("name_or_id", type=str, required=True)
@click.option(
    "--name",
    "-n",
    type=str,
    required=False,
    help="The new code repository name.",
)
@click.option(
    "--description",
    "-d",
    type=str,
    required=False,
    help="The new code repository description.",
)
@click.option(
    "--logo-url",
    "-l",
    type=str,
    required=False,
    help="New URL of a logo (png, jpg or svg) for the code repository.",
)
@click.argument(
    "args",
    nargs=-1,
    type=click.UNPROCESSED,
)
def update_code_repository(
    name_or_id: str,
    name: Optional[str],
    description: Optional[str],
    logo_url: Optional[str],
    args: List[str],
) -> None:
    """Update a code repository.

    Args:
        name_or_id: Name or ID of the code repository to update.
        name: New name of the code repository.
        description: New description of the code repository.
        logo_url: New logo URL of the code repository.
        args: Code repository configurations.
    """
    parsed_name_or_id, parsed_args = cli_utils.parse_name_and_extra_arguments(
        list(args) + [name_or_id], expand_args=True, name_mandatory=True
    )
    assert parsed_name_or_id

    with console.status(
        f"Updating code repository '{parsed_name_or_id}'...\n"
    ):
        Client().update_code_repository(
            name_id_or_prefix=parsed_name_or_id,
            name=name,
            description=description,
            logo_url=logo_url,
            config=parsed_args,
        )
        cli_utils.declare(
            f"Successfully updated code repository `{parsed_name_or_id}`."
        )
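
For example, assuming the group is exposed as `zenml code-repository` (the repository name and values are placeholders):

   zenml code-repository update my_repo --description "Main training repo" --logo-url https://example.com/logo.png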

update_model(model_name_or_id, name, license, description, audience, use_cases, tradeoffs, ethical, limitations, tag, remove_tag, save_models_to_registry)

Update an existing model in the Model Control Plane.

Parameters:

- model_name_or_id (str, required): The name or ID of the model to update.
- name (Optional[str], required): The new name of the model.
- license (Optional[str], required): The license under which the model was created.
- description (Optional[str], required): The description of the model.
- audience (Optional[str], required): The target audience of the model.
- use_cases (Optional[str], required): The use cases of the model.
- tradeoffs (Optional[str], required): The tradeoffs of the model.
- ethical (Optional[str], required): The ethical implications of the model.
- limitations (Optional[str], required): The known limitations of the model.
- tag (Optional[List[str]], required): Tags to be added to the model.
- remove_tag (Optional[List[str]], required): Tags to be removed from the model.
- save_models_to_registry (Optional[bool], required): Whether to save the model to the registry.
Source code in src/zenml/cli/model.py
@model.command("update", help="Update an existing model.")
@click.argument("model_name_or_id")
@click.option(
    "--name",
    "-n",
    help="The name of the model.",
    type=str,
    required=False,
)
@click.option(
    "--license",
    "-l",
    help="The license under which the model is created.",
    type=str,
    required=False,
)
@click.option(
    "--description",
    "-d",
    help="The description of the model.",
    type=str,
    required=False,
)
@click.option(
    "--audience",
    "-a",
    help="The target audience for the model.",
    type=str,
    required=False,
)
@click.option(
    "--use-cases",
    "-u",
    help="The use cases of the model.",
    type=str,
    required=False,
)
@click.option(
    "--tradeoffs",
    help="The tradeoffs of the model.",
    type=str,
    required=False,
)
@click.option(
    "--ethical",
    "-e",
    help="The ethical implications of the model.",
    type=str,
    required=False,
)
@click.option(
    "--limitations",
    help="The known limitations of the model.",
    type=str,
    required=False,
)
@click.option(
    "--tag",
    "-t",
    help="Tags to be added to the model.",
    type=str,
    required=False,
    multiple=True,
)
@click.option(
    "--remove-tag",
    "-r",
    help="Tags to be removed from the model.",
    type=str,
    required=False,
    multiple=True,
)
@click.option(
    "--save-models-to-registry",
    "-s",
    help="Whether to automatically save model artifacts to the model registry.",
    type=click.BOOL,
    required=False,
    default=True,
)
def update_model(
    model_name_or_id: str,
    name: Optional[str],
    license: Optional[str],
    description: Optional[str],
    audience: Optional[str],
    use_cases: Optional[str],
    tradeoffs: Optional[str],
    ethical: Optional[str],
    limitations: Optional[str],
    tag: Optional[List[str]],
    remove_tag: Optional[List[str]],
    save_models_to_registry: Optional[bool],
) -> None:
    """Register a new model in the Model Control Plane.

    Args:
        model_name_or_id: The name of the model.
        name: The name of the model.
        license: The license under which the model was created.
        description: The description of the model.
        audience: The target audience of the model.
        use_cases: The use cases of the model.
        tradeoffs: The tradeoffs of the model.
        ethical: The ethical implications of the model.
        limitations: The known limitations of the model.
        tag: Tags to be added to the model.
        remove_tag: Tags to be removed from the model.
        save_models_to_registry: Whether to save the model to the
            registry.
    """
    model_id = Client().get_model(model_name_or_id=model_name_or_id).id
    update_dict = remove_none_values(
        dict(
            name=name,
            license=license,
            description=description,
            audience=audience,
            use_cases=use_cases,
            trade_offs=tradeoffs,
            ethics=ethical,
            limitations=limitations,
            add_tags=tag,
            remove_tags=remove_tag,
            save_models_to_registry=save_models_to_registry,
        )
    )
    model = Client().update_model(model_name_or_id=model_id, **update_dict)

    cli_utils.print_table([_model_to_print(model)])
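
For example (the model name and tags are placeholders):

   zenml model update my_model --tag production-ready --remove-tag wip --description "Churn prediction model"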

update_model_version(model_name_or_id, model_version_name_or_number_or_id, stage, name, description, tag, remove_tag, force=False)

Update an existing model version stage in the Model Control Plane.

Parameters:

- model_name_or_id (str, required): The ID or name of the model containing the version.
- model_version_name_or_number_or_id (str, required): The ID, number or name of the model version.
- stage (Optional[str], required): The stage of the model version to be set.
- name (Optional[str], required): The name of the model version.
- description (Optional[str], required): The description of the model version.
- tag (Optional[List[str]], required): Tags to be added to the model version.
- remove_tag (Optional[List[str]], required): Tags to be removed from the model version.
- force (bool, default False): Whether an existing model version in the target stage should be silently archived.
Source code in src/zenml/cli/model.py
@version.command("update", help="Update an existing model version stage.")
@click.argument("model_name_or_id")
@click.argument("model_version_name_or_number_or_id")
@click.option(
    "--stage",
    "-s",
    type=click.Choice(choices=ModelStages.values()),
    required=False,
    help="The stage of the model version.",
)
@click.option(
    "--name",
    "-n",
    type=str,
    required=False,
    help="The name of the model version.",
)
@click.option(
    "--description",
    "-d",
    type=str,
    required=False,
    help="The description of the model version.",
)
@click.option(
    "--tag",
    "-t",
    help="Tags to be added to the model.",
    type=str,
    required=False,
    multiple=True,
)
@click.option(
    "--remove-tag",
    "-r",
    help="Tags to be removed from the model.",
    type=str,
    required=False,
    multiple=True,
)
@click.option(
    "--force",
    "-f",
    is_flag=True,
    help="Don't ask for confirmation, if stage already occupied.",
)
def update_model_version(
    model_name_or_id: str,
    model_version_name_or_number_or_id: str,
    stage: Optional[str],
    name: Optional[str],
    description: Optional[str],
    tag: Optional[List[str]],
    remove_tag: Optional[List[str]],
    force: bool = False,
) -> None:
    """Update an existing model version stage in the Model Control Plane.

    Args:
        model_name_or_id: The ID or name of the model containing version.
        model_version_name_or_number_or_id: The ID, number or name of the model version.
        stage: The stage of the model version to be set.
        name: The name of the model version.
        description: The description of the model version.
        tag: Tags to be added to the model version.
        remove_tag: Tags to be removed from the model version.
        force: Whether existing model version in target stage should be silently archived.
    """
    model_version = Client().get_model_version(
        model_name_or_id=model_name_or_id,
        model_version_name_or_number_or_id=model_version_name_or_number_or_id,
    )
    try:
        model_version = Client().update_model_version(
            model_name_or_id=model_name_or_id,
            version_name_or_id=model_version.id,
            stage=stage,
            add_tags=tag,
            remove_tags=remove_tag,
            force=force,
            name=name,
            description=description,
        )
    except RuntimeError:
        if not force:
            cli_utils.print_table([_model_version_to_print(model_version)])

            confirmation = cli_utils.confirmation(
                "Are you sure you want to change the status of model "
                f"version '{model_version_name_or_number_or_id}' to "
                f"'{stage}'?\nThis stage is already taken by "
                "model version shown above and if you will proceed this "
                "model version will get into archived stage."
            )
            if not confirmation:
                cli_utils.declare("Model version stage update canceled.")
                return
            model_version = Client().update_model_version(
                model_name_or_id=model_version.model.id,
                version_name_or_id=model_version.id,
                stage=stage,
                add_tags=tag,
                remove_tags=remove_tag,
                force=True,
                description=description,
            )
    cli_utils.print_table([_model_version_to_print(model_version)])
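
For example, to promote a version to a new stage (the model name, version number and stage value are placeholders):

   zenml model version update my_model 3 --stage production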

update_secret(name_or_id, extra_args, new_scope=None, remove_keys=[], interactive=False, values='')

Update a secret for a given name or id.

Parameters:

- name_or_id (str, required): The name or id of the secret to update.
- new_scope (Optional[str], default None): The new scope of the secret.
- extra_args (List[str], required): The arguments to pass to the secret.
- interactive (bool, default False): Whether to use interactive mode to update the secret.
- remove_keys (List[str], default []): The keys to remove from the secret.
- values (str, default ''): Secret key-value pairs to be passed as JSON or YAML.
Source code in src/zenml/cli/secret.py
@secret.command(
    "update",
    context_settings={"ignore_unknown_options": True},
    help="Update a secret with a given name or id.",
)
@click.argument(
    "name_or_id",
    type=click.STRING,
)
@click.option(
    "--new-scope",
    "-s",
    type=click.Choice([scope.value for scope in list(SecretScope)]),
)
@click.option(
    "--interactive",
    "-i",
    "interactive",
    is_flag=True,
    help="Use interactive mode to update the secret values.",
    type=click.BOOL,
)
@click.option(
    "--values",
    "-v",
    "values",
    help="Pass one or more values using JSON or YAML format or reference a file by prefixing the filename with the @ "
    "special character.",
    required=False,
    type=str,
)
@click.option("--remove-keys", "-r", type=click.STRING, multiple=True)
@click.argument("extra_args", nargs=-1, type=click.UNPROCESSED)
def update_secret(
    name_or_id: str,
    extra_args: List[str],
    new_scope: Optional[str] = None,
    remove_keys: List[str] = [],
    interactive: bool = False,
    values: str = "",
) -> None:
    """Update a secret for a given name or id.

    Args:
        name_or_id: The name or id of the secret to update.
        new_scope: The new scope of the secret.
        extra_args: The arguments to pass to the secret.
        interactive: Whether to use interactive mode to update the secret.
        remove_keys: The keys to remove from the secret.
        values: Secret key-value pairs to be passed as JSON or YAML.
    """
    name, parsed_args = parse_name_and_extra_arguments(
        list(extra_args) + [name_or_id], expand_args=True
    )
    if values:
        inline_values = expand_argument_value_from_file(SECRET_VALUES, values)
        inline_values_dict = convert_structured_str_to_dict(inline_values)
        parsed_args.update(inline_values_dict)

    client = Client()

    with console.status(f"Checking secret `{name}`..."):
        try:
            secret = client.get_secret(
                name_id_or_prefix=name_or_id, allow_partial_name_match=False
            )
        except KeyError as e:
            error(
                f"Secret with name `{name}` does not exist or could not be "
                f"loaded: {str(e)}."
            )
        except NotImplementedError as e:
            error(f"Centralized secrets management is disabled: {str(e)}")

    declare(
        f"Updating secret with name '{secret.name}' and ID '{secret.id}' in "
        f"scope '{secret.scope.value}:"
    )

    if "name" in parsed_args:
        error("The word 'name' cannot be used as a key for a secret.")

    if interactive:
        if parsed_args:
            error(
                "Cannot pass secret fields as arguments when using "
                "interactive mode."
            )

        declare(
            "You will now have a chance to update each secret pair one by one."
        )
        secret_args_add_update = {}
        for k, _ in secret.secret_values.items():
            item_choice = (
                click.prompt(
                    text=f"Do you want to update key '{k}'? (enter to skip)",
                    type=click.Choice(["y", "n"]),
                    default="n",
                ),
            )
            if "n" in item_choice:
                continue
            elif "y" in item_choice:
                new_value = getpass.getpass(
                    f"Please enter the new secret value for the key '{k}'"
                )
                if new_value:
                    secret_args_add_update[k] = new_value

        # check if any additions to be made
        while True:
            addition_check = confirmation(
                "Do you want to add a new key:value pair?"
            )
            if not addition_check:
                break

            new_key = click.prompt(
                text="Please enter the new key name",
                type=click.STRING,
            )
            new_value = getpass.getpass(
                f"Please enter the new secret value for the key '{new_key}'"
            )
            secret_args_add_update[new_key] = new_value
    else:
        secret_args_add_update = parsed_args

    client.update_secret(
        name_id_or_prefix=secret.id,
        new_scope=SecretScope(new_scope) if new_scope else None,
        add_or_update_values=secret_args_add_update,
        remove_values=remove_keys,
    )
    declare(f"Secret '{secret.name}' successfully updated.")

update_service_account(service_account_name_or_id, updated_name=None, description=None, active=None)

Update an existing service account.

Parameters:

- service_account_name_or_id (str, required): The name or ID of the service account to update.
- updated_name (Optional[str], default None): The new name of the service account.
- description (Optional[str], default None): The new description of the service account.
- active (Optional[bool], default None): Activate or deactivate a service account.
Source code in src/zenml/cli/service_accounts.py
@service_account.command(
    "update",
    help="Update a service account.",
)
@click.argument("service_account_name_or_id", type=str, required=True)
@click.option(
    "--name",
    "-n",
    "updated_name",
    type=str,
    required=False,
    help="New service account name.",
)
@click.option(
    "--description",
    "-d",
    type=str,
    required=False,
    help="The API key description.",
)
@click.option(
    "--active",
    type=bool,
    required=False,
    help="Activate or deactivate a service account.",
)
def update_service_account(
    service_account_name_or_id: str,
    updated_name: Optional[str] = None,
    description: Optional[str] = None,
    active: Optional[bool] = None,
) -> None:
    """Update an existing service account.

    Args:
        service_account_name_or_id: The name or ID of the service account to
            update.
        updated_name: The new name of the service account.
        description: The new API key description.
        active: Activate or deactivate a service account.
    """
    try:
        Client().update_service_account(
            name_id_or_prefix=service_account_name_or_id,
            updated_name=updated_name,
            description=description,
            active=active,
        )
    except (KeyError, EntityExistsError) as err:
        cli_utils.error(str(err))

update_service_connector(args, name_id_or_prefix=None, name=None, description=None, connector_type=None, resource_type=None, resource_id=None, auth_method=None, expires_at=None, expires_skew_tolerance=None, expiration_seconds=None, no_verify=False, labels=None, interactive=False, show_secrets=False, remove_attrs=None)

Updates a service connector.

Parameters:

  • args (List[str], required): Configuration arguments for the service connector.
  • name_id_or_prefix (Optional[str], default: None): The name or ID of the service connector to update.
  • name (Optional[str], default: None): New connector name.
  • description (Optional[str], default: None): Short description for the service connector.
  • connector_type (Optional[str], default: None): The service connector type.
  • resource_type (Optional[str], default: None): The type of resource to connect to.
  • resource_id (Optional[str], default: None): The ID of the resource to connect to.
  • auth_method (Optional[str], default: None): The authentication method to use.
  • expires_at (Optional[datetime], default: None): The time at which the credentials configured for this connector will expire.
  • expires_skew_tolerance (Optional[int], default: None): The tolerance, in seconds, allowed when determining when the credentials configured for or generated by this connector will expire.
  • expiration_seconds (Optional[int], default: None): The duration, in seconds, that the temporary credentials generated by this connector should remain valid.
  • no_verify (bool, default: False): Do not verify the service connector before updating.
  • labels (Optional[List[str]], default: None): Labels to be associated with the service connector.
  • interactive (bool, default: False): Update the service connector interactively.
  • show_secrets (bool, default: False): Show security sensitive configuration attributes in the terminal.
  • remove_attrs (Optional[List[str]], default: None): Configuration attributes to be removed from the configuration.
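
The CLI command wraps Client.update_service_connector. A minimal non-interactive sketch (the connector name and description are illustrative; verify=True re-validates the connector before the change is applied):

    from zenml.client import Client

    client = Client()

    # Update the description of an existing connector and re-verify it.
    connector_model, connector_resources = client.update_service_connector(
        name_id_or_prefix="aws-auto-multi",
        description="Multi-purpose AWS connector",
        verify=True,   # validate the updated configuration
        update=True,   # apply the update (the CLI uses update=False for a dry-run check)
    )
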
Source code in src/zenml/cli/service_connectors.py
@service_connector.command(
    "update",
    context_settings={"ignore_unknown_options": True},
    help="""Update and verify an existing service connector.

This command can be used to update an existing ZenML service connector and
to optionally verify that the updated service connector configuration and
credentials are still valid and can be used to access the specified resource(s).

If the `-i|--interactive` flag is set, it will prompt the user for all the
information required to update the service connector configuration:

    $ zenml service-connector update -i <connector-name-or-id>

For consistency reasons, the connector type cannot be changed. If you need to
change the connector type, you need to create a new service connector.
You also cannot change the authentication method, resource type and resource ID
of a service connector that is already actively being used by one or more stack
components.

Secret configuration attributes are not shown by default. Use the
`-x|--show-secrets` flag to show them:

    $ zenml service-connector update -ix <connector-name-or-id>

Non-interactive examples:

- update the DockerHub repository that a Docker service connector is configured
to provide access to:

    $ zenml service-connector update dockerhub-hyppo --resource-id lylemcnew

- update the AWS credentials that an AWS service connector is configured to
use from an STS token to an AWS secret key. This involves updating some config
values and deleting others:

    $ zenml service-connector update aws-auto-multi \\
--aws_access_key_id=<aws-key-id> \\
--aws_secret_access_key=<aws-secret-key>  \\
--remove-attr aws-sts-token

- update the foo label to a new value and delete the baz label from a connector:

    $ zenml service-connector update gcp-eu-multi \\                          
--label foo=bar --label baz

All service connector updates are validated before being applied. To skip
validation, pass the `--no-verify` flag.
""",
)
@click.argument(
    "name_id_or_prefix",
    type=str,
    required=True,
)
@click.option(
    "--name",
    "name",
    help="New connector name.",
    required=False,
    type=str,
)
@click.option(
    "--description",
    "description",
    help="Short description for the connector instance.",
    required=False,
    type=str,
)
@click.option(
    "--resource-type",
    "-r",
    "resource_type",
    help="The type of resource to connect to.",
    required=False,
    type=str,
)
@click.option(
    "--resource-id",
    "-ri",
    "resource_id",
    help="The ID of the resource to connect to.",
    required=False,
    type=str,
)
@click.option(
    "--auth-method",
    "-a",
    "auth_method",
    help="The authentication method to use.",
    required=False,
    type=str,
)
@click.option(
    "--expires-at",
    "expires_at",
    help="The time at which the credentials configured for this connector "
    "will expire.",
    required=False,
    type=click.DateTime(),
)
@click.option(
    "--expires-skew-tolerance",
    "expires_skew_tolerance",
    help="The tolerance, in seconds, allowed when determining when the "
    "credentials configured for or generated by this connector will expire.",
    required=False,
    type=int,
)
@click.option(
    "--expiration-seconds",
    "expiration_seconds",
    help="The duration, in seconds, that the temporary credentials "
    "generated by this connector should remain valid.",
    required=False,
    type=int,
)
@click.option(
    "--label",
    "-l",
    "labels",
    help="Labels to be associated with the service connector. Takes the form "
    "`-l key1=value1` or `-l key1` and can be used multiple times.",
    multiple=True,
)
@click.option(
    "--no-verify",
    "no_verify",
    is_flag=True,
    default=False,
    help="Do not verify the service connector before registering.",
    type=click.BOOL,
)
@click.option(
    "--interactive",
    "-i",
    "interactive",
    is_flag=True,
    default=False,
    help="Register a new service connector interactively.",
    type=click.BOOL,
)
@click.option(
    "--show-secrets",
    "-x",
    "show_secrets",
    is_flag=True,
    default=False,
    help="Show security sensitive configuration attributes in the terminal.",
    type=click.BOOL,
)
@click.option(
    "--remove-attr",
    "-r",
    "remove_attrs",
    help="Configuration attributes to be removed from the configuration. Takes "
    "the form `-r attr-name` and can be used multiple times.",
    multiple=True,
)
@click.argument("args", nargs=-1, type=click.UNPROCESSED)
def update_service_connector(
    args: List[str],
    name_id_or_prefix: Optional[str] = None,
    name: Optional[str] = None,
    description: Optional[str] = None,
    connector_type: Optional[str] = None,
    resource_type: Optional[str] = None,
    resource_id: Optional[str] = None,
    auth_method: Optional[str] = None,
    expires_at: Optional[datetime] = None,
    expires_skew_tolerance: Optional[int] = None,
    expiration_seconds: Optional[int] = None,
    no_verify: bool = False,
    labels: Optional[List[str]] = None,
    interactive: bool = False,
    show_secrets: bool = False,
    remove_attrs: Optional[List[str]] = None,
) -> None:
    """Updates a service connector.

    Args:
        args: Configuration arguments for the service connector.
        name_id_or_prefix: The name or ID of the service connector to
            update.
        name: New connector name.
        description: Short description for the service connector.
        connector_type: The service connector type.
        resource_type: The type of resource to connect to.
        resource_id: The ID of the resource to connect to.
        auth_method: The authentication method to use.
        expires_at: The time at which the credentials configured for this
            connector will expire.
        expires_skew_tolerance: The tolerance, in seconds, allowed when
            determining when the credentials configured for or generated by
            this connector will expire.
        expiration_seconds: The duration, in seconds, that the temporary
            credentials generated by this connector should remain valid.
        no_verify: Do not verify the service connector before
            updating.
        labels: Labels to be associated with the service connector.
        interactive: Update the service connector interactively.
        show_secrets: Show security sensitive configuration attributes in
            the terminal.
        remove_attrs: Configuration attributes to be removed from the
            configuration.
    """
    client = Client()

    # Parse the given args
    name_id_or_prefix, parsed_args = cli_utils.parse_name_and_extra_arguments(
        list(args) + [name_id_or_prefix or ""],
        expand_args=True,
        name_mandatory=True,
    )
    assert name_id_or_prefix is not None

    # Parse the given labels
    parsed_labels = cli_utils.get_parsed_labels(labels, allow_label_only=True)

    # Start by fetching the existing connector configuration
    try:
        connector = client.get_service_connector(
            name_id_or_prefix,
            allow_name_prefix_match=False,
            load_secrets=True,
        )
    except KeyError as e:
        cli_utils.error(str(e))

    if interactive:
        # Fetch the connector type specification if not already embedded
        # into the connector model
        if isinstance(connector.connector_type, str):
            try:
                connector_type_spec = client.get_service_connector_type(
                    connector.connector_type
                )
            except KeyError as e:
                cli_utils.error(
                    "Could not find the connector type "
                    f"'{connector.connector_type}' associated with the "
                    f"'{connector.name}' connector: {e}."
                )
        else:
            connector_type_spec = connector.connector_type

        # Ask for a new name, if needed
        name = prompt_connector_name(connector.name, connector=connector.id)

        # Ask for a new description, if needed
        description = click.prompt(
            "Updated service connector description",
            type=str,
            default=connector.description,
        )

        # Ask for a new authentication method
        auth_method = click.prompt(
            "If you would like to update the authentication method, please "
            "select a new one from the following options, otherwise press "
            "enter to keep the existing one. Please note that changing "
            "the authentication method may invalidate the existing "
            "configuration and credentials and may require you to reconfigure "
            "the connector from scratch",
            type=click.Choice(
                list(connector_type_spec.auth_method_dict.keys()),
            ),
            default=connector.auth_method,
        )

        assert auth_method is not None
        auth_method_spec = connector_type_spec.auth_method_dict[auth_method]

        # If the authentication method has changed, we need to reconfigure
        # the connector from scratch; otherwise, we ask the user if they
        # want to update the existing configuration
        if auth_method != connector.auth_method:
            confirm = True
        else:
            confirm = click.confirm(
                "Would you like to update the authentication configuration?",
                default=False,
            )

        existing_config = connector.full_configuration

        if confirm:
            # Here we reconfigure the connector or update the existing
            # configuration. The existing configuration is used as much
            # as possible to avoid the user having to re-enter the same
            # values from scratch.

            cli_utils.declare(
                f"Please update or verify the existing configuration for the "
                f"'{auth_method_spec.name}' authentication method."
            )

            # Prompt for the configuration of the selected authentication method
            # field by field
            config_schema = auth_method_spec.config_schema or {}
            config_dict = cli_utils.prompt_configuration(
                config_schema=config_schema,
                show_secrets=show_secrets,
                existing_config=existing_config,
            )

        else:
            config_dict = existing_config

        # Next, we address resource type updates. If the connector is
        # configured to access a single resource type, we don't need to
        # ask the user for a new resource type. We only look at the
        # resource types that support the selected authentication method.
        available_resource_types = [
            r.resource_type
            for r in connector_type_spec.resource_types
            if auth_method in r.auth_methods
        ]
        if len(available_resource_types) == 1:
            resource_type = available_resource_types[0]
        else:
            if connector.is_multi_type:
                resource_type = None
                title = (
                    "The connector is configured to access any of the supported "
                    f"resource types ({', '.join(available_resource_types)}). "
                    "Would you like to restrict it to a single resource type?"
                )
            else:
                resource_type = connector.resource_types[0]
                title = (
                    "The connector is configured to access the "
                    f"{resource_type} resource type. "
                    "Would you like to change that?"
                )
            confirm = click.confirm(title, default=False)

            if confirm:
                # Prompt for a new resource type, if needed
                resource_type = prompt_resource_type(
                    available_resource_types=available_resource_types
                )

        # Prompt for a new expiration time if the auth method supports it
        expiration_seconds = None
        if auth_method_spec.supports_temporary_credentials():
            expiration_seconds = prompt_expiration_time(
                min=auth_method_spec.min_expiration_seconds,
                max=auth_method_spec.max_expiration_seconds,
                default=connector.expiration_seconds
                or auth_method_spec.default_expiration_seconds,
            )

        # Prompt for the time when the credentials will expire
        expires_at = prompt_expires_at(expires_at or connector.expires_at)

        try:
            # Validate the connector configuration and fetch all available
            # resources that are accessible with the provided configuration
            # in the process
            with console.status("Validating service connector update...\n"):
                (
                    connector_model,
                    connector_resources,
                ) = client.update_service_connector(
                    name_id_or_prefix=connector.id,
                    name=name,
                    description=description,
                    auth_method=auth_method,
                    # Use empty string to indicate that the resource type
                    # should be removed in the update if not set here
                    resource_type=resource_type or "",
                    configuration=config_dict,
                    expires_at=expires_at,
                    # Use zero value to indicate that the expiration time
                    # should be removed in the update if not set here
                    expiration_seconds=expiration_seconds or 0,
                    expires_skew_tolerance=expires_skew_tolerance,
                    verify=True,
                    update=False,
                )
            assert connector_model is not None
            assert connector_resources is not None
        except (
            KeyError,
            ValueError,
            IllegalOperationError,
            NotImplementedError,
            AuthorizationException,
        ) as e:
            cli_utils.error(f"Failed to verify service connector update: {e}")

        if resource_type:
            # Finally, for connectors that are configured with a particular
            # resource type, prompt the user to select one of the available
            # resources that can be accessed with the connector. We don't need
            # to do this for resource types that don't support instances.
            resource_type_spec = connector_type_spec.resource_type_dict[
                resource_type
            ]
            if resource_type_spec.supports_instances:
                assert len(connector_resources.resources) == 1
                resource_ids = connector_resources.resources[0].resource_ids
                assert resource_ids is not None
                resource_id = prompt_resource_id(
                    resource_name=resource_type_spec.name,
                    resource_ids=resource_ids,
                )
            else:
                resource_id = None
        else:
            resource_id = None

        # Prepare the rest of the variables to fall through to the
        # non-interactive configuration case
        no_verify = False

    else:
        # Non-interactive configuration

        # Apply the configuration from the command line arguments
        config_dict = connector.full_configuration
        config_dict.update(parsed_args)

        if not resource_type and not connector.is_multi_type:
            resource_type = connector.resource_types[0]

        resource_id = resource_id or connector.resource_id
        expiration_seconds = expiration_seconds or connector.expiration_seconds

        # Remove attributes that the user has indicated should be removed
        if remove_attrs:
            for remove_attr in remove_attrs:
                config_dict.pop(remove_attr, None)
            if "resource_id" in remove_attrs or "resource-id" in remove_attrs:
                resource_id = None
            if (
                "resource_type" in remove_attrs
                or "resource-type" in remove_attrs
            ):
                resource_type = None
            if (
                "expiration_seconds" in remove_attrs
                or "expiration-seconds" in remove_attrs
            ):
                expiration_seconds = None

    with console.status(
        f"Updating service connector {name_id_or_prefix}...\n"
    ):
        try:
            (
                connector_model,
                connector_resources,
            ) = client.update_service_connector(
                name_id_or_prefix=connector.id,
                name=name,
                auth_method=auth_method,
                # Use empty string to indicate that the resource type
                # should be removed in the update if not set here
                resource_type=resource_type or "",
                configuration=config_dict,
                # Use empty string to indicate that the resource ID
                # should be removed in the update if not set here
                resource_id=resource_id or "",
                description=description,
                expires_at=expires_at,
                # Use empty string to indicate that the expiration time
                # should be removed in the update if not set here
                expiration_seconds=expiration_seconds or 0,
                expires_skew_tolerance=expires_skew_tolerance,
                labels=parsed_labels,
                verify=not no_verify,
                update=True,
            )
        except (
            KeyError,
            ValueError,
            IllegalOperationError,
            NotImplementedError,
            AuthorizationException,
        ) as e:
            cli_utils.error(f"Failed to update service connector: {e}")

    if connector_resources is not None:
        cli_utils.declare(
            f"Successfully updated service connector `{connector.name}`. It "
            "can now be used to access the following resources:"
        )

        cli_utils.print_service_connector_resource_table(
            [connector_resources],
            show_resources_only=True,
        )

    else:
        cli_utils.declare(
            f"Successfully updated service connector `{connector.name}` "
        )

update_stack(stack_name_or_id=None, artifact_store=None, orchestrator=None, container_registry=None, step_operator=None, feature_store=None, model_deployer=None, experiment_tracker=None, alerter=None, annotator=None, data_validator=None, image_builder=None, model_registry=None)

Update a stack.

Parameters:

  • stack_name_or_id (Optional[str], default: None): Name or id of the stack to update.
  • artifact_store (Optional[str], default: None): Name of the new artifact store for this stack.
  • orchestrator (Optional[str], default: None): Name of the new orchestrator for this stack.
  • container_registry (Optional[str], default: None): Name of the new container registry for this stack.
  • step_operator (Optional[str], default: None): Name of the new step operator for this stack.
  • feature_store (Optional[str], default: None): Name of the new feature store for this stack.
  • model_deployer (Optional[str], default: None): Name of the new model deployer for this stack.
  • experiment_tracker (Optional[str], default: None): Name of the new experiment tracker for this stack.
  • alerter (Optional[str], default: None): Name of the new alerter for this stack.
  • annotator (Optional[str], default: None): Name of the new annotator for this stack.
  • data_validator (Optional[str], default: None): Name of the new data validator for this stack.
  • image_builder (Optional[str], default: None): Name of the new image builder for this stack.
  • model_registry (Optional[str], default: None): Name of the new model registry for this stack.
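
The command maps each option to a component update on the client. A minimal sketch that swaps the orchestrator of a stack (stack and component names are illustrative; StackComponentType is assumed to be importable from zenml.enums):

    from zenml.client import Client
    from zenml.enums import StackComponentType

    # Replace the orchestrator of an existing stack.
    Client().update_stack(
        name_id_or_prefix="my-stack",
        component_updates={
            StackComponentType.ORCHESTRATOR: ["kubeflow-orchestrator"],
        },
    )
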
Source code in src/zenml/cli/stack.py
@stack.command(
    "update",
    context_settings=dict(ignore_unknown_options=True),
    help="Update a stack with new components.",
)
@click.argument("stack_name_or_id", type=str, required=False)
@click.option(
    "-a",
    "--artifact-store",
    "artifact_store",
    help="Name of the new artifact store for this stack.",
    type=str,
    required=False,
)
@click.option(
    "-o",
    "--orchestrator",
    "orchestrator",
    help="Name of the new orchestrator for this stack.",
    type=str,
    required=False,
)
@click.option(
    "-c",
    "--container_registry",
    "container_registry",
    help="Name of the new container registry for this stack.",
    type=str,
    required=False,
)
@click.option(
    "-r",
    "--model_registry",
    "model_registry",
    help="Name of the model registry for this stack.",
    type=str,
    required=False,
)
@click.option(
    "-s",
    "--step_operator",
    "step_operator",
    help="Name of the new step operator for this stack.",
    type=str,
    required=False,
)
@click.option(
    "-f",
    "--feature_store",
    "feature_store",
    help="Name of the new feature store for this stack.",
    type=str,
    required=False,
)
@click.option(
    "-d",
    "--model_deployer",
    "model_deployer",
    help="Name of the new model deployer for this stack.",
    type=str,
    required=False,
)
@click.option(
    "-e",
    "--experiment_tracker",
    "experiment_tracker",
    help="Name of the new experiment tracker for this stack.",
    type=str,
    required=False,
)
@click.option(
    "-al",
    "--alerter",
    "alerter",
    help="Name of the new alerter for this stack.",
    type=str,
    required=False,
)
@click.option(
    "-an",
    "--annotator",
    "annotator",
    help="Name of the new annotator for this stack.",
    type=str,
    required=False,
)
@click.option(
    "-dv",
    "--data_validator",
    "data_validator",
    help="Name of the data validator for this stack.",
    type=str,
    required=False,
)
@click.option(
    "-i",
    "--image_builder",
    "image_builder",
    help="Name of the image builder for this stack.",
    type=str,
    required=False,
)
def update_stack(
    stack_name_or_id: Optional[str] = None,
    artifact_store: Optional[str] = None,
    orchestrator: Optional[str] = None,
    container_registry: Optional[str] = None,
    step_operator: Optional[str] = None,
    feature_store: Optional[str] = None,
    model_deployer: Optional[str] = None,
    experiment_tracker: Optional[str] = None,
    alerter: Optional[str] = None,
    annotator: Optional[str] = None,
    data_validator: Optional[str] = None,
    image_builder: Optional[str] = None,
    model_registry: Optional[str] = None,
) -> None:
    """Update a stack.

    Args:
        stack_name_or_id: Name or id of the stack to update.
        artifact_store: Name of the new artifact store for this stack.
        orchestrator: Name of the new orchestrator for this stack.
        container_registry: Name of the new container registry for this stack.
        step_operator: Name of the new step operator for this stack.
        feature_store: Name of the new feature store for this stack.
        model_deployer: Name of the new model deployer for this stack.
        experiment_tracker: Name of the new experiment tracker for this
            stack.
        alerter: Name of the new alerter for this stack.
        annotator: Name of the new annotator for this stack.
        data_validator: Name of the new data validator for this stack.
        image_builder: Name of the new image builder for this stack.
        model_registry: Name of the new model registry for this stack.
    """
    client = Client()

    with console.status("Updating stack...\n"):
        updates: Dict[StackComponentType, List[Union[str, UUID]]] = dict()
        if artifact_store:
            updates[StackComponentType.ARTIFACT_STORE] = [artifact_store]
        if alerter:
            updates[StackComponentType.ALERTER] = [alerter]
        if annotator:
            updates[StackComponentType.ANNOTATOR] = [annotator]
        if container_registry:
            updates[StackComponentType.CONTAINER_REGISTRY] = [
                container_registry
            ]
        if data_validator:
            updates[StackComponentType.DATA_VALIDATOR] = [data_validator]
        if experiment_tracker:
            updates[StackComponentType.EXPERIMENT_TRACKER] = [
                experiment_tracker
            ]
        if feature_store:
            updates[StackComponentType.FEATURE_STORE] = [feature_store]
        if model_registry:
            updates[StackComponentType.MODEL_REGISTRY] = [model_registry]
        if image_builder:
            updates[StackComponentType.IMAGE_BUILDER] = [image_builder]
        if model_deployer:
            updates[StackComponentType.MODEL_DEPLOYER] = [model_deployer]
        if orchestrator:
            updates[StackComponentType.ORCHESTRATOR] = [orchestrator]
        if step_operator:
            updates[StackComponentType.STEP_OPERATOR] = [step_operator]

        try:
            updated_stack = client.update_stack(
                name_id_or_prefix=stack_name_or_id,
                component_updates=updates,
            )

        except (KeyError, IllegalOperationError) as err:
            cli_utils.error(str(err))

        cli_utils.declare(
            f"Stack `{updated_stack.name}` successfully updated!"
        )
    print_model_url(get_stack_url(updated_stack))

update_tag(tag_name_or_id, name, color)

Update an existing tag.

Parameters:

  • tag_name_or_id (Union[str, UUID], required): The name or ID of the tag.
  • name (Optional[str], required): The name of the tag.
  • color (Optional[str], required): The color variant for UI.
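
A minimal programmatic sketch of the same update (the tag name and color are illustrative; TagUpdate is assumed to be importable from zenml.models):

    from zenml.client import Client
    from zenml.models import TagUpdate

    # Rename a tag and change its color variant.
    Client().update_tag(
        tag_name_or_id="nightly",
        tag_update_model=TagUpdate(name="nightly-run", color="green"),
    )
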
Source code in src/zenml/cli/tag.py
@tag.command("update", help="Update an existing tag.")
@click.argument("tag_name_or_id")
@click.option(
    "--name",
    "-n",
    help="The name of the tag.",
    type=str,
    required=False,
)
@click.option(
    "--color",
    "-c",
    help="The color variant for UI.",
    type=click.Choice(choices=ColorVariants.values()),
    required=False,
)
def update_tag(
    tag_name_or_id: Union[str, UUID], name: Optional[str], color: Optional[str]
) -> None:
    """Register a new model in the Model Control Plane.

    Args:
        tag_name_or_id: The name or ID of the tag.
        name: The name of the tag.
        color: The color variant for UI.
    """
    update_dict = remove_none_values(dict(name=name, color=color))
    if not update_dict:
        cli_utils.declare("You need to specify --name or --color for update.")
        return

    tag = Client().update_tag(
        tag_name_or_id=tag_name_or_id,
        tag_update_model=TagUpdate(**update_dict),
    )

    cli_utils.print_pydantic_models(
        [tag],
        exclude_columns=["created"],
    )

update_user(user_name_or_id, updated_name=None, updated_full_name=None, updated_email=None, make_admin=None, make_user=None, active=None)

Update an existing user.

Parameters:

  • user_name_or_id (str, required): The name or ID of the user to update.
  • updated_name (Optional[str], default: None): The new name of the user.
  • updated_full_name (Optional[str], default: None): The new full name of the user.
  • updated_email (Optional[str], default: None): The new email address of the user.
  • make_admin (Optional[bool], default: None): Whether the user should be an admin.
  • make_user (Optional[bool], default: None): Whether the user should be a regular user.
  • active (Optional[bool], default: None): Use to activate or deactivate a user account.
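
A minimal programmatic sketch of the same update (the user name and email are illustrative):

    from zenml.client import Client

    # Rename a user and update their email address.
    Client().update_user(
        name_id_or_prefix="dana",
        updated_name="dana.j",
        updated_email="dana@example.com",
    )
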
Source code in src/zenml/cli/user_management.py
@user.command(
    "update",
    help="Update user information through the cli.",
)
@click.argument("user_name_or_id", type=str, required=True)
@click.option(
    "--name",
    "-n",
    "updated_name",
    type=str,
    required=False,
    help="New user name.",
)
@click.option(
    "--full_name",
    "-f",
    "updated_full_name",
    type=str,
    required=False,
    help="New full name. If this contains an empty space make sure to surround "
    "the name with quotes '<Full Name>'.",
)
@click.option(
    "--email",
    "-e",
    "updated_email",
    type=str,
    required=False,
    help="New user email.",
)
@click.option(
    "--admin",
    "-a",
    "make_admin",
    is_flag=True,
    required=False,
    default=None,
    help="Whether the user should be an admin.",
)
@click.option(
    "--user",
    "-u",
    "make_user",
    is_flag=True,
    required=False,
    default=None,
    help="Whether the user should be a regular user.",
)
@click.option(
    "--active",
    "active",
    type=bool,
    required=False,
    default=None,
    help="Use to activate or deactivate a user account.",
)
def update_user(
    user_name_or_id: str,
    updated_name: Optional[str] = None,
    updated_full_name: Optional[str] = None,
    updated_email: Optional[str] = None,
    make_admin: Optional[bool] = None,
    make_user: Optional[bool] = None,
    active: Optional[bool] = None,
) -> None:
    """Update an existing user.

    Args:
        user_name_or_id: The name or ID of the user to update.
        updated_name: The new name of the user.
        updated_full_name: The new full name of the user.
        updated_email: The new email address of the user.
        make_admin: Whether the user should be an admin.
        make_user: Whether the user should be a regular user.
        active: Use to activate or deactivate a user account.
    """
    if make_admin is not None and make_user is not None:
        cli_utils.error(
            "Cannot set both --admin and --user flags as these are mutually exclusive."
        )
    try:
        current_user = Client().get_user(
            user_name_or_id, allow_name_prefix_match=False
        )
        if current_user.is_admin and make_user:
            confirmation = cli_utils.confirmation(
                f"Currently user `{current_user.name}` is an admin. Are you "
                "sure you want to make them a regular user?"
            )
            if not confirmation:
                cli_utils.declare("User update canceled.")
                return

        updated_is_admin = None
        if make_admin is True:
            updated_is_admin = True
        elif make_user is True:
            updated_is_admin = False
        Client().update_user(
            name_id_or_prefix=user_name_or_id,
            updated_name=updated_name,
            updated_full_name=updated_full_name,
            updated_email=updated_email,
            updated_is_admin=updated_is_admin,
            active=active,
        )
    except (KeyError, IllegalOperationError) as err:
        cli_utils.error(str(err))

upgrade(integrations, force=False, uv=False)

Upgrade the required packages for a given integration.

If no integration is specified, the required packages for all installed integrations are upgraded using pip or uv.

Parameters:

  • integrations (Tuple[str], required): The names of the integrations to upgrade the requirements for.
  • force (bool, default: False): Force the upgrade of the required packages.
  • uv (bool, default: False): Use uv for the package upgrade (experimental).
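
Internally, the command resolves the pinned requirements of each installed integration before upgrading them. A minimal sketch that only inspects those requirements (the integration name is illustrative):

    from zenml.integrations.registry import integration_registry

    # List the requirements that `zenml integration upgrade sklearn` would upgrade.
    if integration_registry.is_installed("sklearn"):
        requirements = integration_registry.select_integration_requirements("sklearn")
        print(requirements)
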
Source code in src/zenml/cli/integration.py
@integration.command(
    help="Upgrade the required packages for the integration of choice."
)
@click.argument("integrations", nargs=-1, required=False)
@click.option(
    "--yes",
    "-y",
    "force",
    is_flag=True,
    help="Force the upgrade of the required packages. This will skip the "
    "confirmation step and re-upgrade existing packages as well",
)
@click.option(
    "--uv",
    "uv",
    is_flag=True,
    help="Experimental: Use uv for package upgrade.",
    default=False,
)
def upgrade(
    integrations: Tuple[str],
    force: bool = False,
    uv: bool = False,
) -> None:
    """Upgrade the required packages for a given integration.

    If no integration is specified, the required packages for all installed
    integrations are upgraded using pip or uv.

    Args:
        integrations: The names of the integrations to upgrade the
            requirements for.
        force: Force the upgrade of the required packages.
        uv: Use uv for package upgrade (experimental).
    """
    from zenml.cli.utils import is_pip_installed, is_uv_installed
    from zenml.integrations.registry import integration_registry

    if uv and not is_uv_installed():
        error("Package `uv` is not installed. Please install it and retry.")

    if not uv and not is_pip_installed():
        error(
            "Pip is not installed. Please install pip or use the uv flag "
            "(--uv) for package installation."
        )

    if not integrations:
        # no integrations specified, use all registered integrations
        integrations = set(integration_registry.integrations.keys())

    requirements = []
    integrations_to_install = []
    for integration_name in integrations:
        try:
            if integration_registry.is_installed(integration_name):
                requirements += (
                    integration_registry.select_integration_requirements(
                        integration_name
                    )
                )
                integrations_to_install.append(integration_name)
            else:
                declare(
                    f"None of the required packages for integration "
                    f"'{integration_name}' are installed."
                )
        except KeyError:
            warning(f"Unable to find integration '{integration_name}'.")

    if requirements and (
        force
        or confirmation(
            f"Are you sure you want to upgrade the following "
            "packages to the current environment?\n"
            f"{requirements}"
        )
    ):
        with console.status("Upgrading integrations..."):
            install_packages(requirements, upgrade=True, use_uv=uv)

user()

Commands for user management.

Source code in src/zenml/cli/user_management.py
@cli.group(cls=TagGroup, tag=CliCategories.IDENTITY_AND_SECURITY)
def user() -> None:
    """Commands for user management."""

utc_now(tz_aware=False)

Get the current time in the UTC timezone.

Parameters:

  • tz_aware (Union[bool, datetime], default: False): Use this flag to control whether the returned datetime is timezone-aware or timezone-naive. If a datetime is provided, the returned datetime will be timezone-aware if and only if the input datetime is also timezone-aware.

Returns:

  • datetime: The current UTC time. If tz_aware is a datetime, the returned datetime will be timezone-aware only if the input datetime is also timezone-aware. If tz_aware is a boolean, the returned datetime will be timezone-aware if True, and timezone-naive if False.
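
A minimal usage sketch (assuming the module path zenml.utils.time_utils shown below):

    from datetime import timezone

    from zenml.utils.time_utils import utc_now

    naive = utc_now()               # timezone-naive UTC timestamp
    aware = utc_now(tz_aware=True)  # timezone-aware UTC timestamp
    assert naive.tzinfo is None
    assert aware.tzinfo == timezone.utc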

Source code in src/zenml/utils/time_utils.py
def utc_now(tz_aware: Union[bool, datetime] = False) -> datetime:
    """Get the current time in the UTC timezone.

    Args:
        tz_aware: Use this flag to control whether the returned datetime is
            timezone-aware or timezone-naive. If a datetime is provided, the
            returned datetime will be timezone-aware if and only if the input
            datetime is also timezone-aware.

    Returns:
        The current UTC time. If tz_aware is a datetime, the returned datetime
        will be timezone-aware only if the input datetime is also timezone-aware.
        If tz_aware is a boolean, the returned datetime will be timezone-aware
        if True, and timezone-naive if False.
    """
    now = datetime.now(timezone.utc)
    if (
        isinstance(tz_aware, bool)
        and tz_aware is False
        or isinstance(tz_aware, datetime)
        and tz_aware.tzinfo is None
    ):
        return now.replace(tzinfo=None)

    return now

utc_now_tz_aware()

Get the current timezone-aware UTC time.

Returns:

Type Description
datetime

The current UTC time.

Source code in src/zenml/utils/time_utils.py
def utc_now_tz_aware() -> datetime:
    """Get the current timezone-aware UTC time.

    Returns:
        The current UTC time.
    """
    return utc_now(tz_aware=True)

validate_keys(key)

Validates key if it is a valid python string.

Parameters:

  • key (str, required): The key to validate.
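
A short sketch of the check (the second call fails because hyphens are not valid in Python identifiers; error() aborts the CLI command):

    from zenml.cli.utils import validate_keys

    validate_keys("aws_access_key_id")   # passes silently
    validate_keys("aws-access-key-id")   # aborts with an error message
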
Source code in src/zenml/cli/utils.py
def validate_keys(key: str) -> None:
    """Validates key if it is a valid python string.

    Args:
        key: key to validate
    """
    if not key.isidentifier():
        error("Please provide args with a proper identifier as the key.")

validate_name(ctx, param, value)

Validate the name of the stack.

Parameters:

  • ctx (Context, required): The click context.
  • param (str, required): The parameter name.
  • value (str, required): The value of the parameter.

Returns:

  • str: The validated value.

Raises:

  • BadParameter: If the name is invalid.
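
A short sketch of the callback outside of click (the ctx argument is unused by the implementation, so None is passed here for illustration only):

    import click

    from zenml.cli.stack import validate_name

    validate_name(None, "stack_name", "prod-stack")      # returns "prod-stack"
    try:
        validate_name(None, "stack_name", "prod_stack")  # underscores are rejected
    except click.BadParameter as exc:
        print(exc)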

Source code in src/zenml/cli/stack.py
def validate_name(ctx: click.Context, param: str, value: str) -> str:
    """Validate the name of the stack.

    Args:
        ctx: The click context.
        param: The parameter name.
        value: The value of the parameter.

    Returns:
        The validated value.

    Raises:
        BadParameter: If the name is invalid.
    """
    if not value:
        return value

    if not re.match(r"^[a-zA-Z0-9-]*$", value):
        raise click.BadParameter(
            "Stack name must contain only alphanumeric characters and hyphens."
        )

    if len(value) > 16:
        raise click.BadParameter(
            "Stack name must have a maximum length of 16 characters."
        )

    return value

verify_service_connector(name_id_or_prefix, resource_type=None, resource_id=None, verify_only=False)

Verifies if a service connector has access to one or more resources.

Parameters:

  • name_id_or_prefix (str, required): The name or ID of the service connector to verify.
  • resource_type (Optional[str], default: None): The type of resource for which to verify access.
  • resource_id (Optional[str], default: None): The ID of the resource for which to verify access.
  • verify_only (bool, default: False): Only verify the service connector, do not list resources.
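
The command wraps Client.verify_service_connector. A minimal sketch that verifies a connector and lists the resources it can access (the connector name and resource type are illustrative):

    from zenml.client import Client

    resources = Client().verify_service_connector(
        name_id_or_prefix="my-generic-aws-connector",
        resource_type="s3-bucket",
        list_resources=True,
    )
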
Source code in src/zenml/cli/service_connectors.py
@service_connector.command(
    "verify",
    help="""Verify and list resources for a service connector.

Use this command to check if a registered ZenML service connector is correctly
configured with valid credentials and is able to actively access one or more
resources.

This command has a double purpose:

1. first, it can be used to check if a service connector is correctly configured
and has valid credentials, by actively exercising its credentials and trying to
gain access to the remote service or resource it is configured for.

2. second, it is used to fetch the list of resources that a service connector
has access to. This is useful when the service connector is configured to access
multiple resources, and you want to know which ones it has access to, or even
as a confirmation that it has access to the single resource that it is
configured for. 

You can use this command to answer questions like:

- is this connector valid and correctly configured?
- have I configured this connector to access the correct resource?
- which resources can this connector give me access to?

For connectors that are configured to access multiple types of resources, a
list of resources is not fetched, because it would be too expensive to list
all resources of all types that the connector has access to. In this case,
you can use the `--resource-type` argument to scope down the verification to
a particular resource type.

Examples:

- check if a Kubernetes service connector has access to the cluster it is
configured for:

    $ zenml service-connector verify my-k8s-connector

- check if a generic, multi-type, multi-instance AWS service connector has
access to a particular S3 bucket:

    $ zenml service-connector verify my-generic-aws-connector \\               
--resource-type s3-bucket --resource-id my-bucket

""",
)
@click.option(
    "--resource-type",
    "-r",
    "resource_type",
    help="The type of the resource for which to verify access.",
    required=False,
    type=str,
)
@click.option(
    "--resource-id",
    "-ri",
    "resource_id",
    help="The ID of the resource for which to verify access.",
    required=False,
    type=str,
)
@click.option(
    "--verify-only",
    "-v",
    "verify_only",
    help="Only verify the service connector, do not list resources.",
    required=False,
    is_flag=True,
)
@click.argument("name_id_or_prefix", type=str, required=True)
def verify_service_connector(
    name_id_or_prefix: str,
    resource_type: Optional[str] = None,
    resource_id: Optional[str] = None,
    verify_only: bool = False,
) -> None:
    """Verifies if a service connector has access to one or more resources.

    Args:
        name_id_or_prefix: The name or id of the service connector to verify.
        resource_type: The type of resource for which to verify access.
        resource_id: The ID of the resource for which to verify access.
        verify_only: Only verify the service connector, do not list resources.
    """
    client = Client()

    with console.status(
        f"Verifying service connector '{name_id_or_prefix}'...\n"
    ):
        try:
            resources = client.verify_service_connector(
                name_id_or_prefix=name_id_or_prefix,
                resource_type=resource_type,
                resource_id=resource_id,
                list_resources=not verify_only,
            )
        except (
            KeyError,
            ValueError,
            IllegalOperationError,
            NotImplementedError,
            AuthorizationException,
        ) as e:
            cli_utils.error(
                f"Service connector '{name_id_or_prefix}' verification failed: "
                f"{e}"
            )

    click.echo(
        f"Service connector '{name_id_or_prefix}' is correctly configured "
        f"with valid credentials and has access to the following resources:"
    )

    cli_utils.print_service_connector_resource_table(
        resources=[resources],
        show_resources_only=True,
    )

version()

Interact with model versions in the Model Control Plane.

Source code in src/zenml/cli/model.py
396
397
398
@model.group()
def version() -> None:
    """Interact with model versions in the Model Control Plane."""

warn_unsupported_non_default_workspace()

Warning for unsupported non-default workspace.

Source code in src/zenml/cli/utils.py
def warn_unsupported_non_default_workspace() -> None:
    """Warning for unsupported non-default workspace."""
    from zenml.constants import (
        ENV_ZENML_DISABLE_WORKSPACE_WARNINGS,
        handle_bool_env_var,
    )

    disable_warnings = handle_bool_env_var(
        ENV_ZENML_DISABLE_WORKSPACE_WARNINGS, False
    )
    if not disable_warnings:
        warning(
            "Currently the concept of `workspace` is not supported "
            "within the Dashboard. The Project functionality will be "
            "completed in the coming weeks. For the time being it "
            "is recommended to stay within the `default` workspace."
        )

warning(text, bold=None, italic=None, **kwargs)

Echo a warning string on the CLI.

Parameters:

  • text (str, required): Input text string.
  • bold (Optional[bool], default: None): Optional boolean to bold the text.
  • italic (Optional[bool], default: None): Optional boolean to italicize the text.
  • **kwargs (Any, default: {}): Optional kwargs to be passed to console.print().
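
A one-line usage sketch (the warning text is illustrative):

    from zenml.cli.utils import warning

    warning("This stack component is deprecated.", bold=True)
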
Source code in src/zenml/cli/utils.py
def warning(
    text: str,
    bold: Optional[bool] = None,
    italic: Optional[bool] = None,
    **kwargs: Any,
) -> None:
    """Echo a warning string on the CLI.

    Args:
        text: Input text string.
        bold: Optional boolean to bold the text.
        italic: Optional boolean to italicize the text.
        **kwargs: Optional kwargs to be passed to console.print().
    """
    base_style = zenml_style_defaults["warning"]
    style = Style.chain(base_style, Style(bold=bold, italic=italic))
    console.print(text, style=style, **kwargs)

web_login(url=None, verify_ssl=None, pro_api_url=None)

Implements the OAuth2 Device Authorization Grant flow.

This function implements the client side of the OAuth2 Device Authorization Grant flow as defined in https://tools.ietf.org/html/rfc8628, with the following customizations:

  • the unique ZenML client ID (user_id in the global config) is used as the OAuth2 client ID value
  • additional information is added to the user agent header to be used by users to identify the ZenML client

Upon completion of the flow, the access token is saved in the credentials store.

Parameters:

  • url (Optional[str], default: None): The URL of the OAuth2 server. If not provided, the ZenML Pro API server is used by default.
  • verify_ssl (Optional[Union[str, bool]], default: None): Whether to verify the SSL certificate of the OAuth2 server. If a string is passed, it is interpreted as the path to a CA bundle file.
  • pro_api_url (Optional[str], default: None): The URL of the ZenML Pro API server. If not provided, the default ZenML Pro API server URL is used.

Returns:

  • APIToken: The response returned by the OAuth2 server.

Raises:

  • AuthorizationException: If an error occurred during the authorization process.
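
A minimal usage sketch against a self-hosted server (the URL is illustrative; the returned API token is also persisted in the local credentials store, as described above):

    from zenml.login.web_login import web_login

    token = web_login(url="https://zenml.example.com", verify_ssl=True)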

Source code in src/zenml/login/web_login.py
def web_login(
    url: Optional[str] = None,
    verify_ssl: Optional[Union[str, bool]] = None,
    pro_api_url: Optional[str] = None,
) -> APIToken:
    """Implements the OAuth2 Device Authorization Grant flow.

    This function implements the client side of the OAuth2 Device Authorization
    Grant flow as defined in https://tools.ietf.org/html/rfc8628, with the
    following customizations:

    * the unique ZenML client ID (`user_id` in the global config) is used
    as the OAuth2 client ID value
    * additional information is added to the user agent header to be used by
    users to identify the ZenML client

    Upon completion of the flow, the access token is saved in the credentials store.

    Args:
        url: The URL of the OAuth2 server. If not provided, the ZenML Pro API
            server is used by default.
        verify_ssl: Whether to verify the SSL certificate of the OAuth2 server.
            If a string is passed, it is interpreted as the path to a CA bundle
            file.
        pro_api_url: The URL of the ZenML Pro API server. If not provided, the
            default ZenML Pro API server URL is used.

    Returns:
        The response returned by the OAuth2 server.

    Raises:
        AuthorizationException: If an error occurred during the authorization
            process.
    """
    from zenml.login.credentials_store import get_credentials_store
    from zenml.models import (
        OAuthDeviceAuthorizationRequest,
        OAuthDeviceAuthorizationResponse,
        OAuthDeviceTokenRequest,
        OAuthDeviceUserAgentHeader,
        OAuthTokenResponse,
    )

    credentials_store = get_credentials_store()

    # Make a request to the OAuth2 server to get the device code and user code.
    # The client ID used for the request is the unique ID of the ZenML client.
    response: Optional[requests.Response] = None

    # Add the following information in the user agent header to be used by users
    # to identify the ZenML client:
    #
    # * the ZenML version
    # * the python version
    # * the OS type
    # * the hostname
    #
    user_agent_header = OAuthDeviceUserAgentHeader(
        hostname=platform.node(),
        zenml_version=__version__,
        python_version=platform.python_version(),
        os=platform.system(),
    )

    zenml_pro = False
    if not url:
        # If no URL is provided, we use the ZenML Pro API server by default
        zenml_pro = True
        url = base_url = pro_api_url or ZENML_PRO_API_URL
    else:
        # Get rid of any trailing slashes to prevent issues when having double
        # slashes in the URL
        url = url.rstrip("/")
        if pro_api_url:
            # This is a ZenML Pro server. The device authentication is done
            # through the ZenML Pro API.
            zenml_pro = True
            base_url = pro_api_url
        else:
            base_url = url

    auth_request = OAuthDeviceAuthorizationRequest(
        client_id=GlobalConfiguration().user_id
    )

    # If an existing token is found in the credentials store, we reuse its
    # device ID to avoid creating a new device ID for the same device.
    existing_token = credentials_store.get_token(url)
    if existing_token and existing_token.device_id:
        auth_request.device_id = existing_token.device_id

    if zenml_pro:
        auth_url = base_url + AUTH + DEVICE_AUTHORIZATION
        login_url = base_url + AUTH + LOGIN
    else:
        auth_url = base_url + API + VERSION_1 + DEVICE_AUTHORIZATION
        login_url = base_url + API + VERSION_1 + LOGIN

    try:
        response = requests.post(
            auth_url,
            headers={
                "Content-Type": "application/x-www-form-urlencoded",
                "User-Agent": user_agent_header.encode(),
            },
            data=auth_request.model_dump(exclude_none=True),
            verify=verify_ssl,
            timeout=DEFAULT_HTTP_TIMEOUT,
        )
        if response.status_code == 200:
            auth_response = OAuthDeviceAuthorizationResponse(**response.json())
        else:
            logger.info(f"Error: {response.status_code} {response.text}")
            raise AuthorizationException(
                f"Could not connect to {base_url}. Please check the URL."
            )
    except (requests.exceptions.JSONDecodeError, ValueError, TypeError):
        logger.exception("Bad response received from API server.")
        raise AuthorizationException(
            "Bad response received from API server. Please check the URL."
        )
    except requests.exceptions.RequestException:
        logger.exception("Could not connect to API server.")
        raise AuthorizationException(
            f"Could not connect to {base_url}. Please check the URL."
        )

    # Open the verification URL in the user's browser
    verification_uri = (
        auth_response.verification_uri_complete
        or auth_response.verification_uri
    )
    if verification_uri.startswith("/"):
        # If the verification URI is a relative path, we need to add the base
        # URL to it
        verification_uri = base_url + verification_uri
    webbrowser.open(verification_uri)
    logger.info(
        f"If your browser did not open automatically, please open the "
        f"following URL into your browser to proceed with the authentication:"
        f"\n\n{verification_uri}\n"
    )

    # Poll the OAuth2 server until the user has authorized the device
    token_request = OAuthDeviceTokenRequest(
        device_code=auth_response.device_code,
        client_id=auth_request.client_id,
    )
    expires_in = auth_response.expires_in
    interval = auth_response.interval
    token_response: OAuthTokenResponse
    while True:
        response = requests.post(
            login_url,
            headers={"Content-Type": "application/x-www-form-urlencoded"},
            data=token_request.model_dump(),
            verify=verify_ssl,
            timeout=DEFAULT_HTTP_TIMEOUT,
        )
        if response.status_code == 200:
            # The user has authorized the device, so we can extract the access token
            token_response = OAuthTokenResponse(**response.json())
            if zenml_pro:
                logger.info("Successfully logged in to ZenML Pro.")
            else:
                logger.info(f"Successfully logged in to {url}.")
            break
        elif response.status_code == 400:
            try:
                error_response = OAuthError(**response.json())
            except (
                requests.exceptions.JSONDecodeError,
                ValueError,
                TypeError,
            ):
                raise AuthorizationException(
                    f"Error received from {base_url}: {response.text}"
                )

            if error_response.error == "authorization_pending":
                # The user hasn't authorized the device yet, so we wait for the
                # interval and try again
                pass
            elif error_response.error == "slow_down":
                # The OAuth2 server is asking us to slow down our polling
                interval += 5
            else:
                # There was another error with the request
                raise AuthorizationException(
                    f"Error: {error_response.error} {error_response.error_description}"
                )

            expires_in -= interval
            if expires_in <= 0:
                raise AuthorizationException(
                    "User did not authorize the device in time."
                )
            time.sleep(interval)
        else:
            # There was another error with the request
            raise AuthorizationException(
                f"Error: {response.status_code} {response.json()['error']}"
            )

    # Save the token in the credentials store
    return credentials_store.set_token(
        url, token_response, is_zenml_pro=zenml_pro
    )
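
A minimal sketch of invoking this device-flow helper against a self-hosted server is shown below. The module path and function name (zenml.login.web_login.web_login) are assumptions based on this reference page and are not confirmed by the listing above; the server URL is a placeholder.

from zenml.login.web_login import web_login  # module path and name assumed

# Start the OAuth2 device authorization flow against a self-hosted ZenML
# server. A browser window is opened for the user to confirm the device, and
# the call blocks (polling the server) until a token is issued or the device
# code expires.
token = web_login(
    url="https://zenml.example.com",  # hypothetical server URL
    verify_ssl=True,                  # or a path to a CA bundle file
)

# The returned APIToken has also been persisted in the local credentials
# store, so later CLI commands against the same URL can reuse it.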

workspace()

Commands for workspace management.

Source code in src/zenml/cli/workspace.py
@cli.group(cls=TagGroup, tag=CliCategories.MANAGEMENT_TOOLS)
def workspace() -> None:
    """Commands for workspace management."""

write_yaml(file_path, contents, sort_keys=True)

Write contents as YAML format to file_path.

Parameters:

Name        Type                                Description                                                     Default
file_path   str                                 Path to YAML file.                                              required
contents    Union[Dict[Any, Any], List[Any]]    Contents of YAML file as dict or list.                          required
sort_keys   bool                                If True, keys are sorted alphabetically. If False, the order    True
                                                in which the keys were inserted into the dict will be
                                                preserved.

Raises:

Type                 Description
FileNotFoundError    if directory does not exist.

Source code in src/zenml/utils/yaml_utils.py
def write_yaml(
    file_path: str,
    contents: Union[Dict[Any, Any], List[Any]],
    sort_keys: bool = True,
) -> None:
    """Write contents as YAML format to file_path.

    Args:
        file_path: Path to YAML file.
        contents: Contents of YAML file as dict or list.
        sort_keys: If `True`, keys are sorted alphabetically. If `False`,
            the order in which the keys were inserted into the dict will
            be preserved.

    Raises:
        FileNotFoundError: if directory does not exist.
    """
    if not io_utils.is_remote(file_path):
        dir_ = str(Path(file_path).parent)
        if not fileio.isdir(dir_):
            raise FileNotFoundError(f"Directory {dir_} does not exist.")
    io_utils.write_file_contents_as_string(
        file_path, yaml.dump(contents, sort_keys=sort_keys)
    )
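
For example, a quick sketch of writing a small dict to a YAML file with write_yaml; the file path and contents below are just placeholders:

from zenml.utils.yaml_utils import write_yaml

# Write a small configuration dict to disk. Keys are kept in insertion
# order because sort_keys=False; the parent directory must already exist,
# otherwise a FileNotFoundError is raised.
write_yaml(
    "pipeline_config.yaml",
    {"name": "training_pipeline", "steps": ["load_data", "train_model"]},
    sort_keys=False,
)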