Tensorboard
zenml.integrations.tensorboard
Initialization for TensorBoard integration.
Attributes
TENSORBOARD = 'tensorboard'
module-attribute
Classes
Integration
Base class for integrations in ZenML.
Functions
activate() -> None
classmethod
Abstract method to activate the integration.
Source code in src/zenml/integrations/integration.py
check_installation() -> bool
classmethod
Method to check whether the required packages are installed.
Returns:

Type | Description
---|---
bool | True if all required packages are installed, False otherwise.
Source code in src/zenml/integrations/integration.py
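As a brief illustration, a concrete subclass such as the TensorBoardIntegration documented below can be used to verify that its requirements are present before relying on it. A minimal sketch:

```python
from zenml.integrations.tensorboard import TensorBoardIntegration

# Returns True only if every package listed in the integration's
# requirements is installed in the current environment.
if TensorBoardIntegration.check_installation():
    print("TensorBoard integration requirements are installed.")
else:
    print("Run `zenml integration install tensorboard` first.")
```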
flavors() -> List[Type[Flavor]]
classmethod
Abstract method to declare new stack component flavors.
Returns:

Type | Description
---|---
List[Type[Flavor]] | A list of new stack component flavors.
Source code in src/zenml/integrations/integration.py
get_requirements(target_os: Optional[str] = None) -> List[str]
classmethod
Method to get the requirements for the integration.
Parameters:

Name | Type | Description | Default
---|---|---|---
target_os | Optional[str] | The target operating system to get the requirements for. | None

Returns:

Type | Description
---|---
List[str] | A list of requirements.
Source code in src/zenml/integrations/integration.py
get_uninstall_requirements(target_os: Optional[str] = None) -> List[str]
classmethod
Method to get the uninstall requirements for the integration.
Parameters:

Name | Type | Description | Default
---|---|---|---
target_os | Optional[str] | The target operating system to get the requirements for. | None

Returns:

Type | Description
---|---
List[str] | A list of requirements.
Source code in src/zenml/integrations/integration.py
plugin_flavors() -> List[Type[BasePluginFlavor]]
classmethod
Abstract method to declare new plugin flavors.
Returns:

Type | Description
---|---
List[Type[BasePluginFlavor]] | A list of new plugin flavors.
Source code in src/zenml/integrations/integration.py
TensorBoardIntegration
Bases: Integration
Definition of TensorBoard integration for ZenML.
Functions
activate() -> None
classmethod
Activates the integration.
Source code in src/zenml/integrations/tensorboard/__init__.py
get_requirements(target_os: Optional[str] = None) -> List[str]
classmethod
Defines platform-specific requirements for the integration.
Parameters:

Name | Type | Description | Default
---|---|---|---
target_os | Optional[str] | The target operating system. | None

Returns:

Type | Description
---|---
List[str] | A list of requirements.
Source code in src/zenml/integrations/tensorboard/__init__.py
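For example, the pinned requirements for the current platform can be inspected directly. A minimal sketch:

```python
from zenml.integrations.tensorboard import TensorBoardIntegration

# With no argument, the requirements for the current operating system
# are returned; a target OS string can be passed explicitly instead.
print(TensorBoardIntegration.get_requirements())
```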
Modules
services
Initialization for TensorBoard services.
Modules
tensorboard_service
Implementation of the TensorBoard service.
TensorboardService(config: Union[TensorboardServiceConfig, Dict[str, Any]], **attrs: Any)
Bases: LocalDaemonService
TensorBoard service.
This can be used to start a local TensorBoard server for one or more models.
Attributes:

Name | Type | Description
---|---|---
SERVICE_TYPE | | a service type descriptor with information describing the TensorBoard service class
config | TensorboardServiceConfig | service configuration
endpoint | LocalDaemonServiceEndpoint | optional service endpoint
Initialization for TensorBoard service.
Parameters:

Name | Type | Description | Default
---|---|---|---
config | Union[TensorboardServiceConfig, Dict[str, Any]] | service configuration | required
**attrs | Any | additional attributes | {}
Source code in src/zenml/integrations/tensorboard/services/tensorboard_service.py
run() -> None
Initialize and run the TensorBoard server.
Source code in src/zenml/integrations/tensorboard/services/tensorboard_service.py
TensorboardServiceConfig(**data: Any)
Bases: LocalDaemonServiceConfig
TensorBoard service configuration.
Attributes:

Name | Type | Description
---|---|---
logdir | str | location of TensorBoard log files.
max_reload_threads | int | the max number of threads that TensorBoard can use to reload runs. Each thread reloads one run at a time.
reload_interval | int | how often the backend should load more data, in seconds. Set to 0 to load just once at startup.
Source code in src/zenml/services/service.py
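To tie the two classes together, the sketch below configures and starts a local TensorBoard daemon for a hypothetical log directory. The logdir path is an assumption, and start() and is_running are inherited from the base service class; additional base-class configuration fields may also apply:

```python
from zenml.integrations.tensorboard.services.tensorboard_service import (
    TensorboardService,
    TensorboardServiceConfig,
)

# Hypothetical location of TensorBoard event files.
config = TensorboardServiceConfig(logdir="/tmp/my_model/logs")

# Start the TensorBoard server as a local daemon and check its state.
service = TensorboardService(config=config)
service.start(timeout=30)
print(service.is_running)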
visualizers
Initialization for TensorBoard visualizer.
Modules
tensorboard_visualizer
Implementation of a TensorBoard visualizer step.
TensorboardVisualizer
The implementation of a TensorBoard Visualizer.
find_running_tensorboard_server(logdir: str) -> Optional[TensorBoardInfo]
classmethod
Find a local TensorBoard server instance.
Checks whether a local server is running for the supplied logdir location and, if so, returns information about it, including its TCP port.
Parameters:

Name | Type | Description | Default
---|---|---|---
logdir | str | The logdir location where the TensorBoard server is running. | required

Returns:

Type | Description
---|---
Optional[TensorBoardInfo] | The TensorBoardInfo describing the running TensorBoard server, or None if no server is running for the supplied logdir location.
Source code in src/zenml/integrations/tensorboard/visualizers/tensorboard_visualizer.py
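A minimal sketch of probing for an already-running server, with a hypothetical logdir path:

```python
from zenml.integrations.tensorboard.visualizers.tensorboard_visualizer import (
    TensorboardVisualizer,
)

# Returns None if no TensorBoard server is serving this logdir.
info = TensorboardVisualizer.find_running_tensorboard_server("/tmp/my_model/logs")
if info is not None:
    print(f"TensorBoard already running on port {info.port}")
```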
stop(object: StepRunResponse) -> None
Stop the TensorBoard server previously started for a pipeline step.
Parameters:

Name | Type | Description | Default
---|---|---|---
object | StepRunResponse | StepRunResponseModel fetched from get_step(). | required
Source code in src/zenml/integrations/tensorboard/visualizers/tensorboard_visualizer.py
visualize(object: StepRunResponse, height: int = 800, *args: Any, **kwargs: Any) -> None
Start a TensorBoard server.
Allows for the visualization of all models logged as artifacts by the indicated step. The server will monitor and display all the models logged by past and future step runs.
Parameters:

Name | Type | Description | Default
---|---|---|---
object | StepRunResponse | StepRunResponseModel fetched from get_step(). | required
height | int | Height of the generated visualization. | 800
*args | Any | Additional arguments. | ()
**kwargs | Any | Additional keyword arguments. | {}
Source code in src/zenml/integrations/tensorboard/visualizers/tensorboard_visualizer.py
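Putting the helper and the visualizer together, a step's logged models can be visualized as sketched below; the pipeline and step names are hypothetical:

```python
from zenml.integrations.tensorboard.visualizers.tensorboard_visualizer import (
    TensorboardVisualizer,
    get_step,
)

# Fetch the step run whose artifacts contain TensorBoard logs.
step = get_step(pipeline_name="mnist_pipeline", step_name="trainer")

# Start (or reuse) a TensorBoard server and display it.
TensorboardVisualizer().visualize(step, height=600)
```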
visualize_tensorboard(port: int, height: int) -> None
Generate a visualization of a TensorBoard.
Parameters:

Name | Type | Description | Default
---|---|---|---
port | int | the TCP port where the TensorBoard server is listening for requests. | required
height | int | Height of the generated visualization. | required
Source code in src/zenml/integrations/tensorboard/visualizers/tensorboard_visualizer.py
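If the server's TCP port is already known (for example from find_running_tensorboard_server), the display step can be invoked directly; the port below is an assumption:

```python
from zenml.integrations.tensorboard.visualizers.tensorboard_visualizer import (
    TensorboardVisualizer,
)

# Render an existing TensorBoard server (assumed to listen on port 6006)
# as a visualization of the given height.
TensorboardVisualizer().visualize_tensorboard(port=6006, height=800)
```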
get_step(pipeline_name: str, step_name: str) -> StepRunResponse
Get the StepRunResponseModel for the specified pipeline and step name.
Parameters:

Name | Type | Description | Default
---|---|---|---
pipeline_name | str | The name of the pipeline. | required
step_name | str | The name of the step. | required

Returns:

Type | Description
---|---
StepRunResponse | The StepRunResponseModel for the specified pipeline and step name.

Raises:

Type | Description
---|---
RuntimeError | If the step is not found.
Source code in src/zenml/integrations/tensorboard/visualizers/tensorboard_visualizer.py
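Since a RuntimeError is raised when the step cannot be found, a defensive lookup might look like this (pipeline and step names are hypothetical):

```python
from zenml.integrations.tensorboard.visualizers.tensorboard_visualizer import get_step

try:
    step = get_step(pipeline_name="mnist_pipeline", step_name="trainer")
except RuntimeError as err:
    # Raised when the step cannot be resolved for the given pipeline.
    print(f"Could not resolve step: {err}")
```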
stop_tensorboard_server(pipeline_name: str, step_name: str) -> None
Stop the TensorBoard server previously started for a pipeline step.
Parameters:

Name | Type | Description | Default
---|---|---|---
pipeline_name | str | the name of the pipeline | required
step_name | str | pipeline step name | required
Source code in src/zenml/integrations/tensorboard/visualizers/tensorboard_visualizer.py
visualize_tensorboard(pipeline_name: str, step_name: str) -> None
Start a TensorBoard server.
Allows for the visualization of all models logged as output by the named pipeline step. The server will monitor and display all the models logged by past and future step runs.
Parameters:

Name | Type | Description | Default
---|---|---|---
pipeline_name | str | the name of the pipeline | required
step_name | str | pipeline step name | required
Source code in src/zenml/integrations/tensorboard/visualizers/tensorboard_visualizer.py
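A typical end-to-end flow with the two module-level helpers, assuming a pipeline named mnist_pipeline with a trainer step that logs TensorBoard summaries:

```python
from zenml.integrations.tensorboard.visualizers.tensorboard_visualizer import (
    stop_tensorboard_server,
    visualize_tensorboard,
)

# Start a TensorBoard server for the models logged by the step ...
visualize_tensorboard(pipeline_name="mnist_pipeline", step_name="trainer")

# ... and shut it down again once you are done inspecting the runs.
stop_tensorboard_server(pipeline_name="mnist_pipeline", step_name="trainer")
```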