
TFX Components Walk-through

Learning objectives

  1. Develop a high-level understanding of TFX pipeline components.
  2. Learn how to use a TFX Interactive Context for prototype development of TFX pipelines.
  3. Work with the TensorFlow Data Validation (TFDV) library to check and analyze input data.
  4. Utilize the TensorFlow Transform (TFT) library for scalable data preprocessing and feature transformations.
  5. Employ the TensorFlow Model Analysis (TFMA) library for model evaluation.

Introduction

In this notebook, you will work with the Covertype Data Set and use TFX to analyze, understand, and preprocess the dataset, and then to train, analyze, validate, and deploy a multi-class classification model that predicts the type of forest cover from cartographic features.

You will utilize the TFX Interactive Context to work with the TFX components interactively in a Jupyter notebook environment. Working in an interactive notebook is useful when doing initial data exploration, experimenting with models, and designing ML pipelines.

You should be aware that there are differences in the way interactive notebooks are orchestrated and in how they access metadata artifacts. In a production deployment of TFX on GCP, you will use an orchestrator such as Kubeflow Pipelines or Cloud Composer; in interactive mode, the notebook itself is the orchestrator, running each TFX component as you execute the notebook cells. In a production deployment, ML Metadata will be managed in a scalable database like MySQL, and artifacts will be kept in a persistent store such as Google Cloud Storage; in interactive mode, both properties and payloads are stored in the local file system of the Jupyter host.
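
To make the difference concrete, the sketch below shows how an InteractiveContext is typically created and used to run a component in-process. The pipeline name and root path here are illustrative assumptions rather than the values this lab uses later; when no metadata connection is supplied, InteractiveContext falls back to a local SQLite-backed ML Metadata store under the pipeline root.

import tempfile

from tfx.components import StatisticsGen
from tfx.orchestration.experimental.interactive.interactive_context import InteractiveContext

# Illustrative names only -- the lab sets its own pipeline name and root later.
context = InteractiveContext(
    pipeline_name='tfx-covertype-walkthrough',      # hypothetical name
    pipeline_root=tempfile.mkdtemp(prefix='tfx-'),  # local artifact store
)

# With the notebook acting as the orchestrator, each component is executed
# in-process by a context.run(...) call, for example (assuming an
# example_gen component has been defined):
# statistics_gen = StatisticsGen(examples=example_gen.outputs['examples'])
# context.run(statistics_gen)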

Setup Note:

Currently, TFMA visualizations do not render properly in JupyterLab. It is recommended to run this notebook in Jupyter Classic Notebook. To switch to Classic Notebook, select Launch Classic Notebook from the Help menu.
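
If TFMA visualizations still fail to render in Classic Notebook, the TFMA documentation describes enabling its Jupyter notebook extensions. As a sketch (the lab environment may already be configured this way), the documented commands can be run from a notebook cell:

!jupyter nbextension enable --py widgetsnbextension --sys-prefix
!jupyter nbextension install --py --symlink tensorflow_model_analysis --sys-prefix
!jupyter nbextension enable --py tensorflow_model_analysis --sys-prefix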

import absl
import os
import tempfile
import time

import tensorflow as tf
import tensorflow_data_validation as tfdv
import tensorflow_model_analysis as tfma
import tensorflow_transform as tft
import tfx

from pprint import pprint
from tensorflow_metadata.proto.v0 import schema_pb2, statistics_pb2, anomalies_pb2
from tensorflow_transform.tf_metadata import schema_utils
from tfx.components import CsvExampleGen
from tfx.components import Evaluator
from tfx.components import ExampleValidator
from tfx.components import InfraValidator
from tfx.components import Pusher
from tfx.components import ResolverNode
from tfx.components import SchemaGen
from tfx.components import StatisticsGen
from tfx.components import Trainer
from tfx.components import Transform
from tfx.components import Tuner
from tfx.dsl.components.base import executor_spec
from tfx.components.common_nodes.importer_node import ImporterNode
from tfx.components.trainer import executor as trainer_executor
from tfx.dsl.experimental import latest_blessed_model_resolver
from tfx.orchestration import metadata
from tfx.orchestration import pipeline
from tfx.orchestration.experimental.interactive.interactive_context import InteractiveContext
from tfx.proto import evaluator_pb2
from tfx.proto import example_gen_pb2
from tfx.proto import infra_validator_pb2
from tfx.proto import pusher_pb2
from tfx.proto import trainer_pb2
from tfx.proto.evaluator_pb2 import SingleSlicingSpec

from tfx.types import Channel
from tfx.types.standard_artifacts import Model
from tfx.types.standard_artifacts import HyperParameters
from tfx.types.standard_artifacts import ModelBlessing
from tfx.types.standard_artifacts import InfraBlessing
WARNING:absl:RuntimeParameter is only supported on Cloud-based DAG runner currently.

Note: this lab was developed and tested with the following TF ecosystem package versions:

Tensorflow Version: 2.3.1
TFX Version: 0.25.0
TFDV Version: 0.25.0
TFMA Version: 0.25.0

If you encounter errors with the above imports (e.g. TFX component not found), check your package versions in the cell below.

print("Tensorflow Version:", tf.__version__)
print("TFX Version:", tfx.__version__)
print("TFDV Version:", tfdv.__version__)
print("TFMA Version:", tfma.VERSION_STRING)

absl.logging.set_verbosity(absl.logging.INFO)
Tensorflow Version: 2.3.4
TFX Version: 0.25.0
TFDV Version: 0.25.0
TFMA Version: 0.25.0

If the versions above do not match, update your packages in the current Jupyter kernel using the cells below. The default %pip package installation location is not on your system PATH; run the command below to append the local installation path so that the latest package versions are picked up. Note that you may also need to restart your notebook kernel and re-run the imports cell above before proceeding with the lab.

os.environ['PATH'] += os.pathsep + '/home/jupyter/.local/bin'

(Optional) Run the cell below only if your package versions do not match the lab defaults.

%pip install --upgrade --user tensorflow==2.3.1
%pip install --upgrade --user tfx==0.25.0
%pip install --upgrade --user tensorflow_data_validation==0.25.0
%pip install --upgrade --user tensorflow_model_analysis==0.25.0
Collecting tensorflow==2.3.1
  Downloading tensorflow-2.3.1-cp37-cp37m-manylinux2010_x86_64.whl (320.4 MB)
Installing collected packages: tensorflow
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
explainable-ai-sdk 1.3.2 requires xai-image-widget, which is not installed.
fairness-indicators 0.26.0 requires tensorflow-data-validation<0.27,>=0.26, but you have tensorflow-data-validation 0.25.0 which is incompatible.
fairness-indicators 0.26.0 requires tensorflow-model-analysis<0.27,>=0.26, but you have tensorflow-model-analysis 0.25.0 which is incompatible.
Successfully installed tensorflow-2.3.1
Note: you may need to restart the kernel to use updated packages.
Requirement already satisfied: tfx==0.25.0 in /home/jupyter/.local/lib/python3.7/site-packages (0.25.0)
Note: you may need to restart the kernel to use updated packages.
Requirement already satisfied: tensorflow_data_validation==0.25.0 in /home/jupyter/.local/lib/python3.7/site-packages (0.25.0)
Note: you may need to restart the kernel to use updated packages.
Requirement already satisfied: tensorflow_model_analysis==0.25.0 in /home/jupyter/.local/lib/python3.7/site-packages (0.25.0)
Note: you may need to restart the kernel to use updated packages.

Restart the kernel by using Kernel > Restart kernel > Restart.

Configure lab settings

Set constants, location paths and other environment settings.

import absl
import os
import tempfile
import time

import tensorflow as tf
import tensorflow_data_validation as tfdv
import tensorflow_model_analysis as tfma
import tensorflow_transform as tft
import tfx

from pprint import pprint
from tensorflow_metadata.proto.v0 import schema_pb2, statistics_pb2, anomalies_pb2
from tensorflow_transform.tf_metadata import schema_utils
from tfx.components import CsvExampleGen
from tfx.components import Evaluator
from tfx.components import ExampleValidator
from tfx.components import InfraValidator
from tfx.components import Pusher
from tfx.components import ResolverNode
from tfx.components import SchemaGen
from tfx.components import StatisticsGen
from tfx.components import Trainer
from tfx.components import Transform
from tfx.components import Tuner
from tfx.dsl.components.base import executor_spec
from tfx.components.common_nodes.importer_node import ImporterNode
from tfx.components.trainer import executor as trainer_executor
from tfx.dsl.experimental import latest_blessed_model_resolver
from tfx.orchestration import metadata
from tfx.orchestration import pipeline
from tfx.orchestration.experimental.interactive.interactive_context import InteractiveContext
from tfx.proto import evaluator_pb2
from tfx.proto import example_gen_pb2
from tfx.proto import infra_validator_pb2
from tfx.proto import pusher_pb2
from tfx.proto import trainer_pb2
from tfx.proto.evaluator_pb2 import SingleSlicingSpec

from tfx.types import Channel
from tfx.types.standard_artifacts import Model
from tfx.types.standard_artifacts import HyperParameters
from tfx.types.standard_artifacts import ModelBlessing
from tfx.types.standard_artifacts import InfraBlessing
2024-03-24 14:09:55.240028: W tensorflow/stream_executor/platform/default/dso_loader.cc:59] Could not load dynamic library 'libcudart.so.10.1'; dlerror: libcudart.so.10.1: cannot open shared object file: No such file or directory
2024-03-24 14:09:55.240080: I tensorflow/stream_executor/cuda/cudart_stub.cc:29] Ignore above cudart dlerror if you do not have a GPU set up on your machine.
WARNING:absl:RuntimeParameter is only supported on Cloud-based DAG runner currently.
ARTIFACT_STORE = os.path.join(os.sep, 'home', 'jupyter', 'artifact-store')
SERVING_MODEL_DIR = os.path.join(os.sep, 'home', 'jupyter', 'serving_model')
DATA_ROOT = 'gs://cloud-training/OCBL203/workshop-datasets'

Creating Interactive Context

TFX Interactive Context allows you to create and run TFX components in an interactive mode. It is designed to support experimentation and development in a Jupyter Notebook environment. It is an experimental feature, and major changes to its interface and functionality are expected. When creating the interactive context you can specify the following parameters: pipeline_name, pipeline_root, and metadata_connection_config.

PIPELINE_NAME = 'tfx-covertype-classifier'
PIPELINE_ROOT = os.path.join(ARTIFACT_STORE, PIPELINE_NAME, time.strftime("%Y%m%d_%H%M%S"))
os.makedirs(PIPELINE_ROOT, exist_ok=True)

context = InteractiveContext(
    pipeline_name=PIPELINE_NAME,
    pipeline_root=PIPELINE_ROOT,
    metadata_connection_config=None)
WARNING:absl:InteractiveContext metadata_connection_config not provided: using SQLite ML Metadata database at /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/metadata.sqlite.
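
By default, the interactive context creates a SQLite ML Metadata database under the pipeline root, as the warning above shows. If you want the metadata database at an explicit location instead, you can pass a connection config. A minimal sketch, assuming the sqlite_metadata_connection_config helper available in this version of TFX (the metadata_db_path value is illustrative):

# Illustrative explicit location for the ML Metadata SQLite database.
metadata_db_path = os.path.join(PIPELINE_ROOT, 'metadata.sqlite')

context = InteractiveContext(
    pipeline_name=PIPELINE_NAME,
    pipeline_root=PIPELINE_ROOT,
    metadata_connection_config=metadata.sqlite_metadata_connection_config(
        metadata_db_path))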

Ingesting data using ExampleGen

In any ML development process the first step is to ingest the training and test datasets. The ExampleGen component ingests data into a TFX pipeline. It consumes external files or services to generate a set of files in the TFRecord format, which will be used by other TFX components. It can also shuffle the data and split it into an arbitrary number of partitions, as illustrated in the sketch below.
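
For instance, a three-way 80/10/10 split could be configured with hash buckets as follows. This is an illustrative sketch only (the three_way_output name and the test split are hypothetical; this lab uses the two-way split configured in the next exercise):

three_way_output = example_gen_pb2.Output(
    split_config=example_gen_pb2.SplitConfig(splits=[
        # 8 of 10 hash buckets (80%) for training.
        example_gen_pb2.SplitConfig.Split(name='train', hash_buckets=8),
        # 1 bucket (10%) each for eval and test.
        example_gen_pb2.SplitConfig.Split(name='eval', hash_buckets=1),
        example_gen_pb2.SplitConfig.Split(name='test', hash_buckets=1),
    ]))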

Configure and run CsvExampleGen

In this exercise, you use the CsvExampleGen specialization of ExampleGen to ingest CSV files from a GCS location and emit them as tf.Example records for consumption by downstream TFX pipeline components. Your task is to configure the component to create 80-20 train and eval splits. Hint: review the ExampleGen proto definition to split your data with hash buckets.

output_config = example_gen_pb2.Output(
    split_config=example_gen_pb2.SplitConfig(splits=[
        # Train split: 4 of 5 hash buckets (80% of the data).
        example_gen_pb2.SplitConfig.Split(name="train", hash_buckets=4),
        # Eval split: the remaining hash bucket (20% of the data).
        example_gen_pb2.SplitConfig.Split(name="eval", hash_buckets=1),
    ]))

example_gen = tfx.components.CsvExampleGen(
    input_base=DATA_ROOT,
    output_config=output_config)
context.run(example_gen)
WARNING:apache_beam.runners.interactive.interactive_environment:Dependencies required for Interactive Beam PCollection visualization are not available, please use: `pip install apache-beam[interactive]` to install necessary dependencies to enable all data visualization features.




WARNING:apache_beam.io.tfrecordio:Couldn't find python-snappy so the implementation of _TFRecordUtil._masked_crc32c is not as fast as it could be.
ExecutionResult at 0x7f86cd4ab810
.execution_id: 1
.component.inputs: {}
.component.outputs: ['examples']

Examine the ingested data

examples_uri = example_gen.outputs['examples'].get()[0].uri
tfrecord_filenames = [os.path.join(examples_uri, 'train', name)
                      for name in os.listdir(os.path.join(examples_uri, 'train'))]
dataset = tf.data.TFRecordDataset(tfrecord_filenames, compression_type="GZIP")
for tfrecord in dataset.take(2):
    example = tf.train.Example()
    example.ParseFromString(tfrecord.numpy())
    # Each feature holds exactly one of bytes_list, float_list, or int64_list.
    for name, feature in example.features.feature.items():
        if feature.HasField('bytes_list'):
            value = feature.bytes_list.value
        elif feature.HasField('float_list'):
            value = feature.float_list.value
        elif feature.HasField('int64_list'):
            value = feature.int64_list.value
        print('{}: {}'.format(name, value))
    print('******')
Slope: [9]
Horizontal_Distance_To_Hydrology: [648]
Hillshade_3pm: [157]
Elevation: [3142]
Aspect: [183]
Hillshade_9am: [223]
Vertical_Distance_To_Hydrology: [101]
Horizontal_Distance_To_Fire_Points: [1871]
Horizontal_Distance_To_Roadways: [757]
Cover_Type: [1]
Soil_Type: [b'C7757']
Wilderness_Area: [b'Commanche']
Hillshade_Noon: [247]
******
Horizontal_Distance_To_Hydrology: [60]
Aspect: [124]
Horizontal_Distance_To_Fire_Points: [451]
Horizontal_Distance_To_Roadways: [124]
Hillshade_Noon: [227]
Cover_Type: [2]
Soil_Type: [b'C2704']
Vertical_Distance_To_Hydrology: [9]
Hillshade_3pm: [105]
Elevation: [1967]
Wilderness_Area: [b'Cache']
Slope: [16]
Hillshade_9am: [245]
******


2024-03-24 14:16:08.199914: W tensorflow/stream_executor/platform/default/dso_loader.cc:59] Could not load dynamic library 'libcuda.so.1'; dlerror: libcuda.so.1: cannot open shared object file: No such file or directory
2024-03-24 14:16:08.199981: W tensorflow/stream_executor/cuda/cuda_driver.cc:312] failed call to cuInit: UNKNOWN ERROR (303)
2024-03-24 14:16:08.200031: I tensorflow/stream_executor/cuda/cuda_diagnostics.cc:156] kernel driver does not appear to be running on this host (tfx-on-googlecloud): /proc/driver/nvidia/version does not exist
2024-03-24 14:16:08.200559: I tensorflow/core/platform/cpu_feature_guard.cc:142] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN)to use the following CPU instructions in performance-critical operations:  AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2024-03-24 14:16:08.211594: I tensorflow/core/platform/profile_utils/cpu_utils.cc:104] CPU Frequency: 2199995000 Hz
2024-03-24 14:16:08.211825: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x562a9d950870 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2024-03-24 14:16:08.211851: I tensorflow/compiler/xla/service/service.cc:176]   StreamExecutor device (0): Host, Default Version
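
A quick, illustrative way to confirm the 80/20 ratio is to count the records in each split. This is a sketch (count_records is a hypothetical helper; exact counts depend on the dataset):

def count_records(split_name):
    # Hypothetical helper: count the TFRecords ExampleGen emitted for one split.
    split_dir = os.path.join(examples_uri, split_name)
    filenames = [os.path.join(split_dir, name) for name in os.listdir(split_dir)]
    return sum(1 for _ in tf.data.TFRecordDataset(filenames, compression_type="GZIP"))

print('train: {}, eval: {}'.format(count_records('train'), count_records('eval')))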

Generating statistics using StatisticsGen

The StatisticsGen component generates data statistics that can be used by other TFX components. StatisticsGen uses TensorFlow Data Validation and generates statistics for each split in the ExampleGen component’s output. In our case there are two splits: train and eval.

Configure and run the StatisticsGen component

statistics_gen = tfx.components.StatisticsGen(
    examples=example_gen.outputs['examples'])
context.run(statistics_gen)
ExecutionResult at 0x7f8643cbe450
.execution_id: 2
.component.inputs: ['examples']
.component.outputs: ['statistics']

Visualize statistics

The generated statistics can be visualized using the tfdv.visualize_statistics() function from the TensorFlow Data Validation library or using a utility method of the InteractiveContext object. In fact, most of the artifacts generated by the TFX components can be visualized using InteractiveContext.

context.show(statistics_gen.outputs['statistics'])

Artifact at /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/StatisticsGen/statistics/2

'train' split:

WARNING:tensorflow:From /home/jupyter/.local/lib/python3.7/site-packages/tensorflow_data_validation/utils/stats_util.py:247: tf_record_iterator (from tensorflow.python.lib.io.tf_record) is deprecated and will be removed in a future version.
Instructions for updating:
Use eager execution and: 
`tf.data.TFRecordDataset(path)`


(Facets visualization of the 'train' split statistics renders here.)
'eval' split:

(Facets visualization of the 'eval' split statistics renders here.)
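
If you prefer to work with the TFDV library directly, you can load the statistics artifact produced by StatisticsGen and pass it to tfdv.visualize_statistics(). A minimal sketch, assuming the stats_tfrecord file layout that this version of StatisticsGen writes under each split directory:

# Load the 'train' split statistics from the StatisticsGen artifact
# (the stats_tfrecord path is an assumption about this version's layout).
train_stats = tfdv.load_statistics(
    os.path.join(statistics_gen.outputs['statistics'].get()[0].uri,
                 'train', 'stats_tfrecord'))
tfdv.visualize_statistics(train_stats)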

Inferring data schema using SchemaGen

Some TFX components use a description of the input data called a schema. The schema is an instance of schema.proto. It can specify data types for feature values, whether a feature has to be present in all examples, allowed value ranges, and other properties. SchemaGen automatically generates the schema by inferring types, categories, and ranges from the data statistics. The auto-generated schema is best-effort and only tries to infer basic properties of the data; developers are expected to review and modify it as needed. SchemaGen uses TensorFlow Data Validation.

The SchemaGen component generates the schema using the statistics for the train split. The statistics for other splits are ignored.

Configure and run the SchemaGen component

schema_gen = SchemaGen(
    statistics=statistics_gen.outputs['statistics'],
    infer_feature_shape=False)
context.run(schema_gen)
2024-03-24 14:16:17.415869: W ml_metadata/metadata_store/rdbms_metadata_access_object.cc:581] No property is defined for the Type
ExecutionResult at 0x7f8648387f90
.execution_id: 3
.component.inputs: ['statistics']
.component.outputs: ['schema']

Visualize the inferred schema

context.show(schema_gen.outputs['schema'])

Artifact at /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/SchemaGen/schema/3

Feature name                            Type    Presence  Valency  Domain
'Aspect'                                INT     required  single   -
'Cover_Type'                            INT     required  single   -
'Elevation'                             INT     required  single   -
'Hillshade_3pm'                         INT     required  single   -
'Hillshade_9am'                         INT     required  single   -
'Hillshade_Noon'                        INT     required  single   -
'Horizontal_Distance_To_Fire_Points'    INT     required  single   -
'Horizontal_Distance_To_Hydrology'      INT     required  single   -
'Horizontal_Distance_To_Roadways'       INT     required  single   -
'Slope'                                 INT     required  single   -
'Soil_Type'                             STRING  required  single   'Soil_Type'
'Vertical_Distance_To_Hydrology'        INT     required  single   -
'Wilderness_Area'                       STRING  required  single   'Wilderness_Area'
/home/jupyter/.local/lib/python3.7/site-packages/tensorflow_data_validation/utils/display_util.py:151: FutureWarning: Passing a negative integer is deprecated in version 1.0 and will not be supported in future version. Instead, use None to not limit the column width.
  pd.set_option('max_colwidth', -1)
Domain               Values
'Soil_Type'          'C2702', 'C2703', 'C2704', 'C2705', 'C2706', 'C2717', 'C3501', 'C3502', 'C4201', 'C4703', 'C4704', 'C4744', 'C4758', 'C5101', 'C5151', 'C6101', 'C6102', 'C6731', 'C7101', 'C7102', 'C7103', 'C7201', 'C7202', 'C7700', 'C7701', 'C7702', 'C7709', 'C7710', 'C7745', 'C7746', 'C7755', 'C7756', 'C7757', 'C7790', 'C8703', 'C8707', 'C8708', 'C8771', 'C8772', 'C8776'
'Wilderness_Area'    'Cache', 'Commanche', 'Neota', 'Rawah'

Updating the auto-generated schema

In most cases the auto-generated schema must be fine-tuned manually using insights from data exploration and/or domain knowledge about the data. For example, you know that in the covertype dataset there are seven types of forest cover (coded in this dataset as integers in the 0-6 range) and that the value of the Slope feature should be in the 0-90 range. You can manually add these constraints to the auto-generated schema by setting the feature domain.

Load the auto-generated schema proto file

schema_proto_path = '{}/{}'.format(schema_gen.outputs['schema'].get()[0].uri, 'schema.pbtxt')
schema = tfdv.load_schema_text(schema_proto_path)

Modify the schema

You can use the protocol buffer APIs to modify the schema.

Hint: Review the TFDV library API documentation on setting a feature’s domain, and the Tensorflow Metadata proto definition for configuration options.

# Restrict the categorical feature Cover_Type to the integer range [0, 6].
tfdv.set_domain(
    schema,
    "Cover_Type",
    schema_pb2.IntDomain(name="Cover_Type", min=0, max=6, is_categorical=True)
)
# Restrict the numeric feature Slope to the range [0, 90].
tfdv.set_domain(
    schema,
    "Slope",
    schema_pb2.IntDomain(name="Slope", min=0, max=90)
)

tfdv.display_schema(schema=schema)
Feature name                            Type    Presence  Valency  Domain
'Aspect'                                INT     required  single   -
'Cover_Type'                            INT     required  single   [0,6]
'Elevation'                             INT     required  single   -
'Hillshade_3pm'                         INT     required  single   -
'Hillshade_9am'                         INT     required  single   -
'Hillshade_Noon'                        INT     required  single   -
'Horizontal_Distance_To_Fire_Points'    INT     required  single   -
'Horizontal_Distance_To_Hydrology'      INT     required  single   -
'Horizontal_Distance_To_Roadways'       INT     required  single   -
'Slope'                                 INT     required  single   [0,90]
'Soil_Type'                             STRING  required  single   'Soil_Type'
'Vertical_Distance_To_Hydrology'        INT     required  single   -
'Wilderness_Area'                       STRING  required  single   'Wilderness_Area'
/home/jupyter/.local/lib/python3.7/site-packages/tensorflow_data_validation/utils/display_util.py:151: FutureWarning: Passing a negative integer is deprecated in version 1.0 and will not be supported in future version. Instead, use None to not limit the column width.
  pd.set_option('max_colwidth', -1)
Domain               Values
'Soil_Type'          'C2702', 'C2703', 'C2704', 'C2705', 'C2706', 'C2717', 'C3501', 'C3502', 'C4201', 'C4703', 'C4704', 'C4744', 'C4758', 'C5101', 'C5151', 'C6101', 'C6102', 'C6731', 'C7101', 'C7102', 'C7103', 'C7201', 'C7202', 'C7700', 'C7701', 'C7702', 'C7709', 'C7710', 'C7745', 'C7746', 'C7755', 'C7756', 'C7757', 'C7790', 'C8703', 'C8707', 'C8708', 'C8771', 'C8772', 'C8776'
'Wilderness_Area'    'Cache', 'Commanche', 'Neota', 'Rawah'
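
Before saving the updated schema, you can sanity-check the new constraints by validating the training statistics against it with TFDV. A minimal sketch, assuming train_stats was loaded as in the earlier visualization sketch; with the domains set above, no anomalies should be reported:

# Validate the training statistics against the hand-tuned schema.
anomalies = tfdv.validate_statistics(statistics=train_stats, schema=schema)
tfdv.display_anomalies(anomalies)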

Save the updated schema

schema_dir = os.path.join(ARTIFACT_STORE, 'schema')
tf.io.gfile.makedirs(schema_dir)
schema_file = os.path.join(schema_dir, 'schema.pbtxt')

tfdv.write_schema_text(schema, schema_file)

!cat {schema_file}
feature {
  name: "Aspect"
  value_count {
    min: 1
    max: 1
  }
  type: INT
  presence {
    min_fraction: 1.0
    min_count: 1
  }
}
feature {
  name: "Cover_Type"
  value_count {
    min: 1
    max: 1
  }
  type: INT
  int_domain {
    name: "Cover_Type"
    min: 0
    max: 6
    is_categorical: true
  }
  presence {
    min_fraction: 1.0
    min_count: 1
  }
}
feature {
  name: "Elevation"
  value_count {
    min: 1
    max: 1
  }
  type: INT
  presence {
    min_fraction: 1.0
    min_count: 1
  }
}
feature {
  name: "Hillshade_3pm"
  value_count {
    min: 1
    max: 1
  }
  type: INT
  presence {
    min_fraction: 1.0
    min_count: 1
  }
}
feature {
  name: "Hillshade_9am"
  value_count {
    min: 1
    max: 1
  }
  type: INT
  presence {
    min_fraction: 1.0
    min_count: 1
  }
}
feature {
  name: "Hillshade_Noon"
  value_count {
    min: 1
    max: 1
  }
  type: INT
  presence {
    min_fraction: 1.0
    min_count: 1
  }
}
feature {
  name: "Horizontal_Distance_To_Fire_Points"
  value_count {
    min: 1
    max: 1
  }
  type: INT
  presence {
    min_fraction: 1.0
    min_count: 1
  }
}
feature {
  name: "Horizontal_Distance_To_Hydrology"
  value_count {
    min: 1
    max: 1
  }
  type: INT
  presence {
    min_fraction: 1.0
    min_count: 1
  }
}
feature {
  name: "Horizontal_Distance_To_Roadways"
  value_count {
    min: 1
    max: 1
  }
  type: INT
  presence {
    min_fraction: 1.0
    min_count: 1
  }
}
feature {
  name: "Slope"
  value_count {
    min: 1
    max: 1
  }
  type: INT
  int_domain {
    name: "Slope"
    min: 0
    max: 90
  }
  presence {
    min_fraction: 1.0
    min_count: 1
  }
}
feature {
  name: "Soil_Type"
  value_count {
    min: 1
    max: 1
  }
  type: BYTES
  domain: "Soil_Type"
  presence {
    min_fraction: 1.0
    min_count: 1
  }
}
feature {
  name: "Vertical_Distance_To_Hydrology"
  value_count {
    min: 1
    max: 1
  }
  type: INT
  presence {
    min_fraction: 1.0
    min_count: 1
  }
}
feature {
  name: "Wilderness_Area"
  value_count {
    min: 1
    max: 1
  }
  type: BYTES
  domain: "Wilderness_Area"
  presence {
    min_fraction: 1.0
    min_count: 1
  }
}
string_domain {
  name: "Soil_Type"
  value: "C2702"
  value: "C2703"
  value: "C2704"
  value: "C2705"
  value: "C2706"
  value: "C2717"
  value: "C3501"
  value: "C3502"
  value: "C4201"
  value: "C4703"
  value: "C4704"
  value: "C4744"
  value: "C4758"
  value: "C5101"
  value: "C5151"
  value: "C6101"
  value: "C6102"
  value: "C6731"
  value: "C7101"
  value: "C7102"
  value: "C7103"
  value: "C7201"
  value: "C7202"
  value: "C7700"
  value: "C7701"
  value: "C7702"
  value: "C7709"
  value: "C7710"
  value: "C7745"
  value: "C7746"
  value: "C7755"
  value: "C7756"
  value: "C7757"
  value: "C7790"
  value: "C8703"
  value: "C8707"
  value: "C8708"
  value: "C8771"
  value: "C8772"
  value: "C8776"
}
string_domain {
  name: "Wilderness_Area"
  value: "Cache"
  value: "Commanche"
  value: "Neota"
  value: "Rawah"
}
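
Before moving on, you can round-trip the file you just wrote as a quick sanity check (a minimal sketch; tfdv.load_schema_text is the counterpart of the tfdv.write_schema_text call above):

# Reload the schema from disk and verify it matches the in-memory proto.
reloaded_schema = tfdv.load_schema_text(schema_file)
assert reloaded_schema == schema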

Importing the updated schema using ImporterNode

The ImporterNode component allows you to import an external artifact, including the schema file, so it can be used by other TFX components in your workflow.

Configure and run the ImporterNode component

schema_importer = ImporterNode(
    instance_name='Schema_Importer',
    source_uri=schema_dir,
    artifact_type=tfx.types.standard_artifacts.Schema,
    reimport=False)
WARNING:absl:`instance_name` is deprecated, please set node id directly using`with_id()` or `.id` setter.
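
For reference, the warning above points at the non-deprecated way to name the node; an equivalent configuration would set the node id instead (a sketch, assuming your TFX version supports with_id as the warning suggests):

# Equivalent configuration without the deprecated instance_name argument.
schema_importer = ImporterNode(
    source_uri=schema_dir,
    artifact_type=tfx.types.standard_artifacts.Schema,
    reimport=False).with_id('Schema_Importer')
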
context.run(schema_importer)
ExecutionResult at 0x7f864832c550
.execution_id 4
.component <tfx.components.common_nodes.importer_node.ImporterNode object at 0x7f862e2480d0>
.component.inputs {}
.component.outputs ['result']

Visualize the imported schema

context.show(schema_importer.outputs['result'])

Artifact at /home/jupyter/artifact-store/schema

Type Presence Valency Domain
Feature name
'Aspect' INT required single -
'Cover_Type' INT required single [0,6]
'Elevation' INT required single -
'Hillshade_3pm' INT required single -
'Hillshade_9am' INT required single -
'Hillshade_Noon' INT required single -
'Horizontal_Distance_To_Fire_Points' INT required single -
'Horizontal_Distance_To_Hydrology' INT required single -
'Horizontal_Distance_To_Roadways' INT required single -
'Slope' INT required single [0,90]
'Soil_Type' STRING required single 'Soil_Type'
'Vertical_Distance_To_Hydrology' INT required single -
'Wilderness_Area' STRING required single 'Wilderness_Area'
Values
Domain
'Soil_Type' 'C2702', 'C2703', 'C2704', 'C2705', 'C2706', 'C2717', 'C3501', 'C3502', 'C4201', 'C4703', 'C4704', 'C4744', 'C4758', 'C5101', 'C5151', 'C6101', 'C6102', 'C6731', 'C7101', 'C7102', 'C7103', 'C7201', 'C7202', 'C7700', 'C7701', 'C7702', 'C7709', 'C7710', 'C7745', 'C7746', 'C7755', 'C7756', 'C7757', 'C7790', 'C8703', 'C8707', 'C8708', 'C8771', 'C8772', 'C8776'
'Wilderness_Area' 'Cache', 'Commanche', 'Neota', 'Rawah'

Validating data with ExampleValidator

The ExampleValidator component identifies anomalies in data. It identifies anomalies by comparing data statistics computed by the StatisticsGen component against a schema generated by SchemaGen or imported by ImporterNode.

ExampleValidator can detect different classes of anomalies. For example it can:

- perform validity checks by comparing data statistics against a schema that codifies your expectations of the data
- detect training-serving skew by comparing training and serving data
- detect data drift by looking at a series of data spans

The ExampleValidator component checks every split produced upstream; as the output below shows, both the train and eval splits are validated against the schema.

Configure and run the ExampleValidator component

# TODO: Complete ExampleValidator
# Hint: review the visual above and review the documentation on ExampleValidator's inputs and outputs: 
# https://www.tensorflow.org/tfx/guide/exampleval
# Make sure you use the output of the schema_importer component created above.

example_validator = ExampleValidator(
    instance_name="Data_Validator",
    statistics=statistics_gen.outputs["statistics"],
    schema=schema_importer.outputs["result"]
)
WARNING:absl:`instance_name` is deprecated, please set node id directly using`with_id()` or `.id` setter.
context.run(example_validator)
ExecutionResult at 0x7f862e191610
.execution_id 5
.component ExampleValidator
.component.inputs ['statistics'], ['schema']
.component.outputs ['anomalies']

Examine the output of ExampleValidator

The output artifact of the ExampleValidator is the anomalies.pbtxt file describing an anomalies_pb2.Anomalies protobuf.

train_uri = example_validator.outputs['anomalies'].get()[0].uri
train_anomalies_filename = os.path.join(train_uri, "train/anomalies.pbtxt")
!cat $train_anomalies_filename
baseline {
  feature {
    name: "Aspect"
    value_count {
      min: 1
      max: 1
    }
    type: INT
    presence {
      min_fraction: 1.0
      min_count: 1
    }
  }
  feature {
    name: "Cover_Type"
    value_count {
      min: 1
      max: 1
    }
    type: INT
    int_domain {
      name: "Cover_Type"
      min: 0
      max: 6
      is_categorical: true
    }
    presence {
      min_fraction: 1.0
      min_count: 1
    }
  }
  feature {
    name: "Elevation"
    value_count {
      min: 1
      max: 1
    }
    type: INT
    presence {
      min_fraction: 1.0
      min_count: 1
    }
  }
  feature {
    name: "Hillshade_3pm"
    value_count {
      min: 1
      max: 1
    }
    type: INT
    presence {
      min_fraction: 1.0
      min_count: 1
    }
  }
  feature {
    name: "Hillshade_9am"
    value_count {
      min: 1
      max: 1
    }
    type: INT
    presence {
      min_fraction: 1.0
      min_count: 1
    }
  }
  feature {
    name: "Hillshade_Noon"
    value_count {
      min: 1
      max: 1
    }
    type: INT
    presence {
      min_fraction: 1.0
      min_count: 1
    }
  }
  feature {
    name: "Horizontal_Distance_To_Fire_Points"
    value_count {
      min: 1
      max: 1
    }
    type: INT
    presence {
      min_fraction: 1.0
      min_count: 1
    }
  }
  feature {
    name: "Horizontal_Distance_To_Hydrology"
    value_count {
      min: 1
      max: 1
    }
    type: INT
    presence {
      min_fraction: 1.0
      min_count: 1
    }
  }
  feature {
    name: "Horizontal_Distance_To_Roadways"
    value_count {
      min: 1
      max: 1
    }
    type: INT
    presence {
      min_fraction: 1.0
      min_count: 1
    }
  }
  feature {
    name: "Slope"
    value_count {
      min: 1
      max: 1
    }
    type: INT
    int_domain {
      name: "Slope"
      min: 0
      max: 90
    }
    presence {
      min_fraction: 1.0
      min_count: 1
    }
  }
  feature {
    name: "Soil_Type"
    value_count {
      min: 1
      max: 1
    }
    type: BYTES
    domain: "Soil_Type"
    presence {
      min_fraction: 1.0
      min_count: 1
    }
  }
  feature {
    name: "Vertical_Distance_To_Hydrology"
    value_count {
      min: 1
      max: 1
    }
    type: INT
    presence {
      min_fraction: 1.0
      min_count: 1
    }
  }
  feature {
    name: "Wilderness_Area"
    value_count {
      min: 1
      max: 1
    }
    type: BYTES
    domain: "Wilderness_Area"
    presence {
      min_fraction: 1.0
      min_count: 1
    }
  }
  string_domain {
    name: "Soil_Type"
    value: "C2702"
    value: "C2703"
    value: "C2704"
    value: "C2705"
    value: "C2706"
    value: "C2717"
    value: "C3501"
    value: "C3502"
    value: "C4201"
    value: "C4703"
    value: "C4704"
    value: "C4744"
    value: "C4758"
    value: "C5101"
    value: "C5151"
    value: "C6101"
    value: "C6102"
    value: "C6731"
    value: "C7101"
    value: "C7102"
    value: "C7103"
    value: "C7201"
    value: "C7202"
    value: "C7700"
    value: "C7701"
    value: "C7702"
    value: "C7709"
    value: "C7710"
    value: "C7745"
    value: "C7746"
    value: "C7755"
    value: "C7756"
    value: "C7757"
    value: "C7790"
    value: "C8703"
    value: "C8707"
    value: "C8708"
    value: "C8771"
    value: "C8772"
    value: "C8776"
  }
  string_domain {
    name: "Wilderness_Area"
    value: "Cache"
    value: "Commanche"
    value: "Neota"
    value: "Rawah"
  }
}
anomaly_name_format: SERIALIZED_PATH

Visualize validation results

The file anomalies.pbtxt can be visualized using context.show.

context.show(example_validator.outputs['anomalies'])

Artifact at /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/ExampleValidator.Data_Validator/anomalies/5

'train' split:

No anomalies found.

'eval' split:

No anomalies found.

In our case, no anomalies were detected in either split.
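
If you prefer a programmatic check over the visualization, you can parse the text proto directly (a small optional sketch using the anomalies_pb2 import from the setup cell):

from google.protobuf import text_format

# Parse the anomalies text proto emitted by ExampleValidator.
anomalies = anomalies_pb2.Anomalies()
with tf.io.gfile.GFile(train_anomalies_filename) as f:
    text_format.Parse(f.read(), anomalies)
print(len(anomalies.anomaly_info))  # 0 when no anomalies were found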

For a detailed deep dive into data validation and schema generation refer to the lab-31-tfdv-structured-data lab.

Preprocessing data with Transform

The Transform component performs data transformation and feature engineering. The Transform component consumes tf.Examples emitted from the ExampleGen component and emits the transformed feature data and the SavedModel graph that was used to process the data. The emitted SavedModel can then be used by serving components to make sure that the same data pre-processing logic is applied at training and serving.

The Transform component requires more code than many other components because of the arbitrary complexity of the feature engineering that you may need for the data and/or model that you’re working with. It requires code files to be available which define the processing needed.

Define the pre-processing module

To configure Transform, you need to encapsulate your pre-processing code in the Python preprocessing_fn function and save it to a Python module that is then provided to the Transform component as an input. This module will be loaded by the Transform component, and the preprocessing_fn function will be called when the component runs.

In most cases, your implementation of the preprocessing_fn makes extensive use of TensorFlow Transform for performing feature engineering on your dataset.

TRANSFORM_MODULE = 'preprocessing.py'
!cat {TRANSFORM_MODULE}
# Copyright 2021 Google LLC. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Covertype preprocessing.
This file defines a template for TFX Transform component.
"""

import tensorflow as tf
import tensorflow_transform as tft

import features

def _fill_in_missing(x):
  """Replace missing values in a SparseTensor.
  Fills in missing values of `x` with '' or 0, and converts to a dense tensor.
  Args:
    x: A `SparseTensor` of rank 2.  Its dense shape should have size at most 1
      in the second dimension.
  Returns:
    A rank 1 tensor where missing values of `x` have been filled in.
  """
  default_value = '' if x.dtype == tf.string else 0
  return tf.squeeze(
      tf.sparse.to_dense(
          tf.SparseTensor(x.indices, x.values, [x.dense_shape[0], 1]),
          default_value),
      axis=1)

def preprocessing_fn(inputs):
  """Preprocesses Covertype Dataset."""

  outputs = {}

  # Scale numerical features.
  for key in features.NUMERIC_FEATURE_KEYS:
    outputs[features.transformed_name(key)] = tft.scale_to_z_score(
        _fill_in_missing(inputs[key]))

  # Generate vocabularies and maps categorical features.
  for key in features.CATEGORICAL_FEATURE_KEYS:
    outputs[features.transformed_name(key)] = tft.compute_and_apply_vocabulary(
        x=_fill_in_missing(inputs[key]), num_oov_buckets=1, vocab_filename=key)

  # Convert Cover_Type to dense tensor.
  outputs[features.transformed_name(features.LABEL_KEY)] = _fill_in_missing(
      inputs[features.LABEL_KEY])

  return outputs

Configure and run the Transform component.

transform = Transform(
    examples=example_gen.outputs['examples'],
    schema=schema_importer.outputs['result'],
    module_file=TRANSFORM_MODULE)
context.run(transform)
WARNING:tensorflow:From /home/jupyter/.local/lib/python3.7/site-packages/tfx/components/transform/executor.py:528: Schema (from tensorflow_transform.tf_metadata.dataset_schema) is deprecated and will be removed in a future version.
Instructions for updating:
Schema is a deprecated, use schema_utils.schema_from_feature_spec to create a `Schema`
2024-03-24 14:20:54.914486: W ml_metadata/metadata_store/rdbms_metadata_access_object.cc:581] No property is defined for the Type
2024-03-24 14:20:54.918839: W ml_metadata/metadata_store/rdbms_metadata_access_object.cc:581] No property is defined for the Type
WARNING:tensorflow:From /home/jupyter/.local/lib/python3.7/site-packages/tensorflow_transform/tf_utils.py:250: Tensor.experimental_ref (from tensorflow.python.framework.ops) is deprecated and will be removed in a future version.
Instructions for updating:
Use ref() instead.
WARNING:tensorflow:TFT beam APIs accept both the TFXIO format and the instance dict format now. There is no need to set use_tfxio any more and it will be removed soon.
WARNING:root:This output type hint will be ignored and not used for type-checking purposes. Typically, output type hints for a PTransform are single (or nested) types wrapped by a PCollection, PDone, or None. Got: Tuple[Dict[str, Union[NoneType, _Dataset]], Union[Dict[str, Dict[str, PCollection]], NoneType]] instead.
WARNING:tensorflow:Tensorflow version (2.3.1) found. Note that Tensorflow Transform support for TF 2.0 is currently in beta, and features such as tf.function may not work as intended.
WARNING:tensorflow:From /home/jupyter/.local/lib/python3.7/site-packages/tensorflow/python/saved_model/signature_def_utils_impl.py:201: build_tensor_info (from tensorflow.python.saved_model.utils_impl) is deprecated and will be removed in a future version.
Instructions for updating:
This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.utils.build_tensor_info or tf.compat.v1.saved_model.build_tensor_info.
INFO:tensorflow:Assets added to graph.
INFO:tensorflow:No assets to write.
WARNING:tensorflow:Issue encountered when serializing tft_mapper_use.
Type is unsupported, or the types of the items don't match field type in CollectionDef. Note this is a warning and probably safe to ignore.
'Counter' object has no attribute 'name'
INFO:tensorflow:SavedModel written to: /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/Transform/transform_graph/6/.temp_path/tftransform_tmp/542f0c1f8ca8462d80573de97a664857/saved_model.pb
INFO:tensorflow:SavedModel written to: /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/Transform/transform_graph/6/.temp_path/tftransform_tmp/f77080f04c224c0f9d12a1713b444703/saved_model.pb
WARNING:apache_beam.typehints.typehints:Ignoring send_type hint: <class 'NoneType'>
WARNING:apache_beam.typehints.typehints:Ignoring return_type hint: <class 'NoneType'>
INFO:tensorflow:Saver not created because there are no variables in the graph to restore
INFO:tensorflow:Assets written to: /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/Transform/transform_graph/6/.temp_path/tftransform_tmp/8eda653e36fc492dbd6b1785b70b5a39/assets
INFO:tensorflow:SavedModel written to: /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/Transform/transform_graph/6/.temp_path/tftransform_tmp/8eda653e36fc492dbd6b1785b70b5a39/saved_model.pb
WARNING:tensorflow:Expected binary or unicode string, got type_url: "type.googleapis.com/tensorflow.AssetFileDef"
value: "\n\013\n\tConst_3:0\022\017Wilderness_Area"
WARNING:tensorflow:Expected binary or unicode string, got type_url: "type.googleapis.com/tensorflow.AssetFileDef"
value: "\n\013\n\tConst_5:0\022\tSoil_Type"
ExecutionResult at 0x7f862e0e4750
.execution_id 6
.component Transform
.component.inputs ['examples'], ['schema']
.component.outputs ['transform_graph'], ['transformed_examples'], ['updated_analyzer_cache']

Examine the Transform component’s outputs

The Transform component has 2 primary outputs:

- transform_graph: the graph that performs the pre-processing operations; it is reused at both training and serving time
- transformed_examples: the pre-processed training and evaluation data

(As the execution result above shows, an updated_analyzer_cache output is also emitted, which speeds up repeated analyzer passes.)

Take a peek at the transform_graph artifact: it points to a directory containing 3 subdirectories:

os.listdir(transform.outputs['transform_graph'].get()[0].uri)
['transform_fn', 'transformed_metadata', 'metadata']
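
The same directory can be loaded with tft.TFTransformOutput to inspect the post-transform feature spec (an optional sketch):

# Load the transform graph artifact and print the transformed feature spec.
tft_output = tft.TFTransformOutput(transform.outputs['transform_graph'].get()[0].uri)
pprint(tft_output.transformed_feature_spec())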

And the transformed_examples artifact:

os.listdir(transform.outputs['transformed_examples'].get()[0].uri)
['eval', 'train']
transform_uri = transform.outputs['transformed_examples'].get()[0].uri
tfrecord_filenames = [os.path.join(transform_uri,  'train', name)
                      for name in os.listdir(os.path.join(transform_uri, 'train'))]
dataset = tf.data.TFRecordDataset(tfrecord_filenames, compression_type="GZIP")
for tfrecord in dataset.take(2):
  example = tf.train.Example()
  example.ParseFromString(tfrecord.numpy())
  for name, feature in example.features.feature.items():
    if feature.HasField('bytes_list'):
        value = feature.bytes_list.value
    if feature.HasField('float_list'):
        value = feature.float_list.value
    if feature.HasField('int64_list'):
        value = feature.int64_list.value
    print('{}: {}'.format(name, value))
  print('******')
Horizontal_Distance_To_Fire_Points_xf: [-0.08278580754995346]
Aspect_xf: [0.24982304871082306]
Horizontal_Distance_To_Hydrology_xf: [1.7856396436691284]
Vertical_Distance_To_Hydrology_xf: [0.950199544429779]
Soil_Type_xf: [3]
Wilderness_Area_xf: [1]
Horizontal_Distance_To_Roadways_xf: [-1.0252416133880615]
Hillshade_3pm_xf: [0.3838464617729187]
Slope_xf: [-0.6797884106636047]
Elevation_xf: [0.651531994342804]
Cover_Type_xf: [1]
Hillshade_9am_xf: [0.4025527238845825]
Hillshade_Noon_xf: [1.1993097066879272]
******
Vertical_Distance_To_Hydrology_xf: [-0.6439017057418823]
Hillshade_Noon_xf: [0.1884450614452362]
Hillshade_3pm_xf: [-0.9800231456756592]
Slope_xf: [0.25757989287376404]
Cover_Type_xf: [2]
Hillshade_9am_xf: [1.2273873090744019]
Horizontal_Distance_To_Roadways_xf: [-1.4311398267745972]
Horizontal_Distance_To_Fire_Points_xf: [-1.1525152921676636]
Aspect_xf: [-0.2777789831161499]
Horizontal_Distance_To_Hydrology_xf: [-0.9832831621170044]
Soil_Type_xf: [19]
Wilderness_Area_xf: [2]
Elevation_xf: [-3.5473971366882324]
******
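
Notice that the numeric features now carry the _xf suffix and are z-scored, while the categorical features have been mapped to integer vocabulary indices. If you want to count the transformed training records (optional; this makes a full pass over the split):

# Optional: count the transformed training examples.
num_train = sum(1 for _ in tf.data.TFRecordDataset(
    tfrecord_filenames, compression_type='GZIP'))
print('{} transformed training examples'.format(num_train))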

Train your TensorFlow model with the Trainer component

The Trainer component trains a model using TensorFlow.

Trainer takes:

Define the trainer module

To configure Trainer, you need to encapsulate your training code in a Python module that is then provided to the Trainer as an input.

TRAINER_MODULE_FILE = 'model.py'
!cat {TRAINER_MODULE_FILE}
# Copyright 2021 Google LLC. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Covertype Keras WideDeep Classifier."""

import functools
import absl
import os
from typing import List, Text

import kerastuner
import tensorflow as tf
import tensorflow_model_analysis as tfma
import tensorflow_transform as tft
from tensorflow_transform.tf_metadata import schema_utils

from tfx.components.trainer.executor import TrainerFnArgs
from tfx.components.trainer.fn_args_utils import DataAccessor
from tfx.components.tuner.component import TunerFnResult
from tfx_bsl.tfxio import dataset_options

import features

EPOCHS = 1
TRAIN_BATCH_SIZE = 64
EVAL_BATCH_SIZE = 64


def _gzip_reader_fn(filenames):
  """Small utility returning a record reader that can read gzip'ed files."""
  return tf.data.TFRecordDataset(filenames, compression_type='GZIP')


def _get_serve_tf_examples_fn(model, tf_transform_output):
  """Returns a function that parses a serialized tf.Example and applies TFT."""

  model.tft_layer = tf_transform_output.transform_features_layer()

  @tf.function
  def serve_tf_examples_fn(serialized_tf_examples):
    """Returns the output to be used in the serving signature."""
    feature_spec = tf_transform_output.raw_feature_spec()
    feature_spec.pop(features.LABEL_KEY)
    parsed_features = tf.io.parse_example(serialized_tf_examples, feature_spec)

    transformed_features = model.tft_layer(parsed_features)

    return model(transformed_features)

  return serve_tf_examples_fn


def _input_fn(file_pattern: List[Text],
              data_accessor: DataAccessor,
              tf_transform_output: tft.TFTransformOutput,
              batch_size: int = 200) -> tf.data.Dataset:
  """Generates features and label for tuning/training.

  Args:
    file_pattern: List of paths or patterns of input tfrecord files.
    data_accessor: DataAccessor for converting input to RecordBatch.
    tf_transform_output: A TFTransformOutput.
    batch_size: representing the number of consecutive elements of returned
      dataset to combine in a single batch

  Returns:
    A dataset that contains (features, indices) tuple where features is a
      dictionary of Tensors, and indices is a single Tensor of label indices.
  """
  dataset = data_accessor.tf_dataset_factory(
      file_pattern,
      dataset_options.TensorFlowDatasetOptions(
          batch_size=batch_size, label_key=features.transformed_name(features.LABEL_KEY)),
      tf_transform_output.transformed_metadata.schema)
    
  return dataset


def _get_hyperparameters() -> kerastuner.HyperParameters:
  """Returns hyperparameters for building Keras model."""
  hp = kerastuner.HyperParameters()
  # Defines search space.
  hp.Choice('learning_rate', [1e-2, 1e-3, 1e-4], default=1e-3)
  hp.Int('n_layers', 1, 2, default=1)
  with hp.conditional_scope('n_layers', 1):
        hp.Int('n_units_1', min_value=8, max_value=128, step=8, default=8)
  with hp.conditional_scope('n_layers', 2):
        hp.Int('n_units_1', min_value=8, max_value=128, step=8, default=8)
        hp.Int('n_units_2', min_value=8, max_value=128, step=8, default=8)        

  return hp


def _build_keras_model(hparams: kerastuner.HyperParameters, 
                       tf_transform_output: tft.TFTransformOutput) -> tf.keras.Model:
  """Creates a Keras WideDeep Classifier model.
  Args:
    hparams: Holds HyperParameters for tuning.
    tf_transform_output: A TFTransformOutput.
  Returns:
    A keras Model.
  """
  deep_columns = [
      tf.feature_column.numeric_column(
          key=features.transformed_name(key), 
          shape=())
      for key in features.NUMERIC_FEATURE_KEYS
  ]
    
  input_layers = {
      column.key: tf.keras.layers.Input(name=column.key, shape=(), dtype=tf.float32)
      for column in deep_columns
  }    

  categorical_columns = [
      tf.feature_column.categorical_column_with_identity(
          key=features.transformed_name(key), 
          num_buckets=tf_transform_output.num_buckets_for_transformed_feature(features.transformed_name(key)), 
          default_value=0)
      for key in features.CATEGORICAL_FEATURE_KEYS
  ]

  wide_columns = [
      tf.feature_column.indicator_column(categorical_column)
      for categorical_column in categorical_columns
  ]
    
  input_layers.update({
      column.categorical_column.key: tf.keras.layers.Input(name=column.categorical_column.key, shape=(), dtype=tf.int32)
      for column in wide_columns
  })


  deep = tf.keras.layers.DenseFeatures(deep_columns)(input_layers)
  for n in range(int(hparams.get('n_layers'))):
    deep = tf.keras.layers.Dense(units=hparams.get('n_units_' + str(n + 1)))(deep)

  wide = tf.keras.layers.DenseFeatures(wide_columns)(input_layers)

  output = tf.keras.layers.Dense(features.NUM_CLASSES, activation='softmax')(
               tf.keras.layers.concatenate([deep, wide]))

  model = tf.keras.Model(input_layers, output)
  model.compile(
      loss='sparse_categorical_crossentropy',
      optimizer=tf.keras.optimizers.Adam(lr=hparams.get('learning_rate')),
      metrics=[tf.keras.metrics.SparseCategoricalAccuracy()])
  model.summary(print_fn=absl.logging.info)

  return model    


# TFX Tuner will call this function.
def tuner_fn(fn_args: TrainerFnArgs) -> TunerFnResult:
  """Build the tuner using the KerasTuner API.
  Args:
    fn_args: Holds args as name/value pairs.
      - working_dir: working dir for tuning.
      - train_files: List of file paths containing training tf.Example data.
      - eval_files: List of file paths containing eval tf.Example data.
      - train_steps: number of train steps.
      - eval_steps: number of eval steps.
      - schema_path: optional schema of the input data.
      - transform_graph_path: optional transform graph produced by TFT.
  Returns:
    A namedtuple contains the following:
      - tuner: A BaseTuner that will be used for tuning.
      - fit_kwargs: Args to pass to tuner's run_trial function for fitting the
                    model , e.g., the training and validation dataset. Required
                    args depend on the above tuner's implementation.
  """
  transform_graph = tft.TFTransformOutput(fn_args.transform_graph_path)
  
  # Construct a build_keras_model_fn that just takes hyperparams from get_hyperparameters as input.
  build_keras_model_fn = functools.partial(
      _build_keras_model, tf_transform_output=transform_graph)  

  # BayesianOptimization is a subclass of kerastuner.Tuner which inherits from BaseTuner.    
  tuner = kerastuner.BayesianOptimization(
      build_keras_model_fn,
      max_trials=10,
      hyperparameters=_get_hyperparameters(),
      # New entries allowed for n_units hyperparameter construction conditional on n_layers selected.
#       allow_new_entries=True,
#       tune_new_entries=True,
      objective=kerastuner.Objective('val_sparse_categorical_accuracy', 'max'),
      directory=fn_args.working_dir,
      project_name='covertype_tuning')
  
  train_dataset = _input_fn(
      fn_args.train_files,
      fn_args.data_accessor,
      transform_graph,
      batch_size=TRAIN_BATCH_SIZE)

  eval_dataset = _input_fn(
      fn_args.eval_files,
      fn_args.data_accessor,
      transform_graph,
      batch_size=EVAL_BATCH_SIZE)

  return TunerFnResult(
      tuner=tuner,
      fit_kwargs={
          'x': train_dataset,
          'validation_data': eval_dataset,
          'steps_per_epoch': fn_args.train_steps,
          'validation_steps': fn_args.eval_steps
      })


# TFX Trainer will call this function.
def run_fn(fn_args: TrainerFnArgs):
  """Train the model based on given args.
  Args:
    fn_args: Holds args used to train the model as name/value pairs.
  """

  tf_transform_output = tft.TFTransformOutput(fn_args.transform_output)

  train_dataset = _input_fn(
      fn_args.train_files, 
      fn_args.data_accessor, 
      tf_transform_output, 
      TRAIN_BATCH_SIZE)

  eval_dataset = _input_fn(
      fn_args.eval_files, 
      fn_args.data_accessor,
      tf_transform_output, 
      EVAL_BATCH_SIZE)

  if fn_args.hyperparameters:
    hparams = kerastuner.HyperParameters.from_config(fn_args.hyperparameters)
  else:
    # This happens when the hyperparameters have been fixed and the Tuner
    # component has been removed from the pipeline. You can also inline the
    # hyperparameters directly in _build_keras_model.
    hparams = _get_hyperparameters()
  absl.logging.info('HyperParameters for training: %s' % hparams.get_config())
  
  # Distribute training over multiple replicas on the same machine.
  mirrored_strategy = tf.distribute.MirroredStrategy()
  with mirrored_strategy.scope():
        model = _build_keras_model(
            hparams=hparams,
            tf_transform_output=tf_transform_output)

  tensorboard_callback = tf.keras.callbacks.TensorBoard(
      log_dir=fn_args.model_run_dir, update_freq='batch')

  model.fit(
      train_dataset,
      epochs=EPOCHS,
      steps_per_epoch=fn_args.train_steps,
      validation_data=eval_dataset,
      validation_steps=fn_args.eval_steps,
      callbacks=[tensorboard_callback])
    
  signatures = {
      'serving_default':
          _get_serve_tf_examples_fn(model,
                                    tf_transform_output).get_concrete_function(
                                        tf.TensorSpec(
                                            shape=[None],
                                            dtype=tf.string,
                                            name='examples')),
  }
  
  model.save(fn_args.serving_model_dir, save_format='tf', signatures=signatures)
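
Once run_fn exports the model, the serving_default signature accepts a batch of serialized tf.Examples and applies the TFT layer before prediction. A hypothetical smoke test (serving_model_dir and example are placeholders for illustration, not variables defined in this lab):

# Hypothetical smoke test of the exported serving signature.
loaded = tf.saved_model.load(serving_model_dir)      # path assumed for illustration
predict_fn = loaded.signatures['serving_default']
serialized = example.SerializeToString()             # a raw tf.train.Example, assumed
print(predict_fn(examples=tf.constant([serialized])))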

Create and run the Trainer component

As of the 0.25.0 release of TFX, the Trainer component only supports passing a single field - num_steps - through the train_args and eval_args arguments.

trainer = Trainer(
    custom_executor_spec=executor_spec.ExecutorClassSpec(trainer_executor.GenericExecutor),
    module_file=TRAINER_MODULE_FILE,
    transformed_examples=transform.outputs.transformed_examples,
    schema=schema_importer.outputs.result,
    transform_graph=transform.outputs.transform_graph,
    train_args=trainer_pb2.TrainArgs(splits=['train'], num_steps=5000),
    eval_args=trainer_pb2.EvalArgs(splits=['eval'], num_steps=1000))
context.run(trainer)
2024-03-24 14:24:42.794142: W ml_metadata/metadata_store/rdbms_metadata_access_object.cc:581] No property is defined for the Type
2024-03-24 14:24:42.800231: W ml_metadata/metadata_store/rdbms_metadata_access_object.cc:581] No property is defined for the Type
WARNING:absl:Examples artifact does not have payload_format custom property. Falling back to FORMAT_TF_EXAMPLE
WARNING:tensorflow:There are non-GPU devices in `tf.distribute.Strategy`, not using nccl allreduce.
INFO:tensorflow:Using MirroredStrategy with devices ('/job:localhost/replica:0/task:0/device:CPU:0',)
2024-03-24 14:24:43.131438: I tensorflow/core/profiler/lib/profiler_session.cc:164] Profiler session started.

   1/5000 [..............................] - ETA: 0s - loss: 2.0798 - sparse_categorical_accuracy: 0.1562
WARNING:tensorflow:From /home/jupyter/.local/lib/python3.7/site-packages/tensorflow/python/ops/summary_ops_v2.py:1277: stop (from tensorflow.python.eager.profiler) is deprecated and will be removed after 2020-07-01.
Instructions for updating:
use `tf.profiler.experimental.stop` instead.
2024-03-24 14:24:46.085352: I tensorflow/core/profiler/lib/profiler_session.cc:164] Profiler session started.

   2/5000 [..............................] - ETA: 6:37 - loss: 2.0330 - sparse_categorical_accuracy: 0.1875
WARNING:tensorflow:Callbacks method `on_train_batch_end` is slow compared to the batch time (batch time: 0.0133s vs `on_train_batch_end` time: 0.1509s). Check your callbacks.
2024-03-24 14:24:46.145452: I tensorflow/core/profiler/rpc/client/save_profile.cc:176] Creating directory: /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/Trainer/model_run/7/train/plugins/profile/2024_03_24_14_24_46
Dumped gzipped tool data for trace.json.gz and memory_profile.json.gz, and tool data for xplane.pb, overview_page.pb, input_pipeline.pb, tensorflow_stats.pb, and kernel_stats.pb to /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/Trainer/model_run/7/train/plugins/profile/2024_03_24_14_24_46/
5000/5000 [==============================] - ETA: 1:20 - loss: 1.9535 - sparse_categorical_accuracy: 0.204 - ETA: 43s - loss: 1.8994 - sparse_categorical_accuracy: 0.246 - ETA: 36s - loss: 1.8682 - sparse_categorical_accuracy: 0.27 - ETA: 34s - loss: 1.8464 - sparse_categorical_accuracy: 0.29 - ETA: 29s - loss: 1.8006 - sparse_categorical_accuracy: 0.33 - ETA: 26s - loss: 1.7618 - sparse_categorical_accuracy: 0.36 - ETA: 25s - loss: 1.7299 - sparse_categorical_accuracy: 0.38 - ETA: 23s - loss: 1.6980 - sparse_categorical_accuracy: 0.40 - ETA: 22s - loss: 1.6588 - sparse_categorical_accuracy: 0.42 - ETA: 21s - loss: 1.6195 - sparse_categorical_accuracy: 0.44 - ETA: 20s - loss: 1.5884 - sparse_categorical_accuracy: 0.46 - ETA: 19s - loss: 1.5581 - sparse_categorical_accuracy: 0.47 - ETA: 19s - loss: 1.5238 - sparse_categorical_accuracy: 0.49 - ETA: 18s - loss: 1.4978 - sparse_categorical_accuracy: 0.50 - ETA: 18s - loss: 1.4714 - sparse_categorical_accuracy: 0.51 - ETA: 18s - loss: 1.4446 - sparse_categorical_accuracy: 0.52 - ETA: 17s - loss: 1.4225 - sparse_categorical_accuracy: 0.52 - ETA: 17s - loss: 1.4026 - sparse_categorical_accuracy: 0.53 - ETA: 17s - loss: 1.3796 - sparse_categorical_accuracy: 0.54 - ETA: 17s - loss: 1.3576 - sparse_categorical_accuracy: 0.55 - ETA: 17s - loss: 1.3386 - sparse_categorical_accuracy: 0.55 - ETA: 16s - loss: 1.3204 - sparse_categorical_accuracy: 0.56 - ETA: 16s - loss: 1.3011 - sparse_categorical_accuracy: 0.56 - ETA: 16s - loss: 1.2839 - sparse_categorical_accuracy: 0.57 - ETA: 16s - loss: 1.2678 - sparse_categorical_accuracy: 0.57 - ETA: 16s - loss: 1.2506 - sparse_categorical_accuracy: 0.58 - ETA: 15s - loss: 1.2353 - sparse_categorical_accuracy: 0.58 - ETA: 15s - loss: 1.2221 - sparse_categorical_accuracy: 0.58 - ETA: 15s - loss: 1.2077 - sparse_categorical_accuracy: 0.59 - ETA: 15s - loss: 1.1941 - sparse_categorical_accuracy: 0.59 - ETA: 15s - loss: 1.1833 - sparse_categorical_accuracy: 0.59 - ETA: 15s - loss: 1.1717 - sparse_categorical_accuracy: 0.59 - ETA: 15s - loss: 1.1603 - sparse_categorical_accuracy: 0.60 - ETA: 15s - loss: 1.1495 - sparse_categorical_accuracy: 0.60 - ETA: 14s - loss: 1.1398 - sparse_categorical_accuracy: 0.60 - ETA: 14s - loss: 1.1292 - sparse_categorical_accuracy: 0.61 - ETA: 14s - loss: 1.1202 - sparse_categorical_accuracy: 0.61 - ETA: 14s - loss: 1.1112 - sparse_categorical_accuracy: 0.61 - ETA: 14s - loss: 1.1005 - sparse_categorical_accuracy: 0.61 - ETA: 14s - loss: 1.0920 - sparse_categorical_accuracy: 0.61 - ETA: 14s - loss: 1.0850 - sparse_categorical_accuracy: 0.62 - ETA: 14s - loss: 1.0784 - sparse_categorical_accuracy: 0.62 - ETA: 14s - loss: 1.0694 - sparse_categorical_accuracy: 0.62 - ETA: 13s - loss: 1.0615 - sparse_categorical_accuracy: 0.62 - ETA: 13s - loss: 1.0558 - sparse_categorical_accuracy: 0.62 - ETA: 13s - loss: 1.0495 - sparse_categorical_accuracy: 0.62 - ETA: 13s - loss: 1.0434 - sparse_categorical_accuracy: 0.62 - ETA: 13s - loss: 1.0370 - sparse_categorical_accuracy: 0.63 - ETA: 13s - loss: 1.0320 - sparse_categorical_accuracy: 0.63 - ETA: 13s - loss: 1.0260 - sparse_categorical_accuracy: 0.63 - ETA: 13s - loss: 1.0210 - sparse_categorical_accuracy: 0.63 - ETA: 13s - loss: 1.0166 - sparse_categorical_accuracy: 0.63 - ETA: 13s - loss: 1.0123 - sparse_categorical_accuracy: 0.63 - ETA: 13s - loss: 1.0074 - sparse_categorical_accuracy: 0.63 - ETA: 13s - loss: 1.0028 - sparse_categorical_accuracy: 0.63 - ETA: 13s - loss: 0.9989 - sparse_categorical_accuracy: 0.63 - ETA: 13s - loss: 0.9948 - 
sparse_categorical_accuracy: 0.64 - ETA: 13s - loss: 0.9916 - sparse_categorical_accuracy: 0.64 - ETA: 13s - loss: 0.9868 - sparse_categorical_accuracy: 0.64 - ETA: 12s - loss: 0.9825 - sparse_categorical_accuracy: 0.64 - ETA: 12s - loss: 0.9790 - sparse_categorical_accuracy: 0.64 - ETA: 12s - loss: 0.9760 - sparse_categorical_accuracy: 0.64 - ETA: 12s - loss: 0.9731 - sparse_categorical_accuracy: 0.64 - ETA: 12s - loss: 0.9703 - sparse_categorical_accuracy: 0.64 - ETA: 12s - loss: 0.9677 - sparse_categorical_accuracy: 0.64 - ETA: 12s - loss: 0.9651 - sparse_categorical_accuracy: 0.64 - ETA: 12s - loss: 0.9619 - sparse_categorical_accuracy: 0.64 - ETA: 13s - loss: 0.9604 - sparse_categorical_accuracy: 0.64 - ETA: 13s - loss: 0.9572 - sparse_categorical_accuracy: 0.64 - ETA: 13s - loss: 0.9547 - sparse_categorical_accuracy: 0.64 - ETA: 13s - loss: 0.9517 - sparse_categorical_accuracy: 0.65 - ETA: 12s - loss: 0.9483 - sparse_categorical_accuracy: 0.65 - ETA: 12s - loss: 0.9450 - sparse_categorical_accuracy: 0.65 - ETA: 12s - loss: 0.9408 - sparse_categorical_accuracy: 0.65 - ETA: 12s - loss: 0.9373 - sparse_categorical_accuracy: 0.65 - ETA: 12s - loss: 0.9345 - sparse_categorical_accuracy: 0.65 - ETA: 12s - loss: 0.9315 - sparse_categorical_accuracy: 0.65 - ETA: 12s - loss: 0.9282 - sparse_categorical_accuracy: 0.65 - ETA: 12s - loss: 0.9258 - sparse_categorical_accuracy: 0.65 - ETA: 12s - loss: 0.9238 - sparse_categorical_accuracy: 0.65 - ETA: 12s - loss: 0.9212 - sparse_categorical_accuracy: 0.65 - ETA: 12s - loss: 0.9183 - sparse_categorical_accuracy: 0.65 - ETA: 12s - loss: 0.9153 - sparse_categorical_accuracy: 0.65 - ETA: 12s - loss: 0.9127 - sparse_categorical_accuracy: 0.65 - ETA: 12s - loss: 0.9096 - sparse_categorical_accuracy: 0.66 - ETA: 11s - loss: 0.9077 - sparse_categorical_accuracy: 0.66 - ETA: 11s - loss: 0.9053 - sparse_categorical_accuracy: 0.66 - ETA: 11s - loss: 0.9030 - sparse_categorical_accuracy: 0.66 - ETA: 11s - loss: 0.9005 - sparse_categorical_accuracy: 0.66 - ETA: 11s - loss: 0.8986 - sparse_categorical_accuracy: 0.66 - ETA: 11s - loss: 0.8964 - sparse_categorical_accuracy: 0.66 - ETA: 11s - loss: 0.8939 - sparse_categorical_accuracy: 0.66 - ETA: 11s - loss: 0.8920 - sparse_categorical_accuracy: 0.66 - ETA: 11s - loss: 0.8895 - sparse_categorical_accuracy: 0.66 - ETA: 11s - loss: 0.8868 - sparse_categorical_accuracy: 0.66 - ETA: 11s - loss: 0.8848 - sparse_categorical_accuracy: 0.66 - ETA: 11s - loss: 0.8830 - sparse_categorical_accuracy: 0.66 - ETA: 11s - loss: 0.8812 - sparse_categorical_accuracy: 0.66 - ETA: 11s - loss: 0.8796 - sparse_categorical_accuracy: 0.66 - ETA: 11s - loss: 0.8775 - sparse_categorical_accuracy: 0.66 - ETA: 10s - loss: 0.8752 - sparse_categorical_accuracy: 0.66 - ETA: 10s - loss: 0.8727 - sparse_categorical_accuracy: 0.66 - ETA: 10s - loss: 0.8706 - sparse_categorical_accuracy: 0.66 - ETA: 10s - loss: 0.8687 - sparse_categorical_accuracy: 0.66 - ETA: 10s - loss: 0.8666 - sparse_categorical_accuracy: 0.66 - ETA: 10s - loss: 0.8653 - sparse_categorical_accuracy: 0.66 - ETA: 10s - loss: 0.8634 - sparse_categorical_accuracy: 0.66 - ETA: 10s - loss: 0.8617 - sparse_categorical_accuracy: 0.67 - ETA: 10s - loss: 0.8597 - sparse_categorical_accuracy: 0.67 - ETA: 10s - loss: 0.8583 - sparse_categorical_accuracy: 0.67 - ETA: 10s - loss: 0.8568 - sparse_categorical_accuracy: 0.67 - ETA: 10s - loss: 0.8552 - sparse_categorical_accuracy: 0.67 - ETA: 10s - loss: 0.8533 - sparse_categorical_accuracy: 0.67 - ETA: 10s - loss: 0.8513 - 
sparse_categorical_accuracy: 0.67 - ETA: 10s - loss: 0.8497 - sparse_categorical_accuracy: 0.67 - ETA: 9s - loss: 0.8480 - sparse_categorical_accuracy: 0.6732 - ETA: 9s - loss: 0.8464 - sparse_categorical_accuracy: 0.673 - ETA: 9s - loss: 0.8452 - sparse_categorical_accuracy: 0.673 - ETA: 9s - loss: 0.8437 - sparse_categorical_accuracy: 0.674 - ETA: 9s - loss: 0.8421 - sparse_categorical_accuracy: 0.674 - ETA: 9s - loss: 0.8408 - sparse_categorical_accuracy: 0.674 - ETA: 9s - loss: 0.8391 - sparse_categorical_accuracy: 0.675 - ETA: 9s - loss: 0.8383 - sparse_categorical_accuracy: 0.675 - ETA: 9s - loss: 0.8374 - sparse_categorical_accuracy: 0.675 - ETA: 9s - loss: 0.8358 - sparse_categorical_accuracy: 0.675 - ETA: 9s - loss: 0.8343 - sparse_categorical_accuracy: 0.676 - ETA: 9s - loss: 0.8333 - sparse_categorical_accuracy: 0.676 - ETA: 9s - loss: 0.8318 - sparse_categorical_accuracy: 0.677 - ETA: 9s - loss: 0.8310 - sparse_categorical_accuracy: 0.677 - ETA: 9s - loss: 0.8298 - sparse_categorical_accuracy: 0.677 - ETA: 9s - loss: 0.8283 - sparse_categorical_accuracy: 0.677 - ETA: 9s - loss: 0.8273 - sparse_categorical_accuracy: 0.678 - ETA: 8s - loss: 0.8261 - sparse_categorical_accuracy: 0.678 - ETA: 8s - loss: 0.8251 - sparse_categorical_accuracy: 0.678 - ETA: 8s - loss: 0.8238 - sparse_categorical_accuracy: 0.678 - ETA: 8s - loss: 0.8224 - sparse_categorical_accuracy: 0.679 - ETA: 8s - loss: 0.8212 - sparse_categorical_accuracy: 0.679 - ETA: 8s - loss: 0.8203 - sparse_categorical_accuracy: 0.679 - ETA: 8s - loss: 0.8193 - sparse_categorical_accuracy: 0.679 - ETA: 8s - loss: 0.8182 - sparse_categorical_accuracy: 0.679 - ETA: 8s - loss: 0.8166 - sparse_categorical_accuracy: 0.680 - ETA: 8s - loss: 0.8154 - sparse_categorical_accuracy: 0.680 - ETA: 8s - loss: 0.8142 - sparse_categorical_accuracy: 0.681 - ETA: 8s - loss: 0.8132 - sparse_categorical_accuracy: 0.681 - ETA: 8s - loss: 0.8120 - sparse_categorical_accuracy: 0.681 - ETA: 8s - loss: 0.8108 - sparse_categorical_accuracy: 0.681 - ETA: 8s - loss: 0.8096 - sparse_categorical_accuracy: 0.682 - ETA: 8s - loss: 0.8088 - sparse_categorical_accuracy: 0.682 - ETA: 8s - loss: 0.8080 - sparse_categorical_accuracy: 0.682 - ETA: 7s - loss: 0.8069 - sparse_categorical_accuracy: 0.682 - ETA: 7s - loss: 0.8059 - sparse_categorical_accuracy: 0.682 - ETA: 7s - loss: 0.8050 - sparse_categorical_accuracy: 0.683 - ETA: 7s - loss: 0.8041 - sparse_categorical_accuracy: 0.683 - ETA: 7s - loss: 0.8032 - sparse_categorical_accuracy: 0.683 - ETA: 7s - loss: 0.8025 - sparse_categorical_accuracy: 0.683 - ETA: 7s - loss: 0.8015 - sparse_categorical_accuracy: 0.683 - ETA: 7s - loss: 0.8008 - sparse_categorical_accuracy: 0.683 - ETA: 7s - loss: 0.8003 - sparse_categorical_accuracy: 0.684 - ETA: 7s - loss: 0.7997 - sparse_categorical_accuracy: 0.684 - ETA: 7s - loss: 0.7987 - sparse_categorical_accuracy: 0.684 - ETA: 7s - loss: 0.7977 - sparse_categorical_accuracy: 0.684 - ETA: 7s - loss: 0.7967 - sparse_categorical_accuracy: 0.684 - ETA: 7s - loss: 0.7957 - sparse_categorical_accuracy: 0.685 - ETA: 7s - loss: 0.7947 - sparse_categorical_accuracy: 0.685 - ETA: 7s - loss: 0.7940 - sparse_categorical_accuracy: 0.685 - ETA: 7s - loss: 0.7933 - sparse_categorical_accuracy: 0.685 - ETA: 6s - loss: 0.7925 - sparse_categorical_accuracy: 0.685 - ETA: 6s - loss: 0.7919 - sparse_categorical_accuracy: 0.685 - ETA: 6s - loss: 0.7910 - sparse_categorical_accuracy: 0.686 - ETA: 6s - loss: 0.7903 - sparse_categorical_accuracy: 0.686 - ETA: 6s - loss: 0.7894 - 
sparse_categorical_accuracy: 0.686 - ETA: 6s - loss: 0.7885 - sparse_categorical_accuracy: 0.686 - ETA: 6s - loss: 0.7876 - sparse_categorical_accuracy: 0.686 - ETA: 6s - loss: 0.7868 - sparse_categorical_accuracy: 0.686 - ETA: 6s - loss: 0.7857 - sparse_categorical_accuracy: 0.687 - ETA: 6s - loss: 0.7850 - sparse_categorical_accuracy: 0.687 - ETA: 6s - loss: 0.7844 - sparse_categorical_accuracy: 0.687 - ETA: 6s - loss: 0.7836 - sparse_categorical_accuracy: 0.687 - ETA: 6s - loss: 0.7828 - sparse_categorical_accuracy: 0.687 - ETA: 6s - loss: 0.7820 - sparse_categorical_accuracy: 0.687 - ETA: 6s - loss: 0.7812 - sparse_categorical_accuracy: 0.688 - ETA: 6s - loss: 0.7805 - sparse_categorical_accuracy: 0.688 - ETA: 6s - loss: 0.7799 - sparse_categorical_accuracy: 0.688 - ETA: 6s - loss: 0.7795 - sparse_categorical_accuracy: 0.688 - ETA: 5s - loss: 0.7787 - sparse_categorical_accuracy: 0.688 - ETA: 5s - loss: 0.7778 - sparse_categorical_accuracy: 0.688 - ETA: 5s - loss: 0.7771 - sparse_categorical_accuracy: 0.689 - ETA: 5s - loss: 0.7767 - sparse_categorical_accuracy: 0.689 - ETA: 5s - loss: 0.7760 - sparse_categorical_accuracy: 0.689 - ETA: 5s - loss: 0.7754 - sparse_categorical_accuracy: 0.689 - ETA: 5s - loss: 0.7747 - sparse_categorical_accuracy: 0.689 - ETA: 5s - loss: 0.7740 - sparse_categorical_accuracy: 0.689 - ETA: 5s - loss: 0.7734 - sparse_categorical_accuracy: 0.689 - ETA: 5s - loss: 0.7729 - sparse_categorical_accuracy: 0.690 - ETA: 5s - loss: 0.7722 - sparse_categorical_accuracy: 0.690 - ETA: 5s - loss: 0.7717 - sparse_categorical_accuracy: 0.690 - ETA: 5s - loss: 0.7708 - sparse_categorical_accuracy: 0.690 - ETA: 5s - loss: 0.7702 - sparse_categorical_accuracy: 0.690 - ETA: 5s - loss: 0.7693 - sparse_categorical_accuracy: 0.691 - ETA: 5s - loss: 0.7688 - sparse_categorical_accuracy: 0.691 - ETA: 5s - loss: 0.7684 - sparse_categorical_accuracy: 0.691 - ETA: 4s - loss: 0.7679 - sparse_categorical_accuracy: 0.691 - ETA: 4s - loss: 0.7672 - sparse_categorical_accuracy: 0.691 - ETA: 4s - loss: 0.7665 - sparse_categorical_accuracy: 0.691 - ETA: 4s - loss: 0.7659 - sparse_categorical_accuracy: 0.691 - ETA: 4s - loss: 0.7653 - sparse_categorical_accuracy: 0.692 - ETA: 4s - loss: 0.7650 - sparse_categorical_accuracy: 0.692 - ETA: 4s - loss: 0.7646 - sparse_categorical_accuracy: 0.692 - ETA: 4s - loss: 0.7643 - sparse_categorical_accuracy: 0.692 - ETA: 4s - loss: 0.7638 - sparse_categorical_accuracy: 0.692 - ETA: 4s - loss: 0.7632 - sparse_categorical_accuracy: 0.692 - ETA: 4s - loss: 0.7627 - sparse_categorical_accuracy: 0.692 - ETA: 4s - loss: 0.7621 - sparse_categorical_accuracy: 0.692 - ETA: 4s - loss: 0.7615 - sparse_categorical_accuracy: 0.692 - ETA: 4s - loss: 0.7607 - sparse_categorical_accuracy: 0.693 - ETA: 4s - loss: 0.7602 - sparse_categorical_accuracy: 0.693 - ETA: 4s - loss: 0.7599 - sparse_categorical_accuracy: 0.693 - ETA: 4s - loss: 0.7593 - sparse_categorical_accuracy: 0.693 - ETA: 4s - loss: 0.7586 - sparse_categorical_accuracy: 0.693 - ETA: 3s - loss: 0.7579 - sparse_categorical_accuracy: 0.693 - ETA: 3s - loss: 0.7573 - sparse_categorical_accuracy: 0.694 - ETA: 3s - loss: 0.7567 - sparse_categorical_accuracy: 0.694 - ETA: 3s - loss: 0.7563 - sparse_categorical_accuracy: 0.694 - ETA: 3s - loss: 0.7557 - sparse_categorical_accuracy: 0.694 - ETA: 3s - loss: 0.7553 - sparse_categorical_accuracy: 0.694 - ETA: 3s - loss: 0.7547 - sparse_categorical_accuracy: 0.694 - ETA: 3s - loss: 0.7542 - sparse_categorical_accuracy: 0.695 - ETA: 3s - loss: 0.7539 - 
sparse_categorical_accuracy: 0.695 - ETA: 3s - loss: 0.7535 - sparse_categorical_accuracy: 0.695 - ETA: 3s - loss: 0.7529 - sparse_categorical_accuracy: 0.695 - ETA: 3s - loss: 0.7524 - sparse_categorical_accuracy: 0.695 - ETA: 3s - loss: 0.7521 - sparse_categorical_accuracy: 0.695 - ETA: 3s - loss: 0.7518 - sparse_categorical_accuracy: 0.695 - ETA: 3s - loss: 0.7512 - sparse_categorical_accuracy: 0.695 - ETA: 3s - loss: 0.7509 - sparse_categorical_accuracy: 0.695 - ETA: 3s - loss: 0.7505 - sparse_categorical_accuracy: 0.695 - ETA: 3s - loss: 0.7499 - sparse_categorical_accuracy: 0.695 - ETA: 2s - loss: 0.7495 - sparse_categorical_accuracy: 0.696 - ETA: 2s - loss: 0.7492 - sparse_categorical_accuracy: 0.696 - ETA: 2s - loss: 0.7488 - sparse_categorical_accuracy: 0.696 - ETA: 2s - loss: 0.7484 - sparse_categorical_accuracy: 0.696 - ETA: 2s - loss: 0.7480 - sparse_categorical_accuracy: 0.696 - ETA: 2s - loss: 0.7474 - sparse_categorical_accuracy: 0.696 - ETA: 2s - loss: 0.7471 - sparse_categorical_accuracy: 0.696 - ETA: 2s - loss: 0.7468 - sparse_categorical_accuracy: 0.696 - ETA: 2s - loss: 0.7466 - sparse_categorical_accuracy: 0.696 - ETA: 2s - loss: 0.7462 - sparse_categorical_accuracy: 0.696 - ETA: 2s - loss: 0.7459 - sparse_categorical_accuracy: 0.696 - ETA: 2s - loss: 0.7455 - sparse_categorical_accuracy: 0.696 - ETA: 2s - loss: 0.7451 - sparse_categorical_accuracy: 0.696 - ETA: 2s - loss: 0.7447 - sparse_categorical_accuracy: 0.696 - ETA: 2s - loss: 0.7443 - sparse_categorical_accuracy: 0.696 - ETA: 2s - loss: 0.7439 - sparse_categorical_accuracy: 0.696 - ETA: 2s - loss: 0.7435 - sparse_categorical_accuracy: 0.697 - ETA: 2s - loss: 0.7430 - sparse_categorical_accuracy: 0.697 - ETA: 2s - loss: 0.7427 - sparse_categorical_accuracy: 0.697 - ETA: 1s - loss: 0.7423 - sparse_categorical_accuracy: 0.697 - ETA: 1s - loss: 0.7418 - sparse_categorical_accuracy: 0.697 - ETA: 1s - loss: 0.7413 - sparse_categorical_accuracy: 0.697 - ETA: 1s - loss: 0.7409 - sparse_categorical_accuracy: 0.697 - ETA: 1s - loss: 0.7407 - sparse_categorical_accuracy: 0.697 - ETA: 1s - loss: 0.7403 - sparse_categorical_accuracy: 0.697 - ETA: 1s - loss: 0.7398 - sparse_categorical_accuracy: 0.697 - ETA: 1s - loss: 0.7394 - sparse_categorical_accuracy: 0.698 - ETA: 1s - loss: 0.7391 - sparse_categorical_accuracy: 0.698 - ETA: 1s - loss: 0.7387 - sparse_categorical_accuracy: 0.698 - ETA: 1s - loss: 0.7385 - sparse_categorical_accuracy: 0.698 - ETA: 1s - loss: 0.7380 - sparse_categorical_accuracy: 0.698 - ETA: 1s - loss: 0.7376 - sparse_categorical_accuracy: 0.698 - ETA: 1s - loss: 0.7372 - sparse_categorical_accuracy: 0.698 - ETA: 1s - loss: 0.7368 - sparse_categorical_accuracy: 0.698 - ETA: 1s - loss: 0.7366 - sparse_categorical_accuracy: 0.698 - ETA: 1s - loss: 0.7362 - sparse_categorical_accuracy: 0.698 - ETA: 1s - loss: 0.7358 - sparse_categorical_accuracy: 0.699 - ETA: 1s - loss: 0.7354 - sparse_categorical_accuracy: 0.699 - ETA: 0s - loss: 0.7352 - sparse_categorical_accuracy: 0.699 - ETA: 0s - loss: 0.7349 - sparse_categorical_accuracy: 0.699 - ETA: 0s - loss: 0.7346 - sparse_categorical_accuracy: 0.699 - ETA: 0s - loss: 0.7342 - sparse_categorical_accuracy: 0.699 - ETA: 0s - loss: 0.7340 - sparse_categorical_accuracy: 0.699 - ETA: 0s - loss: 0.7338 - sparse_categorical_accuracy: 0.699 - ETA: 0s - loss: 0.7337 - sparse_categorical_accuracy: 0.699 - ETA: 0s - loss: 0.7332 - sparse_categorical_accuracy: 0.699 - ETA: 0s - loss: 0.7329 - sparse_categorical_accuracy: 0.699 - ETA: 0s - loss: 0.7327 - 
sparse_categorical_accuracy: 0.699 - ETA: 0s - loss: 0.7323 - sparse_categorical_accuracy: 0.699 - ETA: 0s - loss: 0.7320 - sparse_categorical_accuracy: 0.699 - ETA: 0s - loss: 0.7317 - sparse_categorical_accuracy: 0.700 - ETA: 0s - loss: 0.7314 - sparse_categorical_accuracy: 0.700 - ETA: 0s - loss: 0.7310 - sparse_categorical_accuracy: 0.700 - ETA: 0s - loss: 0.7305 - sparse_categorical_accuracy: 0.700 - ETA: 0s - loss: 0.7301 - sparse_categorical_accuracy: 0.700 - ETA: 0s - loss: 0.7298 - sparse_categorical_accuracy: 0.700 - 18s 4ms/step - loss: 0.7295 - sparse_categorical_accuracy: 0.7006 - val_loss: 0.6473 - val_sparse_categorical_accuracy: 0.7215
INFO:tensorflow:Saver not created because there are no variables in the graph to restore


INFO:tensorflow:Saver not created because there are no variables in the graph to restore


WARNING:tensorflow:From /home/jupyter/.local/lib/python3.7/site-packages/tensorflow/python/training/tracking/tracking.py:111: Model.state_updates (from tensorflow.python.keras.engine.training) is deprecated and will be removed in a future version.
Instructions for updating:
This property should not be used in TensorFlow 2.0, as updates are applied automatically.


WARNING:tensorflow:From /home/jupyter/.local/lib/python3.7/site-packages/tensorflow/python/training/tracking/tracking.py:111: Model.state_updates (from tensorflow.python.keras.engine.training) is deprecated and will be removed in a future version.
Instructions for updating:
This property should not be used in TensorFlow 2.0, as updates are applied automatically.


WARNING:tensorflow:From /home/jupyter/.local/lib/python3.7/site-packages/tensorflow/python/training/tracking/tracking.py:111: Layer.updates (from tensorflow.python.keras.engine.base_layer) is deprecated and will be removed in a future version.
Instructions for updating:
This property should not be used in TensorFlow 2.0, as updates are applied automatically.


2024-03-24 14:25:07.796936: W tensorflow/python/util/util.cc:348] Sets are not currently considered sequences, but this may change in the future, so consider avoiding using them.
WARNING:tensorflow:From /home/jupyter/.local/lib/python3.7/site-packages/tensorflow/python/training/tracking/tracking.py:111: Layer.updates (from tensorflow.python.keras.engine.base_layer) is deprecated and will be removed in a future version.
Instructions for updating:
This property should not be used in TensorFlow 2.0, as updates are applied automatically.


INFO:tensorflow:Assets written to: /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/Trainer/model/7/serving_model_dir/assets


INFO:tensorflow:Assets written to: /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/Trainer/model/7/serving_model_dir/assets
ExecutionResult at 0x7f86428b77d0
.execution_id: 7
.component.inputs: ['examples'], ['transform_graph'], ['schema']
.component.outputs: ['model'], ['model_run']

Analyzing training runs with TensorBoard

In this step, you will analyze the training run with TensorBoard.dev, a managed service that lets you host, track, and share your ML experiments.

Retrieve the location of TensorBoard logs

Each model run’s train and eval metric logs are written to the model_run directory by the TensorBoard callback defined in model.py.

logs_path = trainer.outputs['model_run'].get()[0].uri
print(logs_path)
/home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/Trainer/model_run/7
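
Before uploading, you can also preview these logs directly in the notebook with the TensorBoard Jupyter extension. A minimal sketch, assuming TensorBoard is installed in this environment (substitute the path printed above):

%load_ext tensorboard
%tensorboard --logdir /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/Trainer/model_run/7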

Upload the logs and start TensorBoard.dev

  1. Open a new JupyterLab terminal window

  2. From the terminal window, execute the following command

tensorboard dev upload --logdir [YOUR_LOGDIR]

Where [YOUR_LOGDIR] is the URI printed by the previous cell.

You will be asked to authorize TensorBoard.dev using your Google account. If you don’t have a Google account, or you don’t want to authorize TensorBoard.dev, you can skip this exercise.

After the authorization process completes, follow the link provided to view your experiment.

Tune your model’s hyperparameters with the Tuner component

The Tuner component uses the KerasTuner API to tune your model’s hyperparameters. It tightly integrates with the Transform and Trainer components for hyperparameter tuning in continuous training pipelines, and it supports advanced use cases such as feature selection, feature engineering, and model architecture search.

Tuner takes:

  1. A module file (here, the same TRAINER_MODULE_FILE used by Trainer) that defines a tuner_fn describing the model, the hyperparameter search space, and the tuning objective.

  2. The transformed examples and transform graph emitted by the Transform component.

  3. train_args and eval_args protos specifying how many steps each trial runs for training and evaluation.

With the given data, model, and objective, Tuner searches the hyperparameter space and emits the best results, which can be fed directly into the Trainer component during model re-training. A sketch of the tuner_fn contract follows.
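
The sketch below illustrates the tuner_fn contract that Tuner expects to find in the module file. The helpers _build_keras_model (a model builder that reads hyperparameters) and _input_fn (the dataset helper used by run_fn) are assumed to exist in the module; the names and max_trials value are illustrative, but the search space mirrors the summary printed by this run.

import kerastuner
from tfx.components.tuner.component import TunerFnResult


def _get_hyperparameters() -> kerastuner.HyperParameters:
  # Search space matching the "Search space summary" below.
  hp = kerastuner.HyperParameters()
  hp.Choice('learning_rate', [1e-2, 1e-3, 1e-4], default=1e-3)
  hp.Int('n_layers', 1, 2, default=1)
  # Conditional scopes produce the n_layers=1/... and n_layers=2/... entries.
  with hp.conditional_scope('n_layers', [1]):
    hp.Int('n_units_1', min_value=8, max_value=128, step=8, default=8)
  with hp.conditional_scope('n_layers', [2]):
    hp.Int('n_units_1', min_value=8, max_value=128, step=8, default=8)
    hp.Int('n_units_2', min_value=8, max_value=128, step=8, default=8)
  return hp


def tuner_fn(fn_args) -> TunerFnResult:
  # Bayesian optimization is consistent with the sklearn Gaussian-process
  # warnings in the output below; the objective matches the trial scores.
  tuner = kerastuner.BayesianOptimization(
      _build_keras_model,  # assumed module helper: builds a model from hp
      hyperparameters=_get_hyperparameters(),
      objective='val_sparse_categorical_accuracy',
      max_trials=10,
      directory=fn_args.working_dir,
      project_name='covertype_tuning')
  # _input_fn is assumed to mirror the dataset helper used by run_fn.
  train_dataset = _input_fn(fn_args.train_files, fn_args.transform_graph_path)
  eval_dataset = _input_fn(fn_args.eval_files, fn_args.transform_graph_path)
  return TunerFnResult(
      tuner=tuner,
      fit_kwargs={
          'x': train_dataset,
          'validation_data': eval_dataset,
          'steps_per_epoch': fn_args.train_steps,
          'validation_steps': fn_args.eval_steps,
      })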

tuner = Tuner(
        module_file=TRAINER_MODULE_FILE,
        examples=transform.outputs['transformed_examples'],
        transform_graph=transform.outputs['transform_graph'],
        train_args=trainer_pb2.TrainArgs(num_steps=1000),
        eval_args=trainer_pb2.EvalArgs(num_steps=500))
context.run(tuner)
2024-03-24 14:25:09.608743: W ml_metadata/metadata_store/rdbms_metadata_access_object.cc:581] No property is defined for the Type
WARNING:absl:Examples artifact does not have payload_format custom property. Falling back to FORMAT_TF_EXAMPLE

Search space summary
|-Default search space size: 5
learning_rate (Choice)
|-default: 0.001
|-ordered: True
|-values: [0.01, 0.001, 0.0001]
n_layers (Int)
|-default: 1
|-max_value: 2
|-min_value: 1
|-sampling: None
|-step: 1
n_layers=1/n_units_1 (Int)
|-default: 8
|-max_value: 128
|-min_value: 8
|-sampling: None
|-step: 8
n_layers=2/n_units_1 (Int)
|-default: 8
|-max_value: 128
|-min_value: 8
|-sampling: None
|-step: 8
n_layers=2/n_units_2 (Int)
|-default: 8
|-max_value: 128
|-min_value: 8
|-sampling: None
|-step: 8

1000/1000 [==============================] - 4s 4ms/step - loss: 0.7017 - sparse_categorical_accuracy: 0.6981 - val_loss: 0.6961 - val_sparse_categorical_accuracy: 0.6950

Trial complete
Trial summary
|-Trial ID: 408c3b83bddbaf40e88b3a63d229a638
|-Score: 0.6950312256813049
|-Best step: 0
Hyperparameters:
|-learning_rate: 0.01
|-n_layers: 2
|-n_layers=1/n_units_1: 64
|-n_layers=2/n_units_1: 40
|-n_layers=2/n_units_2: 112

1000/1000 [==============================] - 4s 4ms/step - loss: 1.1850 - sparse_categorical_accuracy: 0.6048 - val_loss: 0.8132 - val_sparse_categorical_accuracy: 0.6889

Trial complete
Trial summary
|-Trial ID: 907d15cf53fd6f78b450100fe3b79521
|-Score: 0.6888750195503235
|-Best step: 0
Hyperparameters:
|-learning_rate: 0.0001
|-n_layers: 2
|-n_layers=1/n_units_1: 16
|-n_layers=2/n_units_1: 56
|-n_layers=2/n_units_2: 112

1000/1000 [==============================] - 4s 4ms/step - loss: 0.8224 - sparse_categorical_accuracy: 0.6646 - val_loss: 0.6904 - val_sparse_categorical_accuracy: 0.7062

Trial complete
Trial summary
|-Trial ID: 910c031c797c3ddf62586b0caa5a435a
|-Score: 0.7061562538146973
|-Best step: 0
Hyperparameters:
|-learning_rate: 0.001
|-n_layers: 2
|-n_layers=1/n_units_1: 80
|-n_layers=2/n_units_1: 8
|-n_layers=2/n_units_2: 104

1000/1000 [==============================] - 4s 4ms/step - loss: 0.6943 - sparse_categorical_accuracy: 0.7021 - val_loss: 0.6585 - val_sparse_categorical_accuracy: 0.7159

Trial complete
Trial summary
|-Trial ID: 6097ae630b97acef9784ec96704918d4
|-Score: 0.7159062623977661
|-Best step: 0
Hyperparameters:
|-learning_rate: 0.01
|-n_layers: 1
|-n_layers=1/n_units_1: 56
|-n_layers=2/n_units_1: 120
|-n_layers=2/n_units_2: 24

1000/1000 [==============================] - 4s 4ms/step - loss: 0.6913 - sparse_categorical_accuracy: 0.7032 - val_loss: 0.6560 - val_sparse_categorical_accuracy: 0.7141

Trial complete
Trial summary
|-Trial ID: 48ba1f4c61dc2f0f2f45348fc3dbc23f
|-Score: 0.7140937447547913
|-Best step: 0
Hyperparameters:
|-learning_rate: 0.01
|-n_layers: 1
|-n_layers=1/n_units_1: 40
|-n_layers=2/n_units_1: 112
|-n_layers=2/n_units_2: 88

1000/1000 [==============================] - 4s 4ms/step - loss: 0.6979 - sparse_categorical_accuracy: 0.7005 - val_loss: 0.6691 - val_sparse_categorical_accuracy: 0.6997

Trial complete
Trial summary
|-Trial ID: 00b4372ffe35b836f45f3ee8199da2de
|-Score: 0.6997187733650208
|-Best step: 0
Hyperparameters:
|-learning_rate: 0.01
|-n_layers: 1
|-n_layers=1/n_units_1: 104
|-n_layers=2/n_units_1: 128
|-n_layers=2/n_units_2: 64

/opt/conda/lib/python3.7/site-packages/sklearn/gaussian_process/kernels.py:427: ConvergenceWarning: The optimal value found for dimension 0 of parameter length_scale is close to the specified lower bound 1e-05. Decreasing the bound and calling fit again may find a better value.
  ConvergenceWarning,


1000/1000 [==============================] - 4s 4ms/step - loss: 1.4083 - sparse_categorical_accuracy: 0.5320 - val_loss: 1.0023 - val_sparse_categorical_accuracy: 0.6722

Trial complete

Trial summary

|-Trial ID: 7cdbd941583dc5554a76e421f946e0a0

|-Score: 0.672249972820282

|-Best step: 0

Hyperparameters:

|-learning_rate: 0.0001

|-n_layers: 1

|-n_layers=1/n_units_1: 128

|-n_layers=2/n_units_1: 88

|-n_layers=2/n_units_2: 64

1000/1000 [==============================] - 4s 4ms/step - loss: 0.6992 - sparse_categorical_accuracy: 0.7029 - val_loss: 0.6545 - val_sparse_categorical_accuracy: 0.7168

Trial complete

Trial summary

|-Trial ID: 7c767912e56eea712d25593491f7d2e2

|-Score: 0.7168124914169312

|-Best step: 0

Hyperparameters:

|-learning_rate: 0.01

|-n_layers: 1

|-n_layers=1/n_units_1: 8

|-n_layers=2/n_units_1: 128

|-n_layers=2/n_units_2: 40

1000/1000 [==============================] - 4s 4ms/step - loss: 0.7015 - sparse_categorical_accuracy: 0.6998 - val_loss: 0.6505 - val_sparse_categorical_accuracy: 0.7182

Trial complete

Trial summary

|-Trial ID: 35e035339a90ba6c5e2f6f835f74b197

|-Score: 0.7181562781333923

|-Best step: 0

Hyperparameters:

|-learning_rate: 0.01

|-n_layers: 1

|-n_layers=1/n_units_1: 8

|-n_layers=2/n_units_1: 56

|-n_layers=2/n_units_2: 16

1000/1000 [==============================] - 4s 4ms/step - loss: 0.6970 - sparse_categorical_accuracy: 0.7004 - val_loss: 0.6747 - val_sparse_categorical_accuracy: 0.7081

Trial complete

Trial summary

|-Trial ID: 1cfc2b3619a67f215543daf41c47b628

|-Score: 0.7080625295639038

|-Best step: 0

Hyperparameters:

|-learning_rate: 0.01

|-n_layers: 2

|-n_layers=1/n_units_1: 8

|-n_layers=2/n_units_1: 104

|-n_layers=2/n_units_2: 8

INFO:tensorflow:Oracle triggered exit

Results summary

|-Results in /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/.temp/8/covertype_tuning

|-Showing 10 best trials

|-Objective(name='val_sparse_categorical_accuracy', direction='max')

Trial summary

|-Trial ID: 35e035339a90ba6c5e2f6f835f74b197

|-Score: 0.7181562781333923

|-Best step: 0

Hyperparameters:

|-learning_rate: 0.01

|-n_layers: 1

|-n_layers=1/n_units_1: 8

|-n_layers=2/n_units_1: 56

|-n_layers=2/n_units_2: 16

Trial summary

|-Trial ID: 7c767912e56eea712d25593491f7d2e2

|-Score: 0.7168124914169312

|-Best step: 0

Hyperparameters:

|-learning_rate: 0.01

|-n_layers: 1

|-n_layers=1/n_units_1: 8

|-n_layers=2/n_units_1: 128

|-n_layers=2/n_units_2: 40

Trial summary

|-Trial ID: 6097ae630b97acef9784ec96704918d4

|-Score: 0.7159062623977661

|-Best step: 0

Hyperparameters:

|-learning_rate: 0.01

|-n_layers: 1

|-n_layers=1/n_units_1: 56

|-n_layers=2/n_units_1: 120

|-n_layers=2/n_units_2: 24

Trial summary

|-Trial ID: 48ba1f4c61dc2f0f2f45348fc3dbc23f

|-Score: 0.7140937447547913

|-Best step: 0

Hyperparameters:

|-learning_rate: 0.01

|-n_layers: 1

|-n_layers=1/n_units_1: 40

|-n_layers=2/n_units_1: 112

|-n_layers=2/n_units_2: 88

Trial summary

|-Trial ID: 1cfc2b3619a67f215543daf41c47b628

|-Score: 0.7080625295639038

|-Best step: 0

Hyperparameters:

|-learning_rate: 0.01

|-n_layers: 2

|-n_layers=1/n_units_1: 8

|-n_layers=2/n_units_1: 104

|-n_layers=2/n_units_2: 8

Trial summary

|-Trial ID: 910c031c797c3ddf62586b0caa5a435a

|-Score: 0.7061562538146973

|-Best step: 0

Hyperparameters:

|-learning_rate: 0.001

|-n_layers: 2

|-n_layers=1/n_units_1: 80

|-n_layers=2/n_units_1: 8

|-n_layers=2/n_units_2: 104

Trial summary

|-Trial ID: 00b4372ffe35b836f45f3ee8199da2de

|-Score: 0.6997187733650208

|-Best step: 0

Hyperparameters:

|-learning_rate: 0.01

|-n_layers: 1

|-n_layers=1/n_units_1: 104

|-n_layers=2/n_units_1: 128

|-n_layers=2/n_units_2: 64

Trial summary

|-Trial ID: 408c3b83bddbaf40e88b3a63d229a638

|-Score: 0.6950312256813049

|-Best step: 0

Hyperparameters:

|-learning_rate: 0.01

|-n_layers: 2

|-n_layers=1/n_units_1: 64

|-n_layers=2/n_units_1: 40

|-n_layers=2/n_units_2: 112

Trial summary

|-Trial ID: 907d15cf53fd6f78b450100fe3b79521

|-Score: 0.6888750195503235

|-Best step: 0

Hyperparameters:

|-learning_rate: 0.0001

|-n_layers: 2

|-n_layers=1/n_units_1: 16

|-n_layers=2/n_units_1: 56

|-n_layers=2/n_units_2: 112

Trial summary

|-Trial ID: 7cdbd941583dc5554a76e421f946e0a0

|-Score: 0.672249972820282

|-Best step: 0

Hyperparameters:

|-learning_rate: 0.0001

|-n_layers: 1

|-n_layers=1/n_units_1: 128

|-n_layers=2/n_units_1: 88

|-n_layers=2/n_units_2: 64

ExecutionResult at 0x7f8642de5350
.execution_id: 8
.component.inputs
    ['examples']
    ['transform_graph']
.component.outputs
    ['best_hyperparameters']
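
The `n_layers=1/n_units_1`-style names in the trial summaries come from Keras Tuner conditional scopes: a child parameter is only active when its parent takes the matching value. The sketch below reconstructs a search space that would produce the names and ranges shown above; it is inferred from the output, not copied from the tuner module file.

import kerastuner as kt

def _get_hyperparameters() -> kt.HyperParameters:
    """Sketch of a search space consistent with the trial summaries above."""
    hp = kt.HyperParameters()
    hp.Choice('learning_rate', [1e-2, 1e-3, 1e-4])
    hp.Int('n_layers', 1, 2)
    # Conditional scopes render as 'n_layers=1/n_units_1', 'n_layers=2/n_units_2', ...
    with hp.conditional_scope('n_layers', 1):
        hp.Int('n_units_1', min_value=8, max_value=128, step=8)
    with hp.conditional_scope('n_layers', 2):
        hp.Int('n_units_1', min_value=8, max_value=128, step=8)
        hp.Int('n_units_2', min_value=8, max_value=128, step=8)
    return hp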

Retrain your model by running the Trainer with the best hyperparameters

hparams_importer = ImporterNode(
    instance_name='import_hparams',
    # This can be the Tuner's output file or a manually edited file. The file
    # contains the hyperparameters in text format (kerastuner.HyperParameters.get_config()).
    source_uri=tuner.outputs.best_hyperparameters.get()[0].uri,
    artifact_type=HyperParameters)
WARNING:absl:`instance_name` is deprecated, please set node id directly using `with_id()` or `.id` setter.
context.run(hparams_importer)
ExecutionResult at 0x7f861dc8a210
.execution_id: 9
.component: <tfx.components.common_nodes.importer_node.ImporterNode object at 0x7f861dd2c450>
.component.inputs: {}
.component.outputs
    ['result']
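
The deprecation warning above indicates that `instance_name` should be replaced by an explicit node id. A sketch of the equivalent call in newer TFX releases (assumed equivalent; check your TFX version's API):

hparams_importer = ImporterNode(
    source_uri=tuner.outputs.best_hyperparameters.get()[0].uri,
    artifact_type=HyperParameters).with_id('import_hparams')
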
# TODO: your code to retrain your model with the best hyperparameters found by the Tuner component above.
# Hint: review the Trainer code above in this notebook and the documentation for how to configure the trainer 
# to use the output artifact from the hparams_importer.

trainer = Trainer(
    custom_executor_spec=executor_spec.ExecutorClassSpec(trainer_executor.GenericExecutor),
    module_file=TRAINER_MODULE_FILE,
    transformed_examples=transform.outputs.transformed_examples,
    schema=schema_importer.outputs.result,
    transform_graph=transform.outputs.transform_graph,
    hyperparameters=hparams_importer.outputs.result,
    train_args=trainer_pb2.TrainArgs(splits=['train'], num_steps=5000),
    eval_args=trainer_pb2.EvalArgs(splits=['eval'], num_steps=1000)
)
context.run(trainer)
WARNING:absl:Examples artifact does not have payload_format custom property. Falling back to FORMAT_TF_EXAMPLE


WARNING:tensorflow:There are non-GPU devices in `tf.distribute.Strategy`, not using nccl allreduce.


INFO:tensorflow:Using MirroredStrategy with devices ('/job:localhost/replica:0/task:0/device:CPU:0',)
2024-03-24 14:26:04.749916: I tensorflow/core/profiler/lib/profiler_session.cc:164] Profiler session started.



2024-03-24 14:26:07.608712: I tensorflow/core/profiler/lib/profiler_session.cc:164] Profiler session started.


WARNING:tensorflow:Callbacks method `on_train_batch_end` is slow compared to the batch time (batch time: 0.0047s vs `on_train_batch_end` time: 0.5693s). Check your callbacks.


2024-03-24 14:26:08.165199: I tensorflow/core/profiler/rpc/client/save_profile.cc:176] Creating directory: /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/Trainer/model_run/10/train/plugins/profile/2024_03_24_14_26_08
2024-03-24 14:26:08.168392: I tensorflow/core/profiler/rpc/client/save_profile.cc:182] Dumped gzipped tool data for trace.json.gz to /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/Trainer/model_run/10/train/plugins/profile/2024_03_24_14_26_08/tfx-on-googlecloud.trace.json.gz
2024-03-24 14:26:08.172339: I tensorflow/core/profiler/rpc/client/save_profile.cc:176] Creating directory: /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/Trainer/model_run/10/train/plugins/profile/2024_03_24_14_26_08
2024-03-24 14:26:08.172461: I tensorflow/core/profiler/rpc/client/save_profile.cc:182] Dumped gzipped tool data for memory_profile.json.gz to /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/Trainer/model_run/10/train/plugins/profile/2024_03_24_14_26_08/tfx-on-googlecloud.memory_profile.json.gz
2024-03-24 14:26:08.172852: I tensorflow/python/profiler/internal/profiler_wrapper.cc:111] Creating directory: /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/Trainer/model_run/10/train/plugins/profile/2024_03_24_14_26_08
Dumped tool data for xplane.pb to /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/Trainer/model_run/10/train/plugins/profile/2024_03_24_14_26_08/tfx-on-googlecloud.xplane.pb
Dumped tool data for overview_page.pb to /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/Trainer/model_run/10/train/plugins/profile/2024_03_24_14_26_08/tfx-on-googlecloud.overview_page.pb
Dumped tool data for input_pipeline.pb to /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/Trainer/model_run/10/train/plugins/profile/2024_03_24_14_26_08/tfx-on-googlecloud.input_pipeline.pb
Dumped tool data for tensorflow_stats.pb to /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/Trainer/model_run/10/train/plugins/profile/2024_03_24_14_26_08/tfx-on-googlecloud.tensorflow_stats.pb
Dumped tool data for kernel_stats.pb to /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/Trainer/model_run/10/train/plugins/profile/2024_03_24_14_26_08/tfx-on-googlecloud.kernel_stats.pb

5000/5000 [==============================] - 19s 4ms/step - loss: 0.6561 - sparse_categorical_accuracy: 0.7140 - val_loss: 0.6434 - val_sparse_categorical_accuracy: 0.7147
INFO:tensorflow:Saver not created because there are no variables in the graph to restore


INFO:tensorflow:Assets written to: /home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/Trainer/model/10/serving_model_dir/assets
ExecutionResult at 0x7f861c77e9d0
.execution_id: 10
.component.inputs
    ['examples']
    ['transform_graph']
    ['schema']
    ['hyperparameters']
.component.outputs
    ['model']
    ['model_run']

Evaluating trained models with Evaluator

The Evaluator component analyzes model performance using the TensorFlow Model Analysis (TFMA) library. It runs inference over the evaluation dataset and computes metrics both overall and on developer-defined slices of the data. Knowing which slices to analyze requires domain knowledge of what is important in your particular use case.

The Evaluator can also optionally validate a newly trained model against a previous model. In this lab, you only train one model, so the Evaluator will automatically label the model as “blessed”.

Configure and run the Evaluator component

Use the ResolverNode to pick the previous model to compare against. The model resolver is only required if you perform model validation in addition to evaluation. In this case, you validate against the latest blessed model; if no model has been blessed before (as is the case here), the Evaluator will make your candidate the first blessed model.

model_resolver = ResolverNode(
    instance_name='latest_blessed_model_resolver',
    resolver_class=latest_blessed_model_resolver.LatestBlessedModelResolver,
    model=Channel(type=Model),
    model_blessing=Channel(type=ModelBlessing))
WARNING:absl:`instance_name` is deprecated, please set node id directly using `with_id()` or `.id` setter.
context.run(model_resolver)
ExecutionResult at 0x7f862e288250
.execution_id 11
.component <tfx.components.common_nodes.resolver_node.ResolverNode object at 0x7f864837ba50>
.component.inputs
['model']
['model_blessing']
.component.outputs
['model']
['model_blessing']

Configure evaluation metrics and slices.

# TODO: Your code here to create a tfma.MetricThreshold. 
# Review the API documentation here: https://www.tensorflow.org/tfx/model_analysis/api_docs/python/tfma/MetricThreshold
# Hint: Review the API documentation for tfma.GenericValueThreshold to constrain accuracy between 50% and 99%.

accuracy_threshold = tfma.MetricThreshold(
    value_threshold=tfma.GenericValueThreshold(
        lower_bound={"value": .5},
        upper_bound={"value": .99}
    )
)
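
When you do have a blessed baseline (unlike this first run), you can also gate on improvement over that baseline rather than on an absolute value alone. Below is a minimal, hypothetical sketch using tfma.GenericChangeThreshold; the margin value is illustrative and not part of this lab:

# Hypothetical example: additionally require the candidate's accuracy to be
# at least as high as the baseline's (HIGHER_IS_BETTER, tiny absolute margin).
change_threshold = tfma.MetricThreshold(
    change_threshold=tfma.GenericChangeThreshold(
        direction=tfma.MetricDirection.HIGHER_IS_BETTER,
        absolute={'value': -1e-10}))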

metrics_specs = tfma.MetricsSpec(
    metrics=[
        tfma.MetricConfig(class_name='SparseCategoricalAccuracy',
                          threshold=accuracy_threshold),
        tfma.MetricConfig(class_name='ExampleCount')])

eval_config = tfma.EvalConfig(
    model_specs=[
        tfma.ModelSpec(label_key='Cover_Type')
    ],
    metrics_specs=[metrics_specs],
    slicing_specs=[
        tfma.SlicingSpec(),
        tfma.SlicingSpec(feature_keys=['Wilderness_Area'])
    ]
)
eval_config
model_specs {
  label_key: "Cover_Type"
}
slicing_specs {
}
slicing_specs {
  feature_keys: "Wilderness_Area"
}
metrics_specs {
  metrics {
    class_name: "SparseCategoricalAccuracy"
    threshold {
      value_threshold {
        lower_bound {
          value: 0.5
        }
        upper_bound {
          value: 0.99
        }
      }
    }
  }
  metrics {
    class_name: "ExampleCount"
  }
}
model_analyzer = Evaluator(
    examples=example_gen.outputs.examples,
    model=trainer.outputs.model,
    baseline_model=model_resolver.outputs.model,
    eval_config=eval_config
)
context.run(model_analyzer, enable_cache=False)
2024-03-24 14:26:29.711264: W ml_metadata/metadata_store/rdbms_metadata_access_object.cc:581] No property is defined for the Type
WARNING:absl:"maybe_add_baseline" and "maybe_remove_baseline" are deprecated,
        please use "has_baseline" instead.
2024-03-24 14:26:29.715692: W ml_metadata/metadata_store/rdbms_metadata_access_object.cc:581] No property is defined for the Type
ExecutionResult at 0x7f861de10410
.execution_id 12
.component
.component.inputs
['examples']
['model']
['baseline_model']
.component.outputs
['evaluation']
['blessing']

Check the model performance validation status

model_blessing_uri = model_analyzer.outputs.blessing.get()[0].uri
!ls -l {model_blessing_uri}
total 0
-rw-r--r-- 1 jupyter jupyter 0 Mar 24 14:26 BLESSED
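
You can also check the outcome programmatically. This is a minimal sketch that assumes the local-filesystem artifact layout used by InteractiveContext, where the Evaluator writes an empty BLESSED or NOT_BLESSED marker file into the blessing artifact directory:

# Look for the marker file the Evaluator wrote into the blessing artifact.
is_blessed = 'BLESSED' in os.listdir(model_blessing_uri)
print('Model blessed:', is_blessed)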

Visualize evaluation results

You can visualize the evaluation results using the tfma.view.render_slicing_metrics() function from the TensorFlow Model Analysis library.

Setup Note: Currently, TFMA visualizations don’t render in JupyterLab. Make sure that you run this notebook in Classic Notebook.

evaluation_uri = model_analyzer.outputs['evaluation'].get()[0].uri
evaluation_uri
!ls {evaluation_uri}
eval_config.json  metrics  plots  validations
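
Besides metrics and plots, the validations output records whether the configured thresholds passed. A minimal sketch using tfma.load_validation_result, which in recent TFMA versions returns a ValidationResult proto with a validation_ok flag:

# Load the validation outcome written by the Evaluator.
validation_result = tfma.load_validation_result(evaluation_uri)
print('Validation passed:', validation_result.validation_ok)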
eval_result = tfma.load_eval_result(evaluation_uri)
eval_result
EvalResult(slicing_metrics=[((), {'': {'': {'sparse_categorical_accuracy': {'doubleValue': 0.7150089144706726}, 'example_count': {'doubleValue': 20148.0}}}}), ((('Wilderness_Area', 'Cache'),), {'': {'': {'sparse_categorical_accuracy': {'doubleValue': 0.5856189727783203}, 'example_count': {'doubleValue': 1349.0}}}}), ((('Wilderness_Area', 'Commanche'),), {'': {'': {'sparse_categorical_accuracy': {'doubleValue': 0.7045196294784546}, 'example_count': {'doubleValue': 8806.0}}}}), ((('Wilderness_Area', 'Rawah'),), {'': {'': {'sparse_categorical_accuracy': {'doubleValue': 0.7594060301780701}, 'example_count': {'doubleValue': 8957.0}}}}), ((('Wilderness_Area', 'Neota'),), {'': {'': {'sparse_categorical_accuracy': {'doubleValue': 0.5888031125068665}, 'example_count': {'doubleValue': 1036.0}}}})], plots=[((), None), ((('Wilderness_Area', 'Cache'),), None), ((('Wilderness_Area', 'Commanche'),), None), ((('Wilderness_Area', 'Rawah'),), None), ((('Wilderness_Area', 'Neota'),), None)], config=model_specs {
  label_key: "Cover_Type"
}
slicing_specs {
}
slicing_specs {
  feature_keys: "Wilderness_Area"
}
metrics_specs {
  metrics {
    class_name: "SparseCategoricalAccuracy"
    threshold {
      value_threshold {
        lower_bound {
          value: 0.5
        }
        upper_bound {
          value: 0.99
        }
      }
    }
  }
  metrics {
    class_name: "ExampleCount"
  }
  model_names: ""
}
, data_location='<user provided PCollection>', file_format='<unknown>', model_location='/home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/Trainer/model/10/serving_model_dir')
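
Instead of reading the raw repr, you can iterate over the slicing metrics directly. A minimal sketch based on the slicing_metrics structure shown above (a list of (slice_key, metrics) pairs, where the overall slice has an empty key):

# Print accuracy per slice; slice_key is () for the overall slice.
for slice_key, metric_map in eval_result.slicing_metrics:
    accuracy = metric_map['']['']['sparse_categorical_accuracy']['doubleValue']
    print(slice_key or 'Overall', accuracy)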
tfma.view.render_slicing_metrics(eval_result)
SlicingMetricsViewer(config={'weightedExamplesColumn': 'example_count'}, data=[{'slice': 'Overall', 'metrics':…
tfma.view.render_slicing_metrics(
    eval_result, slicing_column='Wilderness_Area')
SlicingMetricsViewer(config={'weightedExamplesColumn': 'example_count'}, data=[{'slice': 'Wilderness_Area:Cach…

InfraValidator

The InfraValidator component acts as an additional early-warning layer by validating a candidate model in a sandboxed version of its serving infrastructure, preventing an unservable model from being pushed to production. Whereas the Evaluator component above validates a model’s performance, the InfraValidator validates that the model is actually able to generate predictions from served examples in an environment configured to match production. The configuration below takes a model and examples, launches the model in a sandboxed TensorFlow Serving model server using the latest image in a local Docker engine, and checks that the model binary can be loaded and queried before “blessing” it for production.

infra_validator = InfraValidator(
    model=trainer.outputs['model'],
    examples=example_gen.outputs['examples'],
    serving_spec=infra_validator_pb2.ServingSpec(
        tensorflow_serving=infra_validator_pb2.TensorFlowServing(
            tags=['latest']),
        local_docker=infra_validator_pb2.LocalDockerConfig(),
    ),
    validation_spec=infra_validator_pb2.ValidationSpec(
        max_loading_time_seconds=60,
        num_tries=5,
    ),
    request_spec=infra_validator_pb2.RequestSpec(
        tensorflow_serving=infra_validator_pb2.TensorFlowServingRequestSpec(),
        num_examples=5,
    )
)
context.run(infra_validator, enable_cache=False)
2024-03-24 14:26:48.201466: W ml_metadata/metadata_store/rdbms_metadata_access_object.cc:581] No property is defined for the Type
ExecutionResult at 0x7f8620732250
.execution_id 13
.component
.component.inputs
['model']
['examples']
.component.outputs
['blessing']

Note: If you encounter an Infra Validation (attempt 1/5) failed error, please disregard it.

Check the model infrastructure validation status

infra_blessing_uri = infra_validator.outputs.blessing.get()[0].uri
!ls -l {infra_blessing_uri}
total 0
-rw-r--r-- 1 jupyter jupyter 0 Mar 24 14:27 INFRA_BLESSED
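
Both gates must pass before the Pusher deploys anything. As a minimal sketch, under the same local-filesystem assumption as before:

# The Pusher requires both the Evaluator's and the InfraValidator's blessing.
ready_to_push = ('BLESSED' in os.listdir(model_blessing_uri)
                 and 'INFRA_BLESSED' in os.listdir(infra_blessing_uri))
print('Ready to push:', ready_to_push)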

Deploying models with Pusher

The Pusher component checks whether a model has been “blessed” and, if so, deploys it by pushing the model to a well-known file destination.

Configure and run the Pusher component

trainer.outputs['model']
Channel of type 'Model' (1 artifact) at 0x7f861dd48850
.type_name Model
._artifacts
[0]
pusher = Pusher(
    model=trainer.outputs['model'],
    model_blessing=model_analyzer.outputs['blessing'],
    infra_blessing=infra_validator.outputs['blessing'],
    push_destination=pusher_pb2.PushDestination(
        filesystem=pusher_pb2.PushDestination.Filesystem(
            base_directory=SERVING_MODEL_DIR)))
context.run(pusher)
2024-03-24 14:27:15.443932: W ml_metadata/metadata_store/rdbms_metadata_access_object.cc:581] No property is defined for the Type
ExecutionResult at 0x7f86202587d0
.execution_id 14
.component
.component.inputs
['model']
['model_blessing']
['infra_blessing']
.component.outputs
['pushed_model']

Examine the output of Pusher

pusher.outputs
{'pushed_model': Channel(
    type_name: PushedModel
    artifacts: [Artifact(artifact: id: 17
type_id: 28
uri: "/home/jupyter/artifact-store/tfx-covertype-classifier/20240324_140959/Pusher/pushed_model/14"
custom_properties {
  key: "name"
  value {
    string_value: "pushed_model"
  }
}
custom_properties {
  key: "producer_component"
  value {
    string_value: "Pusher"
  }
}
custom_properties {
  key: "pushed"
  value {
    int_value: 1
  }
}
custom_properties {
  key: "pushed_destination"
  value {
    string_value: "/home/jupyter/serving_model/1711290435"
  }
}
custom_properties {
  key: "pushed_version"
  value {
    string_value: "1711290435"
  }
}
custom_properties {
  key: "state"
  value {
    string_value: "published"
  }
}
state: LIVE
, artifact_type: id: 28
name: "PushedModel"
)]
)}
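
You can also read the push metadata programmatically. This is a minimal sketch that reads the custom properties shown in the output above via the artifact's get_string_custom_property accessor:

# Inspect where the Pusher copied the blessed model.
pushed_artifact = pusher.outputs['pushed_model'].get()[0]
print('Destination:', pushed_artifact.get_string_custom_property('pushed_destination'))
print('Version:', pushed_artifact.get_string_custom_property('pushed_version'))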
# Set `PATH` to include a directory containing `saved_model_cli`.
PATH=%env PATH
%env PATH=/opt/conda/envs/tfx/bin:{PATH}
env: PATH=/opt/conda/envs/tfx/bin:/opt/conda/bin:/opt/conda/condabin:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
latest_pushed_model = os.path.join(SERVING_MODEL_DIR, max(os.listdir(SERVING_MODEL_DIR)))
!saved_model_cli show --dir {latest_pushed_model} --all
2024-03-24 14:27:16.305772: W tensorflow/stream_executor/platform/default/dso_loader.cc:59] Could not load dynamic library 'libcudart.so.10.1'; dlerror: libcudart.so.10.1: cannot open shared object file: No such file or directory
2024-03-24 14:27:16.305927: I tensorflow/stream_executor/cuda/cudart_stub.cc:29] Ignore above cudart dlerror if you do not have a GPU set up on your machine.

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['__saved_model_init_op']:
  The given SavedModel SignatureDef contains the following input(s):
  The given SavedModel SignatureDef contains the following output(s):
    outputs['__saved_model_init_op'] tensor_info:
        dtype: DT_INVALID
        shape: unknown_rank
        name: NoOp
  Method name is: 

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['examples'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: serving_default_examples:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['output_0'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 7)
        name: StatefulPartitionedCall_1:0
  Method name is: tensorflow/serving/predict
2024-03-24 14:27:20.953296: W tensorflow/stream_executor/platform/default/dso_loader.cc:59] Could not load dynamic library 'libcuda.so.1'; dlerror: libcuda.so.1: cannot open shared object file: No such file or directory
2024-03-24 14:27:20.953349: W tensorflow/stream_executor/cuda/cuda_driver.cc:312] failed call to cuInit: UNKNOWN ERROR (303)
2024-03-24 14:27:20.953377: I tensorflow/stream_executor/cuda/cuda_diagnostics.cc:156] kernel driver does not appear to be running on this host (tfx-on-googlecloud): /proc/driver/nvidia/version does not exist

Defined Functions:
  Function Name: '__call__'
    Option #1
      Callable with:
        Argument #1
          DType: dict
          Value: {'Slope_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Slope_xf'), 'Vertical_Distance_To_Hydrology_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Vertical_Distance_To_Hydrology_xf'), 'Soil_Type_xf': TensorSpec(shape=(None,), dtype=tf.int32, name='Soil_Type_xf'), 'Hillshade_Noon_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Hillshade_Noon_xf'), 'Horizontal_Distance_To_Hydrology_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Horizontal_Distance_To_Hydrology_xf'), 'Horizontal_Distance_To_Fire_Points_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Horizontal_Distance_To_Fire_Points_xf'), 'Wilderness_Area_xf': TensorSpec(shape=(None,), dtype=tf.int32, name='Wilderness_Area_xf'), 'Elevation_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Elevation_xf'), 'Aspect_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Aspect_xf'), 'Horizontal_Distance_To_Roadways_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Horizontal_Distance_To_Roadways_xf'), 'Hillshade_9am_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Hillshade_9am_xf'), 'Hillshade_3pm_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Hillshade_3pm_xf')}
        Argument #2
          DType: bool
          Value: False
        Argument #3
          DType: NoneType
          Value: None
    Option #2
      Callable with:
        Argument #1
          DType: dict
          Value: {'Hillshade_Noon_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Hillshade_Noon_xf'), 'Hillshade_3pm_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Hillshade_3pm_xf'), 'Soil_Type_xf': TensorSpec(shape=(None,), dtype=tf.int32, name='inputs/Soil_Type_xf'), 'Hillshade_9am_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Hillshade_9am_xf'), 'Horizontal_Distance_To_Fire_Points_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Horizontal_Distance_To_Fire_Points_xf'), 'Wilderness_Area_xf': TensorSpec(shape=(None,), dtype=tf.int32, name='inputs/Wilderness_Area_xf'), 'Horizontal_Distance_To_Hydrology_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Horizontal_Distance_To_Hydrology_xf'), 'Vertical_Distance_To_Hydrology_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Vertical_Distance_To_Hydrology_xf'), 'Aspect_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Aspect_xf'), 'Slope_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Slope_xf'), 'Horizontal_Distance_To_Roadways_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Horizontal_Distance_To_Roadways_xf'), 'Elevation_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Elevation_xf')}
        Argument #2
          DType: bool
          Value: False
        Argument #3
          DType: NoneType
          Value: None
    Option #3
      Callable with:
        Argument #1
          DType: dict
          Value: {'Hillshade_9am_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Hillshade_9am_xf'), 'Horizontal_Distance_To_Fire_Points_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Horizontal_Distance_To_Fire_Points_xf'), 'Aspect_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Aspect_xf'), 'Hillshade_3pm_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Hillshade_3pm_xf'), 'Slope_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Slope_xf'), 'Vertical_Distance_To_Hydrology_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Vertical_Distance_To_Hydrology_xf'), 'Horizontal_Distance_To_Roadways_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Horizontal_Distance_To_Roadways_xf'), 'Wilderness_Area_xf': TensorSpec(shape=(None,), dtype=tf.int32, name='inputs/Wilderness_Area_xf'), 'Horizontal_Distance_To_Hydrology_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Horizontal_Distance_To_Hydrology_xf'), 'Hillshade_Noon_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Hillshade_Noon_xf'), 'Elevation_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Elevation_xf'), 'Soil_Type_xf': TensorSpec(shape=(None,), dtype=tf.int32, name='inputs/Soil_Type_xf')}
        Argument #2
          DType: bool
          Value: True
        Argument #3
          DType: NoneType
          Value: None
    Option #4
      Callable with:
        Argument #1
          DType: dict
          Value: {'Horizontal_Distance_To_Hydrology_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Horizontal_Distance_To_Hydrology_xf'), 'Elevation_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Elevation_xf'), 'Wilderness_Area_xf': TensorSpec(shape=(None,), dtype=tf.int32, name='Wilderness_Area_xf'), 'Hillshade_3pm_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Hillshade_3pm_xf'), 'Soil_Type_xf': TensorSpec(shape=(None,), dtype=tf.int32, name='Soil_Type_xf'), 'Hillshade_Noon_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Hillshade_Noon_xf'), 'Hillshade_9am_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Hillshade_9am_xf'), 'Aspect_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Aspect_xf'), 'Vertical_Distance_To_Hydrology_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Vertical_Distance_To_Hydrology_xf'), 'Horizontal_Distance_To_Roadways_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Horizontal_Distance_To_Roadways_xf'), 'Horizontal_Distance_To_Fire_Points_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Horizontal_Distance_To_Fire_Points_xf'), 'Slope_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Slope_xf')}
        Argument #2
          DType: bool
          Value: True
        Argument #3
          DType: NoneType
          Value: None

  Function Name: '_default_save_signature'
    Option #1
      Callable with:
        Argument #1
          DType: dict
          Value: {'Aspect_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Aspect_xf'), 'Horizontal_Distance_To_Hydrology_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Horizontal_Distance_To_Hydrology_xf'), 'Wilderness_Area_xf': TensorSpec(shape=(None,), dtype=tf.int32, name='Wilderness_Area_xf'), 'Soil_Type_xf': TensorSpec(shape=(None,), dtype=tf.int32, name='Soil_Type_xf'), 'Elevation_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Elevation_xf'), 'Vertical_Distance_To_Hydrology_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Vertical_Distance_To_Hydrology_xf'), 'Hillshade_3pm_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Hillshade_3pm_xf'), 'Slope_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Slope_xf'), 'Hillshade_Noon_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Hillshade_Noon_xf'), 'Hillshade_9am_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Hillshade_9am_xf'), 'Horizontal_Distance_To_Roadways_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Horizontal_Distance_To_Roadways_xf'), 'Horizontal_Distance_To_Fire_Points_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Horizontal_Distance_To_Fire_Points_xf')}

  Function Name: 'call_and_return_all_conditional_losses'
    Option #1
      Callable with:
        Argument #1
          DType: dict
          Value: {'Horizontal_Distance_To_Fire_Points_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Horizontal_Distance_To_Fire_Points_xf'), 'Slope_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Slope_xf'), 'Hillshade_Noon_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Hillshade_Noon_xf'), 'Aspect_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Aspect_xf'), 'Vertical_Distance_To_Hydrology_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Vertical_Distance_To_Hydrology_xf'), 'Hillshade_9am_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Hillshade_9am_xf'), 'Horizontal_Distance_To_Hydrology_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Horizontal_Distance_To_Hydrology_xf'), 'Wilderness_Area_xf': TensorSpec(shape=(None,), dtype=tf.int32, name='Wilderness_Area_xf'), 'Hillshade_3pm_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Hillshade_3pm_xf'), 'Soil_Type_xf': TensorSpec(shape=(None,), dtype=tf.int32, name='Soil_Type_xf'), 'Horizontal_Distance_To_Roadways_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Horizontal_Distance_To_Roadways_xf'), 'Elevation_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Elevation_xf')}
        Argument #2
          DType: bool
          Value: False
        Argument #3
          DType: NoneType
          Value: None
    Option #2
      Callable with:
        Argument #1
          DType: dict
          Value: {'Hillshade_9am_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Hillshade_9am_xf'), 'Wilderness_Area_xf': TensorSpec(shape=(None,), dtype=tf.int32, name='inputs/Wilderness_Area_xf'), 'Elevation_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Elevation_xf'), 'Vertical_Distance_To_Hydrology_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Vertical_Distance_To_Hydrology_xf'), 'Aspect_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Aspect_xf'), 'Horizontal_Distance_To_Hydrology_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Horizontal_Distance_To_Hydrology_xf'), 'Hillshade_3pm_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Hillshade_3pm_xf'), 'Horizontal_Distance_To_Fire_Points_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Horizontal_Distance_To_Fire_Points_xf'), 'Soil_Type_xf': TensorSpec(shape=(None,), dtype=tf.int32, name='inputs/Soil_Type_xf'), 'Horizontal_Distance_To_Roadways_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Horizontal_Distance_To_Roadways_xf'), 'Hillshade_Noon_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Hillshade_Noon_xf'), 'Slope_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Slope_xf')}
        Argument #2
          DType: bool
          Value: True
        Argument #3
          DType: NoneType
          Value: None
    Option #3
      Callable with:
        Argument #1
          DType: dict
          Value: {'Elevation_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Elevation_xf'), 'Horizontal_Distance_To_Roadways_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Horizontal_Distance_To_Roadways_xf'), 'Aspect_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Aspect_xf'), 'Hillshade_3pm_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Hillshade_3pm_xf'), 'Hillshade_9am_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Hillshade_9am_xf'), 'Slope_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Slope_xf'), 'Vertical_Distance_To_Hydrology_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Vertical_Distance_To_Hydrology_xf'), 'Hillshade_Noon_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Hillshade_Noon_xf'), 'Soil_Type_xf': TensorSpec(shape=(None,), dtype=tf.int32, name='Soil_Type_xf'), 'Horizontal_Distance_To_Fire_Points_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Horizontal_Distance_To_Fire_Points_xf'), 'Horizontal_Distance_To_Hydrology_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='Horizontal_Distance_To_Hydrology_xf'), 'Wilderness_Area_xf': TensorSpec(shape=(None,), dtype=tf.int32, name='Wilderness_Area_xf')}
        Argument #2
          DType: bool
          Value: True
        Argument #3
          DType: NoneType
          Value: None
    Option #4
      Callable with:
        Argument #1
          DType: dict
          Value: {'Hillshade_3pm_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Hillshade_3pm_xf'), 'Aspect_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Aspect_xf'), 'Horizontal_Distance_To_Roadways_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Horizontal_Distance_To_Roadways_xf'), 'Horizontal_Distance_To_Hydrology_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Horizontal_Distance_To_Hydrology_xf'), 'Soil_Type_xf': TensorSpec(shape=(None,), dtype=tf.int32, name='inputs/Soil_Type_xf'), 'Horizontal_Distance_To_Fire_Points_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Horizontal_Distance_To_Fire_Points_xf'), 'Hillshade_9am_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Hillshade_9am_xf'), 'Hillshade_Noon_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Hillshade_Noon_xf'), 'Vertical_Distance_To_Hydrology_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Vertical_Distance_To_Hydrology_xf'), 'Elevation_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Elevation_xf'), 'Wilderness_Area_xf': TensorSpec(shape=(None,), dtype=tf.int32, name='inputs/Wilderness_Area_xf'), 'Slope_xf': TensorSpec(shape=(None,), dtype=tf.float32, name='inputs/Slope_xf')}
        Argument #2
          DType: bool
          Value: False
        Argument #3
          DType: NoneType
          Value: None

Note: If you encounter a FileNotFoundError, please disregard it.
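
You can also inspect the pushed SavedModel from Python rather than the CLI. A minimal sketch that loads the model and prints its serving_default signature, which (per the saved_model_cli output above) accepts a batch of serialized tf.Example strings and returns (-1, 7) class scores:

# Load the pushed SavedModel and inspect its serving signature.
loaded_model = tf.saved_model.load(latest_pushed_model)
serving_fn = loaded_model.signatures['serving_default']
print(serving_fn.structured_input_signature)
print(serving_fn.structured_outputs)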

Next steps

This concludes your introductory walkthrough of TFX pipeline components. In this lab, you used TFX to analyze, understand, and pre-process the dataset and to train, analyze, validate, and deploy a multi-class classification model to predict the type of forest cover from cartographic features. You utilized a TFX Interactive Context for prototype development of a TFX pipeline directly in a Jupyter notebook. Next, you worked with the TFDV library to modify your dataset schema, adding feature constraints that catch data anomalies which can negatively impact your model’s performance. You utilized the TFT library for feature preprocessing so that feature transformations are consistent between training and serving. Lastly, using the TFMA library, you added model performance constraints to ensure that only models more accurate than previous runs are pushed to production.

The next labs in the series will guide you through developing a TFX pipeline, deploying and running the pipeline on AI Platform Pipelines, and automating the pipeline build and deployment processes with Cloud Build.

License

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at https://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.