# Copyright 2023 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
Author(s): Polong Lin
This notebook covers the essentials of prompt engineering, including some best practices.
Learn more about prompt design in the official documentation.
In this notebook, you learn best practices around prompt engineering – how to design prompts to improve the quality of your responses.
This notebook covers the following best practices for prompt engineering: writing concise prompts, being specific and well-defined, asking one task at a time, watching out for hallucinations, turning generative tasks into classification tasks to reduce output variability, and improving response quality by including examples.
This tutorial uses billable components of Google Cloud, in particular Vertex AI:
Learn about Vertex AI pricing, and use the Pricing Calculator to generate a cost estimate based on your projected usage.
!pip install google-cloud-aiplatform protobuf==3.19.3 --upgrade --user
Requirement already satisfied: google-cloud-aiplatform in /home/jupyter/.local/lib/python3.10/site-packages (1.44.0)
Collecting protobuf==3.19.3
Using cached protobuf-3.19.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (806 bytes)
Requirement already satisfied: google-api-core!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.34.1 in /home/jupyter/.local/lib/python3.10/site-packages (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.34.1->google-cloud-aiplatform) (2.18.0)
Requirement already satisfied: google-auth<3.0.0dev,>=2.14.1 in /opt/conda/lib/python3.10/site-packages (from google-cloud-aiplatform) (2.28.1)
Requirement already satisfied: proto-plus<2.0.0dev,>=1.22.0 in /opt/conda/lib/python3.10/site-packages (from google-cloud-aiplatform) (1.23.0)
INFO: pip is looking at multiple versions of google-cloud-aiplatform to determine which version is compatible with other requirements. This could take a while.
Collecting google-cloud-aiplatform
Using cached google_cloud_aiplatform-1.44.0-py2.py3-none-any.whl.metadata (27 kB)
Using cached google_cloud_aiplatform-1.43.0-py2.py3-none-any.whl.metadata (27 kB)
Using cached google_cloud_aiplatform-1.42.1-py2.py3-none-any.whl.metadata (27 kB)
Using cached google_cloud_aiplatform-1.42.0-py2.py3-none-any.whl.metadata (27 kB)
INFO: pip is still looking at multiple versions of google-cloud-aiplatform to determine which version is compatible with other requirements. This could take a while.
Using cached google_cloud_aiplatform-1.41.0-py2.py3-none-any.whl.metadata (27 kB)
Using cached google_cloud_aiplatform-1.40.0-py2.py3-none-any.whl.metadata (27 kB)
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. See https://pip.pypa.io/warnings/backtracking for guidance. If you want to abort this run, press Ctrl + C.
Using cached google_cloud_aiplatform-1.39.0-py2.py3-none-any.whl.metadata (28 kB)
Using cached google_cloud_aiplatform-1.38.1-py2.py3-none-any.whl.metadata (27 kB)
Using cached google_cloud_aiplatform-1.38.0-py2.py3-none-any.whl.metadata (27 kB)
Using cached google_cloud_aiplatform-1.37.0-py2.py3-none-any.whl.metadata (27 kB)
Using cached google_cloud_aiplatform-1.36.4-py2.py3-none-any.whl.metadata (27 kB)
Using cached google_cloud_aiplatform-1.36.3-py2.py3-none-any.whl.metadata (27 kB)
Using cached google_cloud_aiplatform-1.36.2-py2.py3-none-any.whl.metadata (27 kB)
Using cached google_cloud_aiplatform-1.36.1-py2.py3-none-any.whl.metadata (27 kB)
Using cached google_cloud_aiplatform-1.36.0-py2.py3-none-any.whl.metadata (27 kB)
Using cached google_cloud_aiplatform-1.35.0-py2.py3-none-any.whl.metadata (27 kB)
Using cached google_cloud_aiplatform-1.34.0-py2.py3-none-any.whl.metadata (28 kB)
Using cached google_cloud_aiplatform-1.33.1-py2.py3-none-any.whl.metadata (27 kB)
Using cached google_cloud_aiplatform-1.33.0-py2.py3-none-any.whl.metadata (27 kB)
Using cached google_cloud_aiplatform-1.32.0-py2.py3-none-any.whl.metadata (25 kB)
Using cached google_cloud_aiplatform-1.31.1-py2.py3-none-any.whl.metadata (25 kB)
Using cached google_cloud_aiplatform-1.31.0-py2.py3-none-any.whl.metadata (25 kB)
Using cached google_cloud_aiplatform-1.30.1-py2.py3-none-any.whl.metadata (24 kB)
Using cached google_cloud_aiplatform-1.30.0-py2.py3-none-any.whl.metadata (24 kB)
Using cached google_cloud_aiplatform-1.29.0-py2.py3-none-any.whl.metadata (24 kB)
Using cached google_cloud_aiplatform-1.28.1-py2.py3-none-any.whl.metadata (24 kB)
Using cached google_cloud_aiplatform-1.28.0-py2.py3-none-any.whl.metadata (24 kB)
Using cached google_cloud_aiplatform-1.27.1-py2.py3-none-any.whl.metadata (24 kB)
Using cached google_cloud_aiplatform-1.27.0-py2.py3-none-any.whl.metadata (24 kB)
Using cached google_cloud_aiplatform-1.26.1-py2.py3-none-any.whl.metadata (24 kB)
Using cached google_cloud_aiplatform-1.26.0-py2.py3-none-any.whl.metadata (24 kB)
Using cached google_cloud_aiplatform-1.25.0-py2.py3-none-any.whl.metadata (24 kB)
Using cached google_cloud_aiplatform-1.24.1-py2.py3-none-any.whl.metadata (24 kB)
Using cached google_cloud_aiplatform-1.24.0-py2.py3-none-any.whl.metadata (24 kB)
Using cached google_cloud_aiplatform-1.23.0-py2.py3-none-any.whl.metadata (24 kB)
Using cached google_cloud_aiplatform-1.22.1-py2.py3-none-any.whl.metadata (24 kB)
Using cached google_cloud_aiplatform-1.22.0-py2.py3-none-any.whl.metadata (24 kB)
Using cached google_cloud_aiplatform-1.21.0-py2.py3-none-any.whl.metadata (24 kB)
Using cached google_cloud_aiplatform-1.20.0-py2.py3-none-any.whl.metadata (25 kB)
Using cached google_cloud_aiplatform-1.19.1-py2.py3-none-any.whl.metadata (25 kB)
Using cached google_cloud_aiplatform-1.19.0-py2.py3-none-any.whl.metadata (25 kB)
Using cached google_cloud_aiplatform-1.18.3-py2.py3-none-any.whl.metadata (25 kB)
Using cached google_cloud_aiplatform-1.18.2-py2.py3-none-any.whl.metadata (25 kB)
Using cached google_cloud_aiplatform-1.18.1-py2.py3-none-any.whl.metadata (25 kB)
Using cached google_cloud_aiplatform-1.18.0-py2.py3-none-any.whl.metadata (25 kB)
Using cached google_cloud_aiplatform-1.17.1-py2.py3-none-any.whl.metadata (25 kB)
Collecting packaging<22.0.0dev,>=14.3 (from google-cloud-aiplatform)
Using cached packaging-21.3-py3-none-any.whl.metadata (15 kB)
Requirement already satisfied: google-cloud-storage<3.0.0dev,>=1.32.0 in /opt/conda/lib/python3.10/site-packages (from google-cloud-aiplatform) (2.14.0)
Collecting google-cloud-bigquery<3.0.0dev,>=1.15.0 (from google-cloud-aiplatform)
Using cached google_cloud_bigquery-2.34.4-py2.py3-none-any.whl.metadata (7.9 kB)
Requirement already satisfied: google-cloud-resource-manager<3.0.0dev,>=1.3.3 in /opt/conda/lib/python3.10/site-packages (from google-cloud-aiplatform) (1.12.3)
Requirement already satisfied: googleapis-common-protos<2.0.dev0,>=1.56.2 in /opt/conda/lib/python3.10/site-packages (from google-api-core!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.34.1->google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.34.1->google-cloud-aiplatform) (1.62.0)
INFO: pip is looking at multiple versions of google-api-core to determine which version is compatible with other requirements. This could take a while.
Collecting google-api-core!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.32.0 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.32.0->google-cloud-aiplatform)
Using cached google_api_core-2.17.1-py3-none-any.whl.metadata (2.7 kB)
Using cached google_api_core-2.17.0-py3-none-any.whl.metadata (2.7 kB)
Using cached google_api_core-2.16.2-py3-none-any.whl.metadata (2.7 kB)
Using cached google_api_core-2.16.1-py3-none-any.whl.metadata (2.7 kB)
Using cached google_api_core-2.16.0-py3-none-any.whl.metadata (2.7 kB)
Using cached google_api_core-2.15.0-py3-none-any.whl.metadata (2.7 kB)
Using cached google_api_core-2.14.0-py3-none-any.whl.metadata (2.6 kB)
INFO: pip is still looking at multiple versions of google-api-core to determine which version is compatible with other requirements. This could take a while.
Using cached google_api_core-2.13.1-py3-none-any.whl.metadata (2.6 kB)
Using cached google_api_core-2.13.0-py3-none-any.whl.metadata (2.7 kB)
Using cached google_api_core-2.12.0-py3-none-any.whl.metadata (2.7 kB)
Using cached google_api_core-2.11.1-py3-none-any.whl.metadata (2.7 kB)
Using cached google_api_core-2.11.0-py3-none-any.whl.metadata (2.6 kB)
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. See https://pip.pypa.io/warnings/backtracking for guidance. If you want to abort this run, press Ctrl + C.
Using cached google_api_core-2.10.2-py3-none-any.whl.metadata (2.4 kB)
Using cached google_api_core-2.10.1-py3-none-any.whl.metadata (2.3 kB)
Using cached google_api_core-2.10.0-py3-none-any.whl.metadata (2.3 kB)
Using cached google_api_core-2.9.0-py3-none-any.whl.metadata (2.3 kB)
Using cached google_api_core-2.8.2-py3-none-any.whl.metadata (2.1 kB)
Requirement already satisfied: requests<3.0.0dev,>=2.18.0 in /opt/conda/lib/python3.10/site-packages (from google-api-core!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.32.0->google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.32.0->google-cloud-aiplatform) (2.31.0)
INFO: pip is looking at multiple versions of google-api-core[grpc] to determine which version is compatible with other requirements. This could take a while.
INFO: pip is still looking at multiple versions of google-api-core[grpc] to determine which version is compatible with other requirements. This could take a while.
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. See https://pip.pypa.io/warnings/backtracking for guidance. If you want to abort this run, press Ctrl + C.
Requirement already satisfied: grpcio<2.0dev,>=1.33.2 in /opt/conda/lib/python3.10/site-packages (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.32.0->google-cloud-aiplatform) (1.48.1)
Requirement already satisfied: grpcio-status<2.0dev,>=1.33.2 in /opt/conda/lib/python3.10/site-packages (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.32.0->google-cloud-aiplatform) (1.48.1)
Requirement already satisfied: google-cloud-core<3.0.0dev,>=1.4.1 in /opt/conda/lib/python3.10/site-packages (from google-cloud-bigquery<3.0.0dev,>=1.15.0->google-cloud-aiplatform) (2.4.1)
Requirement already satisfied: google-resumable-media<3.0dev,>=0.6.0 in /opt/conda/lib/python3.10/site-packages (from google-cloud-bigquery<3.0.0dev,>=1.15.0->google-cloud-aiplatform) (2.7.0)
Requirement already satisfied: python-dateutil<3.0dev,>=2.7.2 in /opt/conda/lib/python3.10/site-packages (from google-cloud-bigquery<3.0.0dev,>=1.15.0->google-cloud-aiplatform) (2.9.0)
INFO: pip is looking at multiple versions of google-cloud-resource-manager to determine which version is compatible with other requirements. This could take a while.
Collecting google-cloud-resource-manager<3.0.0dev,>=1.3.3 (from google-cloud-aiplatform)
Using cached google_cloud_resource_manager-1.12.2-py2.py3-none-any.whl.metadata (5.3 kB)
Using cached google_cloud_resource_manager-1.12.1-py2.py3-none-any.whl.metadata (5.3 kB)
Using cached google_cloud_resource_manager-1.12.0-py2.py3-none-any.whl.metadata (5.2 kB)
Using cached google_cloud_resource_manager-1.11.0-py2.py3-none-any.whl.metadata (5.2 kB)
Using cached google_cloud_resource_manager-1.10.4-py2.py3-none-any.whl.metadata (5.2 kB)
Using cached google_cloud_resource_manager-1.10.3-py2.py3-none-any.whl.metadata (5.2 kB)
Using cached google_cloud_resource_manager-1.10.2-py2.py3-none-any.whl.metadata (5.1 kB)
INFO: pip is still looking at multiple versions of google-cloud-resource-manager to determine which version is compatible with other requirements. This could take a while.
Using cached google_cloud_resource_manager-1.10.1-py2.py3-none-any.whl.metadata (5.2 kB)
Using cached google_cloud_resource_manager-1.10.0-py2.py3-none-any.whl.metadata (5.2 kB)
Using cached google_cloud_resource_manager-1.9.1-py2.py3-none-any.whl.metadata (5.2 kB)
Using cached google_cloud_resource_manager-1.9.0-py2.py3-none-any.whl.metadata (5.2 kB)
Using cached google_cloud_resource_manager-1.8.1-py2.py3-none-any.whl.metadata (5.2 kB)
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. See https://pip.pypa.io/warnings/backtracking for guidance. If you want to abort this run, press Ctrl + C.
Using cached google_cloud_resource_manager-1.8.0-py2.py3-none-any.whl.metadata (5.2 kB)
Using cached google_cloud_resource_manager-1.7.0-py2.py3-none-any.whl.metadata (5.0 kB)
Using cached google_cloud_resource_manager-1.6.3-py2.py3-none-any.whl.metadata (5.0 kB)
Using cached google_cloud_resource_manager-1.6.2-py2.py3-none-any.whl.metadata (4.9 kB)
Using cached google_cloud_resource_manager-1.6.1-py2.py3-none-any.whl.metadata (4.9 kB)
Requirement already satisfied: grpc-google-iam-v1<1.0.0dev,>=0.12.4 in /opt/conda/lib/python3.10/site-packages (from google-cloud-resource-manager<3.0.0dev,>=1.3.3->google-cloud-aiplatform) (0.12.7)
Requirement already satisfied: google-crc32c<2.0dev,>=1.0 in /opt/conda/lib/python3.10/site-packages (from google-cloud-storage<3.0.0dev,>=1.32.0->google-cloud-aiplatform) (1.5.0)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /opt/conda/lib/python3.10/site-packages (from packaging<22.0.0dev,>=14.3->google-cloud-aiplatform) (3.1.2)
Requirement already satisfied: cachetools<6.0,>=2.0.0 in /opt/conda/lib/python3.10/site-packages (from google-auth<3.0.0dev,>=2.14.1->google-cloud-aiplatform) (4.2.4)
Requirement already satisfied: pyasn1-modules>=0.2.1 in /opt/conda/lib/python3.10/site-packages (from google-auth<3.0.0dev,>=2.14.1->google-cloud-aiplatform) (0.3.0)
Requirement already satisfied: rsa<5,>=3.1.4 in /opt/conda/lib/python3.10/site-packages (from google-auth<3.0.0dev,>=2.14.1->google-cloud-aiplatform) (4.9)
INFO: pip is looking at multiple versions of googleapis-common-protos to determine which version is compatible with other requirements. This could take a while.
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.32.0->google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.32.0->google-cloud-aiplatform)
Using cached googleapis_common_protos-1.63.0-py2.py3-none-any.whl.metadata (1.5 kB)
Using cached googleapis_common_protos-1.61.0-py2.py3-none-any.whl.metadata (1.5 kB)
Using cached googleapis_common_protos-1.60.0-py2.py3-none-any.whl.metadata (1.5 kB)
Using cached googleapis_common_protos-1.59.1-py2.py3-none-any.whl.metadata (1.5 kB)
Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl.metadata (1.5 kB)
Using cached googleapis_common_protos-1.58.0-py2.py3-none-any.whl.metadata (1.5 kB)
Using cached googleapis_common_protos-1.57.1-py2.py3-none-any.whl.metadata (1.5 kB)
INFO: pip is still looking at multiple versions of googleapis-common-protos to determine which version is compatible with other requirements. This could take a while.
Using cached googleapis_common_protos-1.57.0-py2.py3-none-any.whl.metadata (1.5 kB)
Using cached googleapis_common_protos-1.56.4-py2.py3-none-any.whl.metadata (1.3 kB)
INFO: pip is looking at multiple versions of grpc-google-iam-v1 to determine which version is compatible with other requirements. This could take a while.
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-resource-manager<3.0.0dev,>=1.3.3->google-cloud-aiplatform)
Using cached grpc_google_iam_v1-0.13.0-py2.py3-none-any.whl.metadata (3.3 kB)
Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl.metadata (3.2 kB)
Using cached grpc_google_iam_v1-0.12.4-py2.py3-none-any.whl.metadata (3.2 kB)
Requirement already satisfied: six>=1.5.2 in /opt/conda/lib/python3.10/site-packages (from grpcio<2.0dev,>=1.33.2->google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.32.0->google-cloud-aiplatform) (1.16.0)
Requirement already satisfied: charset-normalizer<4,>=2 in /opt/conda/lib/python3.10/site-packages (from requests<3.0.0dev,>=2.18.0->google-api-core!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.32.0->google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.32.0->google-cloud-aiplatform) (3.3.2)
Requirement already satisfied: idna<4,>=2.5 in /opt/conda/lib/python3.10/site-packages (from requests<3.0.0dev,>=2.18.0->google-api-core!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.32.0->google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.32.0->google-cloud-aiplatform) (3.6)
Requirement already satisfied: urllib3<3,>=1.21.1 in /opt/conda/lib/python3.10/site-packages (from requests<3.0.0dev,>=2.18.0->google-api-core!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.32.0->google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.32.0->google-cloud-aiplatform) (1.26.18)
Requirement already satisfied: certifi>=2017.4.17 in /opt/conda/lib/python3.10/site-packages (from requests<3.0.0dev,>=2.18.0->google-api-core!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.32.0->google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.32.0->google-cloud-aiplatform) (2024.2.2)
INFO: pip is looking at multiple versions of googleapis-common-protos[grpc] to determine which version is compatible with other requirements. This could take a while.
INFO: pip is still looking at multiple versions of googleapis-common-protos[grpc] to determine which version is compatible with other requirements. This could take a while.
Requirement already satisfied: pyasn1<0.6.0,>=0.4.6 in /opt/conda/lib/python3.10/site-packages (from pyasn1-modules>=0.2.1->google-auth<3.0.0dev,>=2.14.1->google-cloud-aiplatform) (0.5.1)
Using cached protobuf-3.19.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.1 MB)
Using cached google_cloud_aiplatform-1.17.1-py2.py3-none-any.whl (2.3 MB)
Using cached google_api_core-2.8.2-py3-none-any.whl (114 kB)
Using cached google_cloud_bigquery-2.34.4-py2.py3-none-any.whl (206 kB)
Using cached google_cloud_resource_manager-1.6.1-py2.py3-none-any.whl (231 kB)
Using cached packaging-21.3-py3-none-any.whl (40 kB)
Using cached googleapis_common_protos-1.56.4-py2.py3-none-any.whl (211 kB)
Downloading grpc_google_iam_v1-0.12.4-py2.py3-none-any.whl (26 kB)
Installing collected packages: protobuf, packaging, googleapis-common-protos, google-api-core, grpc-google-iam-v1, google-cloud-resource-manager, google-cloud-bigquery, google-cloud-aiplatform
Attempting uninstall: protobuf
Found existing installation: protobuf 4.25.3
Uninstalling protobuf-4.25.3:
Successfully uninstalled protobuf-4.25.3
Attempting uninstall: google-api-core
Found existing installation: google-api-core 2.18.0
Uninstalling google-api-core-2.18.0:
Successfully uninstalled google-api-core-2.18.0
Attempting uninstall: google-cloud-bigquery
Found existing installation: google-cloud-bigquery 3.19.0
Uninstalling google-cloud-bigquery-3.19.0:
Successfully uninstalled google-cloud-bigquery-3.19.0
Attempting uninstall: google-cloud-aiplatform
Found existing installation: google-cloud-aiplatform 1.44.0
Uninstalling google-cloud-aiplatform-1.44.0:
Successfully uninstalled google-cloud-aiplatform-1.44.0
WARNING: The script tb-gcp-uploader is installed in '/home/jupyter/.local/bin' which is not on PATH.
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
kfp 2.7.0 requires protobuf<5,>=4.21.1, but you have protobuf 3.19.3 which is incompatible.
kfp-pipeline-spec 0.3.0 requires protobuf<5,>=4.21.1, but you have protobuf 3.19.3 which is incompatible.
conda 24.1.2 requires packaging>=23.0, but you have packaging 21.3 which is incompatible.
google-api-python-client 1.8.0 requires google-api-core<2dev,>=1.13.0, but you have google-api-core 2.8.2 which is incompatible.
google-cloud-artifact-registry 1.11.3 requires google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0dev,>=1.34.1, but you have google-api-core 2.8.2 which is incompatible.
google-cloud-artifact-registry 1.11.3 requires protobuf!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.19.5, but you have protobuf 3.19.3 which is incompatible.
google-cloud-bigquery-storage 2.16.2 requires protobuf!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.19.5, but you have protobuf 3.19.3 which is incompatible.
google-cloud-dlp 3.15.3 requires google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0dev,>=1.34.1, but you have google-api-core 2.8.2 which is incompatible.
google-cloud-dlp 3.15.3 requires protobuf!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.19.5, but you have protobuf 3.19.3 which is incompatible.
google-cloud-monitoring 2.19.3 requires google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0dev,>=1.34.1, but you have google-api-core 2.8.2 which is incompatible.
google-cloud-monitoring 2.19.3 requires protobuf!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.19.5, but you have protobuf 3.19.3 which is incompatible.
google-cloud-pubsub 2.20.1 requires google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0dev,>=1.34.0, but you have google-api-core 2.8.2 which is incompatible.
google-cloud-pubsub 2.20.1 requires grpcio<2.0dev,>=1.51.3, but you have grpcio 1.48.1 which is incompatible.
google-cloud-pubsub 2.20.1 requires protobuf!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.19.5, but you have protobuf 3.19.3 which is incompatible.
google-cloud-spanner 3.43.0 requires google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0dev,>=1.34.0, but you have google-api-core 2.8.2 which is incompatible.
google-cloud-spanner 3.43.0 requires protobuf!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.19.5, but you have protobuf 3.19.3 which is incompatible.
google-cloud-vision 3.7.2 requires google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0dev,>=1.34.1, but you have google-api-core 2.8.2 which is incompatible.
google-cloud-vision 3.7.2 requires protobuf!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.19.5, but you have protobuf 3.19.3 which is incompatible.
tensorboard 2.15.2 requires grpcio>=1.48.2, but you have grpcio 1.48.1 which is incompatible.
tensorboard 2.15.2 requires protobuf!=4.24.0,>=3.19.6, but you have protobuf 3.19.3 which is incompatible.
tensorboardx 2.6.2.2 requires protobuf>=3.20, but you have protobuf 3.19.3 which is incompatible.
tensorboard-plugin-profile 2.15.1 requires protobuf<5.0.0dev,>=3.19.6, but you have protobuf 3.19.3 which is incompatible.
tensorflow 2.15.0 requires protobuf!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.20.3, but you have protobuf 3.19.3 which is incompatible.
tensorflow-datasets 4.9.4 requires protobuf>=3.20, but you have protobuf 3.19.3 which is incompatible.
tensorflow-hub 0.16.1 requires protobuf>=3.19.6, but you have protobuf 3.19.3 which is incompatible.
tensorflow-serving-api 2.14.1 requires protobuf!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.20.3, but you have protobuf 3.19.3 which is incompatible.
ydata-profiling 4.6.0 requires seaborn<0.13,>=0.10.1, but you have seaborn 0.13.2 which is incompatible.
Successfully installed google-api-core-2.8.2 google-cloud-aiplatform-1.17.1 google-cloud-bigquery-2.34.4 google-cloud-resource-manager-1.6.1 googleapis-common-protos-1.56.4 grpc-google-iam-v1-0.12.4 packaging-21.3 protobuf-3.19.3
!pip install -U google-cloud-aiplatform "shapely<2"
Requirement already satisfied: google-cloud-aiplatform in /home/jupyter/.local/lib/python3.10/site-packages (1.17.1)
Collecting google-cloud-aiplatform
Using cached google_cloud_aiplatform-1.44.0-py2.py3-none-any.whl.metadata (27 kB)
Collecting shapely<2
Downloading Shapely-1.8.5.post1-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl.metadata (43 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 43.1/43.1 kB 4.1 MB/s eta 0:00:00
Requirement already satisfied: google-api-core!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.34.1 in /home/jupyter/.local/lib/python3.10/site-packages (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.34.1->google-cloud-aiplatform) (2.8.2)
Requirement already satisfied: google-auth<3.0.0dev,>=2.14.1 in /opt/conda/lib/python3.10/site-packages (from google-cloud-aiplatform) (2.28.1)
Requirement already satisfied: proto-plus<2.0.0dev,>=1.22.0 in /opt/conda/lib/python3.10/site-packages (from google-cloud-aiplatform) (1.23.0)
Collecting protobuf!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.19.5 (from google-cloud-aiplatform)
Using cached protobuf-4.25.3-cp37-abi3-manylinux2014_x86_64.whl.metadata (541 bytes)
Requirement already satisfied: packaging>=14.3 in /home/jupyter/.local/lib/python3.10/site-packages (from google-cloud-aiplatform) (21.3)
Requirement already satisfied: google-cloud-storage<3.0.0dev,>=1.32.0 in /opt/conda/lib/python3.10/site-packages (from google-cloud-aiplatform) (2.14.0)
Requirement already satisfied: google-cloud-bigquery<4.0.0dev,>=1.15.0 in /home/jupyter/.local/lib/python3.10/site-packages (from google-cloud-aiplatform) (2.34.4)
Requirement already satisfied: google-cloud-resource-manager<3.0.0dev,>=1.3.3 in /home/jupyter/.local/lib/python3.10/site-packages (from google-cloud-aiplatform) (1.6.1)
Requirement already satisfied: googleapis-common-protos<2.0dev,>=1.56.2 in /home/jupyter/.local/lib/python3.10/site-packages (from google-api-core!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.34.1->google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.34.1->google-cloud-aiplatform) (1.56.4)
Requirement already satisfied: requests<3.0.0dev,>=2.18.0 in /opt/conda/lib/python3.10/site-packages (from google-api-core!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.34.1->google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.34.1->google-cloud-aiplatform) (2.31.0)
Requirement already satisfied: grpcio<2.0dev,>=1.33.2 in /opt/conda/lib/python3.10/site-packages (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.34.1->google-cloud-aiplatform) (1.48.1)
Requirement already satisfied: grpcio-status<2.0dev,>=1.33.2 in /opt/conda/lib/python3.10/site-packages (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.34.1->google-cloud-aiplatform) (1.48.1)
Requirement already satisfied: cachetools<6.0,>=2.0.0 in /opt/conda/lib/python3.10/site-packages (from google-auth<3.0.0dev,>=2.14.1->google-cloud-aiplatform) (4.2.4)
Requirement already satisfied: pyasn1-modules>=0.2.1 in /opt/conda/lib/python3.10/site-packages (from google-auth<3.0.0dev,>=2.14.1->google-cloud-aiplatform) (0.3.0)
Requirement already satisfied: rsa<5,>=3.1.4 in /opt/conda/lib/python3.10/site-packages (from google-auth<3.0.0dev,>=2.14.1->google-cloud-aiplatform) (4.9)
Requirement already satisfied: google-cloud-core<3.0.0dev,>=1.4.1 in /opt/conda/lib/python3.10/site-packages (from google-cloud-bigquery<4.0.0dev,>=1.15.0->google-cloud-aiplatform) (2.4.1)
Requirement already satisfied: google-resumable-media<3.0dev,>=0.6.0 in /opt/conda/lib/python3.10/site-packages (from google-cloud-bigquery<4.0.0dev,>=1.15.0->google-cloud-aiplatform) (2.7.0)
Downloading protobuf-3.20.3-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl.metadata (679 bytes)
Requirement already satisfied: python-dateutil<3.0dev,>=2.7.2 in /opt/conda/lib/python3.10/site-packages (from google-cloud-bigquery<4.0.0dev,>=1.15.0->google-cloud-aiplatform) (2.9.0)
Requirement already satisfied: grpc-google-iam-v1<1.0.0dev,>=0.12.4 in /home/jupyter/.local/lib/python3.10/site-packages (from google-cloud-resource-manager<3.0.0dev,>=1.3.3->google-cloud-aiplatform) (0.12.4)
Requirement already satisfied: google-crc32c<2.0dev,>=1.0 in /opt/conda/lib/python3.10/site-packages (from google-cloud-storage<3.0.0dev,>=1.32.0->google-cloud-aiplatform) (1.5.0)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /opt/conda/lib/python3.10/site-packages (from packaging>=14.3->google-cloud-aiplatform) (3.1.2)
Requirement already satisfied: six>=1.5.2 in /opt/conda/lib/python3.10/site-packages (from grpcio<2.0dev,>=1.33.2->google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.34.1->google-cloud-aiplatform) (1.16.0)
Requirement already satisfied: pyasn1<0.6.0,>=0.4.6 in /opt/conda/lib/python3.10/site-packages (from pyasn1-modules>=0.2.1->google-auth<3.0.0dev,>=2.14.1->google-cloud-aiplatform) (0.5.1)
Requirement already satisfied: charset-normalizer<4,>=2 in /opt/conda/lib/python3.10/site-packages (from requests<3.0.0dev,>=2.18.0->google-api-core!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.34.1->google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.34.1->google-cloud-aiplatform) (3.3.2)
Requirement already satisfied: idna<4,>=2.5 in /opt/conda/lib/python3.10/site-packages (from requests<3.0.0dev,>=2.18.0->google-api-core!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.34.1->google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.34.1->google-cloud-aiplatform) (3.6)
Requirement already satisfied: urllib3<3,>=1.21.1 in /opt/conda/lib/python3.10/site-packages (from requests<3.0.0dev,>=2.18.0->google-api-core!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.34.1->google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.34.1->google-cloud-aiplatform) (1.26.18)
Requirement already satisfied: certifi>=2017.4.17 in /opt/conda/lib/python3.10/site-packages (from requests<3.0.0dev,>=2.18.0->google-api-core!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.34.1->google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.34.1->google-cloud-aiplatform) (2024.2.2)
Using cached google_cloud_aiplatform-1.44.0-py2.py3-none-any.whl (4.2 MB)
Downloading Shapely-1.8.5.post1-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (2.0 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.0/2.0 MB 64.3 MB/s eta 0:00:00
Downloading protobuf-3.20.3-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (1.1 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.1/1.1 MB 56.2 MB/s eta 0:00:00
Installing collected packages: shapely, protobuf, google-cloud-aiplatform
Attempting uninstall: shapely
Found existing installation: shapely 2.0.3
Uninstalling shapely-2.0.3:
Successfully uninstalled shapely-2.0.3
Attempting uninstall: protobuf
Found existing installation: protobuf 3.19.3
Uninstalling protobuf-3.19.3:
Successfully uninstalled protobuf-3.19.3
Attempting uninstall: google-cloud-aiplatform
Found existing installation: google-cloud-aiplatform 1.17.1
Uninstalling google-cloud-aiplatform-1.17.1:
Successfully uninstalled google-cloud-aiplatform-1.17.1
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
kfp 2.7.0 requires protobuf<5,>=4.21.1, but you have protobuf 3.20.3 which is incompatible.
kfp-pipeline-spec 0.3.0 requires protobuf<5,>=4.21.1, but you have protobuf 3.20.3 which is incompatible.
google-api-python-client 1.8.0 requires google-api-core<2dev,>=1.13.0, but you have google-api-core 2.8.2 which is incompatible.
google-cloud-artifact-registry 1.11.3 requires google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0dev,>=1.34.1, but you have google-api-core 2.8.2 which is incompatible.
google-cloud-dlp 3.15.3 requires google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0dev,>=1.34.1, but you have google-api-core 2.8.2 which is incompatible.
google-cloud-monitoring 2.19.3 requires google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0dev,>=1.34.1, but you have google-api-core 2.8.2 which is incompatible.
google-cloud-pubsub 2.20.1 requires google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0dev,>=1.34.0, but you have google-api-core 2.8.2 which is incompatible.
google-cloud-pubsub 2.20.1 requires grpcio<2.0dev,>=1.51.3, but you have grpcio 1.48.1 which is incompatible.
google-cloud-spanner 3.43.0 requires google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0dev,>=1.34.0, but you have google-api-core 2.8.2 which is incompatible.
google-cloud-vision 3.7.2 requires google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.10.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.*,<3.0.0dev,>=1.34.1, but you have google-api-core 2.8.2 which is incompatible.
tensorboard 2.15.2 requires grpcio>=1.48.2, but you have grpcio 1.48.1 which is incompatible.
Successfully installed google-cloud-aiplatform-1.43.0 protobuf-3.20.3 shapely-1.8.5.post1
Colab only: Run the following cell to restart the kernel, or use the restart button in the toolbar. For Vertex AI Workbench, you can restart the terminal using the button at the top.
# Automatically restart kernel after installs so that your environment can access the new packages
import IPython
app = IPython.Application.instance()
app.kernel.do_shutdown(True)
{'status': 'ok', 'restart': True}
import sys

# If running in Colab, authenticate the user for access to Google Cloud resources.
if "google.colab" in sys.modules:
    from google.colab import auth

    auth.authenticate_user()
Install the Google Cloud SDK.
Obtain authentication credentials. Create local credentials by running the following command and following the OAuth 2.0 flow (read more about the command in the gcloud CLI documentation):
gcloud auth application-default login
Colab only: Run the following cell to initialize the Vertex AI SDK. For Vertex AI Workbench, you don’t need to run this.
import vertexai
PROJECT_ID = "qwiklabs-gcp-04-91900298e456" # @param {type:"string"}
REGION = "us-central1" # @param {type:"string"}
vertexai.init(project=PROJECT_ID, location=REGION)
from vertexai.language_models import TextGenerationModel
from vertexai.language_models import ChatModel
generation_model = TextGenerationModel.from_pretrained("text-bison@001")
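The predict call used throughout this notebook also accepts optional sampling parameters such as temperature, top_k, and top_p alongside max_output_tokens. The snippet below is a minimal sketch to sanity-check the model; the prompt and parameter values are illustrative examples, not recommendations from this notebook.
# Illustrative sanity check: call the model with explicit sampling parameters.
# The values below are arbitrary examples, not tuned recommendations.
response = generation_model.predict(
    prompt="Say hello in one short sentence.",
    max_output_tokens=64,
    temperature=0.2,  # lower temperature makes the output more deterministic
    top_k=40,
    top_p=0.8,
)
print(response.text)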
Prompt engineering is all about designing your prompts so that the response is what you were actually hoping to see.
The idea of using “unfancy” prompts is to minimize the noise in your prompt to reduce the possibility of the LLM misinterpreting the intent of the prompt. Below are a few guidelines on how to engineer “unfancy” prompts.
In this section, you'll cover the following best practices when engineering prompts:
* Be concise
* Be specific, and well-defined
* Ask one task at a time
* Watch out for hallucinations
* Turn generative tasks into classification tasks to reduce output variability
* Improve response quality by including examples
🛑 Not recommended. The prompt below is unnecessarily verbose.
prompt = "What do you think could be a good name for a flower shop that specializes in selling bouquets of dried flowers more than fresh flowers? Thank you!"
print(generation_model.predict(prompt=prompt, max_output_tokens=256).text)
* **Dried & Lovely**
* **Dried Blooms**
* **Dried Florals**
* **Dried Arrangements**
* **Preserved Petals**
* **Forever Flowers**
* **Timeless Blooms**
* **Dried Flower Boutique**
* **Dried Flower Shop**
✅ Recommended. The prompt below is to the point and concise.
prompt = "Suggest a name for a flower shop that sells bouquets of dried flowers"
print(generation_model.predict(prompt=prompt, max_output_tokens=256).text)
* **Dried Blooms**
* **Preserved Petals**
* **Forever Flowers**
* **Dried & Wild**
* **Naturally Beautiful**
* **Blooming Memories**
* **Sentimental Bouquets**
* **Timeless Treasures**
Suppose that you want to brainstorm creative ways to describe Earth.
🛑 Not recommended. The prompt below is too generic.
prompt = "Tell me about Earth"
print(generation_model.predict(prompt=prompt, max_output_tokens=256).text)
Earth is the third planet from the Sun, and the only astronomical object known to harbor life. It is the densest and fifth-largest of the eight planets in the Solar System. Earth's atmosphere is composed of 78% nitrogen, 21% oxygen, 1% other gases, and water vapor. The Earth's surface is divided into several tectonic plates that move around the planet's surface. The Earth's interior is divided into a solid inner core, a liquid outer core, and a mantle. The Earth's magnetic field is generated by the motion of the liquid outer core. The Earth's orbit around the Sun takes 365.256 days, or one year. The Earth's axis is tilted at an angle of 23.5 degrees, which causes the seasons. The Earth's rotation period is 24 hours, or one day.
✅ Recommended. The prompt below is specific and well-defined.
prompt = "Generate a list of ways that makes Earth unique compared to other planets"
print(generation_model.predict(prompt=prompt, max_output_tokens=256).text)
* **Earth is the only planet known to support life.** This is due to a number of factors, including its distance from the sun, its atmosphere, and its water.
* **Earth has a large moon.** The moon is thought to have played a role in the development of life on Earth, by stabilizing the planet's rotation and providing a source of tides.
* **Earth has a relatively thin atmosphere.** This atmosphere is made up of nitrogen, oxygen, and argon, and it protects the planet from harmful radiation from the sun.
* **Earth has a large amount of water.** Water is essential for life, and it covers about 70% of the Earth's surface.
* **Earth has a variety of landforms.** These landforms include mountains, valleys, plains, and deserts. They provide a variety of habitats for life.
* **Earth has a variety of climate zones.** These climate zones range from the frigid Arctic to the hot tropics. They support a wide variety of plant and animal life.
These are just a few of the ways that make Earth unique compared to other planets. Earth is a truly special planet, and it is home to a vast and diverse array of life
🛑 Not recommended. The prompt below has two parts to the question that could be asked separately.
prompt = "What's the best method of boiling water and why is the sky blue?"
print(generation_model.predict(prompt=prompt, max_output_tokens=256).text)
The best method of boiling water is to use a kettle. A kettle is a pot with a spout that is designed specifically for boiling water. It has a lid that helps to keep the heat in and prevents the water from boiling over.
The sky is blue because of a phenomenon called Rayleigh scattering. This is the scattering of light by particles that are smaller than the wavelength of light. The particles in the atmosphere that cause Rayleigh scattering are molecules of nitrogen and oxygen.
When sunlight hits these molecules, the blue light is scattered in all directions. The other colors of light are scattered less, so they do not reach our eyes as much. This is why the sky appears blue during the day.
✅ Recommended. The prompts below ask one task at a time.
prompt = "What's the best method of boiling water?"
print(generation_model.predict(prompt=prompt, max_output_tokens=256).text)
The best method of boiling water is to use a kettle. A kettle is a pot with a spout that is designed specifically for boiling water. It is typically made of metal, and has a lid that fits snugly to prevent steam from escaping. Kettles come in a variety of sizes, so you can choose one that is right for your needs. To boil water in a kettle, simply fill it with water and place it on the stove. Turn the heat to high, and wait for the water to come to a boil. Once the water is boiling, turn off the heat and carefully pour the water into your desired container.
prompt = "Why is the sky blue?"
print(generation_model.predict(prompt=prompt, max_output_tokens=256).text)
The sky is blue because of a phenomenon called Rayleigh scattering. This is the scattering of light by particles that are smaller than the wavelength of light. In the case of the sky, the particles that are doing the scattering are molecules of nitrogen and oxygen.
When sunlight hits these molecules, the blue light is scattered more than the other colors. This is because the blue light has a shorter wavelength than the other colors. The red light, orange light, and yellow light are not scattered as much, so they travel through the atmosphere to our eyes. This is why we see the sky as blue.
The amount of scattering depends on the wavelength of light and the size of the particles. The shorter the wavelength of light, the more it is scattered. The smaller the particles, the more they scatter light.
This is why the sky is blue during the day. However, at sunrise and sunset, the sunlight has to travel through more of the atmosphere to reach our eyes. This means that more of the blue light is scattered, and we see the sky as more red or orange.
The sky can also appear blue when there is a lot of dust or pollution in the air. This is because the dust and pollution particles act as additional scattering particles.
Although LLMs have been trained on a large amount of data, they can generate text containing statements not grounded in truth or reality; such responses are often referred to as "hallucinations", a consequence of the model's limited memorization capabilities. Note that simply prompting the LLM to provide a citation isn't a fix to this problem, as there are instances of LLMs providing false or inaccurate citations. Dealing with hallucinations is a fundamental challenge of LLMs and an ongoing research area, so it is important to be aware that LLMs may give you confident, correct-sounding statements that are in fact incorrect.
Note that if you intend to use LLMs for creative use cases, hallucination can actually be quite useful.
Try a prompt like the one below repeatedly. You may notice that sometimes it will confidently, but inaccurately, say "The first elephant to visit the moon was Luna".
prompt = "Who was the first elephant to visit the moon?"
print(generation_model.predict(prompt=prompt, max_output_tokens=256).text)
The first elephant to visit the moon was Luna. Luna was a female elephant who was born in 1963 at the San Diego Zoo. In 1968, she was selected to be the first elephant to visit the moon. Luna was trained to wear a spacesuit and to ride in a rocket. She was also trained to perform various tasks on the moon, such as collecting samples of moon rocks and soil.
Luna's trip to the moon was a success. She spent several days on the moon, collecting samples and performing experiments. She also became the first animal to walk on the moon. Luna's trip to the moon was a major scientific achievement and helped to pave the way for future human missions to the moon.
Luna returned to Earth in 1969. She was a national hero and was celebrated by people all over the world. Luna lived a long and happy life, dying in 2003 at the age of 40.
Clearly the chatbot is hallucinating, since no elephant has ever flown to the moon. But how do we prevent these kinds of inappropriate questions and, more specifically, reduce hallucinations?
One possible method is the Determine Appropriate Response (DARE) prompt, which cleverly uses the LLM itself to decide whether it should answer a question based on its stated mission.
Let’s see how it works by creating a chatbot for a travel website with a slight twist.
chat_model = ChatModel.from_pretrained("chat-bison@002")
chat = chat_model.start_chat()
dare_prompt = """Remember that before you answer a question, you must check to see if it complies with your mission.
If not, you can say, Sorry I can't answer that question."""
print(
    chat.send_message(
        f"""
Hello! You are an AI chatbot for a travel web site.
Your mission is to provide helpful queries for travelers.
{dare_prompt}
"""
    )
)
MultiCandidateTextGenerationResponse(text=" Hello! I'm here to help you plan your next trip. Whether you're looking for a relaxing beach vacation or an adventurous city getaway, I can help you find the perfect destination and activities for your budget and interests.", _prediction_response=Prediction(predictions=[{'candidates': [{'author': '1', 'content': " Hello! I'm here to help you plan your next trip. Whether you're looking for a relaxing beach vacation or an adventurous city getaway, I can help you find the perfect destination and activities for your budget and interests."}], 'groundingMetadata': [{}], 'citationMetadata': [{'citations': []}], 'safetyAttributes': [{'blocked': False, 'scores': [0.1, 0.1, 0.1, 0.3], 'categories': ['Finance', 'Insult', 'Profanity', 'Sexual'], 'safetyRatings': [{'severityScore': 0.1, 'severity': 'NEGLIGIBLE', 'probabilityScore': 0.1, 'category': 'Dangerous Content'}, {'severityScore': 0.0, 'category': 'Harassment', 'probabilityScore': 0.1, 'severity': 'NEGLIGIBLE'}, {'probabilityScore': 0.0, 'severityScore': 0.0, 'category': 'Hate Speech', 'severity': 'NEGLIGIBLE'}, {'probabilityScore': 0.3, 'severity': 'NEGLIGIBLE', 'severityScore': 0.1, 'category': 'Sexually Explicit'}]}]}], deployed_model_id='', metadata={'tokenMetadata': {'outputTokenCount': {'totalTokens': 46.0, 'totalBillableCharacters': 186.0}, 'inputTokenCount': {'totalTokens': 64.0, 'totalBillableCharacters': 218.0}}}, model_version_id='', model_resource_name='', explanations=None), is_blocked=False, errors=(), safety_attributes={'Finance': 0.1, 'Insult': 0.1, 'Profanity': 0.1, 'Sexual': 0.3}, grounding_metadata=GroundingMetadata(citations=[], search_queries=[]), candidates=[ Hello! I'm here to help you plan your next trip. Whether you're looking for a relaxing beach vacation or an adventurous city getaway, I can help you find the perfect destination and activities for your budget and interests.])
Suppose we ask a simple question about one of Italy’s most famous tourist spots.
prompt = "What is the best place for sightseeing in Milan, Italy?"
print(chat.send_message(prompt))
MultiCandidateTextGenerationResponse(text=' There are many great places for sightseeing in Milan, Italy. Some of the most popular include:\n\n* The Duomo: This stunning Gothic cathedral is the largest in Italy and is a must-see for any visitor to Milan.\n* The Galleria Vittorio Emanuele II: This beautiful shopping arcade is home to some of the most luxurious shops in the world, as well as several cafes and restaurants.\n* The Sforza Castle: This historic castle was once the home of the ruling Sforza family and is now a popular tourist destination.\n* The Parco Sempione: This large park is a great place to relax and enjoy the outdoors. It is also home to several museums and art galleries.\n* The Navigli: This picturesque canal district is a great place to take a stroll and enjoy the scenery. There are also several bars and restaurants in the area.', _prediction_response=Prediction(predictions=[{'safetyAttributes': [{'blocked': False, 'categories': ['Insult', 'Sexual'], 'scores': [0.1, 0.2], 'safetyRatings': [{'category': 'Dangerous Content', 'probabilityScore': 0.1, 'severity': 'NEGLIGIBLE', 'severityScore': 0.1}, {'severity': 'NEGLIGIBLE', 'severityScore': 0.0, 'probabilityScore': 0.1, 'category': 'Harassment'}, {'category': 'Hate Speech', 'probabilityScore': 0.0, 'severityScore': 0.0, 'severity': 'NEGLIGIBLE'}, {'probabilityScore': 0.2, 'severityScore': 0.1, 'severity': 'NEGLIGIBLE', 'category': 'Sexually Explicit'}]}], 'citationMetadata': [{'citations': []}], 'candidates': [{'content': ' There are many great places for sightseeing in Milan, Italy. Some of the most popular include:\n\n* The Duomo: This stunning Gothic cathedral is the largest in Italy and is a must-see for any visitor to Milan.\n* The Galleria Vittorio Emanuele II: This beautiful shopping arcade is home to some of the most luxurious shops in the world, as well as several cafes and restaurants.\n* The Sforza Castle: This historic castle was once the home of the ruling Sforza family and is now a popular tourist destination.\n* The Parco Sempione: This large park is a great place to relax and enjoy the outdoors. It is also home to several museums and art galleries.\n* The Navigli: This picturesque canal district is a great place to take a stroll and enjoy the scenery. There are also several bars and restaurants in the area.', 'author': 'bot'}], 'groundingMetadata': [{}]}], deployed_model_id='', metadata={'tokenMetadata': {'inputTokenCount': {'totalTokens': 122.0, 'totalBillableCharacters': 450.0}, 'outputTokenCount': {'totalBillableCharacters': 663.0, 'totalTokens': 172.0}}}, model_version_id='', model_resource_name='', explanations=None), is_blocked=False, errors=(), safety_attributes={'Insult': 0.1, 'Sexual': 0.2}, grounding_metadata=GroundingMetadata(citations=[], search_queries=[]), candidates=[ There are many great places for sightseeing in Milan, Italy. Some of the most popular include:
* The Duomo: This stunning Gothic cathedral is the largest in Italy and is a must-see for any visitor to Milan.
* The Galleria Vittorio Emanuele II: This beautiful shopping arcade is home to some of the most luxurious shops in the world, as well as several cafes and restaurants.
* The Sforza Castle: This historic castle was once the home of the ruling Sforza family and is now a popular tourist destination.
* The Parco Sempione: This large park is a great place to relax and enjoy the outdoors. It is also home to several museums and art galleries.
* The Navigli: This picturesque canal district is a great place to take a stroll and enjoy the scenery. There are also several bars and restaurants in the area.])
Now let us pretend to be a not-so-nice user and ask the chatbot a question that is unrelated to travel.
prompt = "Who was the first elephant to visit the moon?"
print(chat.send_message(prompt))
MultiCandidateTextGenerationResponse(text=" Sorry, I can't answer that question. There have been no elephants on the moon.", _prediction_response=Prediction(predictions=[{'safetyAttributes': [{'blocked': False, 'scores': [0.1, 0.1, 0.3, 0.2, 0.1, 0.1, 0.1], 'safetyRatings': [{'severityScore': 0.0, 'probabilityScore': 0.0, 'category': 'Dangerous Content', 'severity': 'NEGLIGIBLE'}, {'severity': 'NEGLIGIBLE', 'category': 'Harassment', 'severityScore': 0.0, 'probabilityScore': 0.1}, {'probabilityScore': 0.1, 'category': 'Hate Speech', 'severityScore': 0.1, 'severity': 'NEGLIGIBLE'}, {'severityScore': 0.1, 'probabilityScore': 0.1, 'severity': 'NEGLIGIBLE', 'category': 'Sexually Explicit'}], 'categories': ['Death, Harm & Tragedy', 'Derogatory', 'Finance', 'Health', 'Insult', 'Legal', 'Sexual']}], 'citationMetadata': [{'citations': []}], 'groundingMetadata': [{}], 'candidates': [{'author': 'bot', 'content': " Sorry, I can't answer that question. There have been no elephants on the moon."}]}], deployed_model_id='', metadata={'tokenMetadata': {'outputTokenCount': {'totalTokens': 19.0, 'totalBillableCharacters': 65.0}, 'inputTokenCount': {'totalTokens': 304.0, 'totalBillableCharacters': 1150.0}}}, model_version_id='', model_resource_name='', explanations=None), is_blocked=False, errors=(), safety_attributes={'Death, Harm & Tragedy': 0.1, 'Derogatory': 0.1, 'Finance': 0.3, 'Health': 0.2, 'Insult': 0.1, 'Legal': 0.1, 'Sexual': 0.1}, grounding_metadata=GroundingMetadata(citations=[], search_queries=[]), candidates=[ Sorry, I can't answer that question. There have been no elephants on the moon.])
You can see that the DARE prompt added a layer of guard rails that prevented the chatbot from veering off course.
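The same pattern also works outside of a chat session. As a rough sketch (not part of the original notebook), you can prepend the mission statement and the DARE reminder to each prompt sent to the text model; the mission wording below is just an example.
# A minimal sketch of the DARE pattern with the text model, reusing dare_prompt from above.
# The mission text is illustrative; adapt it to your own use case.
mission = "You are an AI chatbot for a travel web site. Your mission is to provide helpful queries for travelers."
question = "Who was the first elephant to visit the moon?"
prompt = f"{mission}\n{dare_prompt}\nQuestion: {question}"
print(generation_model.predict(prompt=prompt, max_output_tokens=256).text)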
One way to reduce output variability is to turn a generative task into a classification task by asking the model to choose from a set of options. The prompt below results in an open-ended response, which is useful for brainstorming, but the response is highly variable.
prompt = "I'm a high school student. Recommend me a programming activity to improve my skills."
print(generation_model.predict(prompt=prompt, max_output_tokens=256).text)
* **Write a program to solve a problem you're interested in.** This could be anything from a game to a tool to help you with your studies. The important thing is that you're interested in the problem and that you're motivated to solve it.
* **Take a programming course.** There are many online and offline courses available, so you can find one that fits your schedule and learning style.
* **Join a programming community.** There are many online and offline communities where you can connect with other programmers and learn from each other.
* **Read programming books and articles.** There is a wealth of information available online and in libraries about programming. Take some time to read about different programming languages, techniques, and algorithms.
* **Practice, practice, practice!** The best way to improve your programming skills is to practice as much as you can. Write programs, solve problems, and learn from your mistakes.
The prompt below asks the model to choose from a fixed set of options, which may be useful if you want the output to be easier to control.
prompt = """I'm a high school student. Which of these activities do you suggest and why:
a) learn Python
b) learn Javascript
c) learn Fortran
"""
print(generation_model.predict(prompt=prompt, max_output_tokens=256).text)
I would suggest learning Python. Python is a general-purpose programming language that is easy to learn and has a wide range of applications. It is used in a variety of fields, including web development, data science, and machine learning. Python is also a popular language for beginners, as it has a large community of support and resources available.
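Because the model is choosing from a small set of options, the response is also easier to handle programmatically. The sketch below is illustrative only and not from the original notebook; it simply checks which of the listed languages appears in a fresh response to the prompt defined above.
# Illustrative sketch: a classification-style prompt makes the response easy to parse.
# The option labels mirror the prompt above; the parsing logic is a simple example.
options = {"a": "Python", "b": "Javascript", "c": "Fortran"}
response_text = generation_model.predict(prompt=prompt, max_output_tokens=256).text
chosen = [label for label, language in options.items() if language.lower() in response_text.lower()]
print("Option(s) mentioned in the response:", chosen if chosen else "none detected")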
Another way to improve response quality is to add examples to your prompt. The LLM learns in-context from the examples how to respond. Typically, one to five examples (shots) are enough to improve the quality of responses. Including too many examples can cause the model to overfit the data and reduce the quality of responses.
Just as in classical model training, the quality and distribution of the examples are very important. Pick examples that are representative of the scenarios that you need the model to learn, and keep the distribution of the examples (e.g. the number of examples per class in the case of classification) aligned with your actual distribution.
Below is an example of zero-shot prompting, where you don’t provide any examples to the LLM within the prompt itself.
prompt = """Decide whether a Tweet's sentiment is positive, neutral, or negative.
Tweet: I loved the new YouTube video you made!
Sentiment:
"""
print(generation_model.predict(prompt=prompt, max_output_tokens=256).text)
positive
Below is an example of one-shot prompting, where you provide one example to the LLM within the prompt to give some guidance on what type of response you want.
prompt = """Decide whether a Tweet's sentiment is positive, neutral, or negative.
Tweet: I loved the new YouTube video you made!
Sentiment: positive
Tweet: That was awful. Super boring 😠
Sentiment:
"""
print(generation_model.predict(prompt=prompt, max_output_tokens=256).text)
negative
Below is an example of few-shot prompting, where you provide a few examples to the LLM within the prompt to give some guidance on what type of response you want.
prompt = """Decide whether a Tweet's sentiment is positive, neutral, or negative.
Tweet: I loved the new YouTube video you made!
Sentiment: positive
Tweet: That was awful. Super boring 😠
Sentiment: negative
Tweet: Something surprised me about this video - it was actually original. It was not the same old recycled stuff that I always see. Watch it - you will not regret it.
Sentiment:
"""
print(generation_model.predict(prompt=prompt, max_output_tokens=256).text)
positive
Which prompt technique to use depends solely on your goal. Zero-shot prompts are more open-ended and can give you creative answers, while one-shot and few-shot prompts teach the model how to behave so you can get more predictable answers that are consistent with the examples provided.
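If you plan to reuse few-shot prompts with different examples, it can help to assemble the prompt programmatically. The sketch below is not part of the original notebook; the helper name and the final tweet are illustrative, and it reuses the generation_model defined earlier.
# Hypothetical helper (illustrative): build a few-shot sentiment prompt from labeled examples.
def build_few_shot_prompt(examples, new_tweet):
    lines = ["Decide whether a Tweet's sentiment is positive, neutral, or negative.", ""]
    for tweet, sentiment in examples:
        lines.append(f"Tweet: {tweet}")
        lines.append(f"Sentiment: {sentiment}")
        lines.append("")
    lines.append(f"Tweet: {new_tweet}")
    lines.append("Sentiment:")
    return "\n".join(lines)

examples = [
    ("I loved the new YouTube video you made!", "positive"),
    ("That was awful. Super boring 😠", "negative"),
]
prompt = build_few_shot_prompt(examples, "This video was surprisingly original - watch it, you will not regret it.")
print(generation_model.predict(prompt=prompt, max_output_tokens=256).text)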