
Creating a Siamese model using Trax: Ungraded Lecture Notebook

import trax
from trax import layers as tl
import trax.fastmath.numpy as np
import numpy

# Setting random seeds
numpy.random.seed(10)

L2 Normalization

Before building the model you will need to define a function that applies L2 normalization to a tensor. This is very important because in this week’s assignment you will create a custom loss function that expects the tensors it receives to be normalized. Luckily this is pretty straightforward:

def normalize(x):
    # Divide each row by its L2 norm so that every row has unit length
    return x / np.sqrt(np.sum(x * x, axis=-1, keepdims=True))

Notice that the denominator can be replaced by np.linalg.norm(x, axis=-1, keepdims=True) to achieve the same result. Notice also that Trax’s numpy (imported above as np) is being used within the function.
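To convince yourself the two forms agree, here is a minimal sanity check (the example values are hypothetical, not from the original notebook):

x = np.array([[3.0, 4.0], [1.0, 1.0]])

# Both expressions compute the per-row L2 norm, keeping the row axis
manual_norm = np.sqrt(np.sum(x * x, axis=-1, keepdims=True))
builtin_norm = np.linalg.norm(x, axis=-1, keepdims=True)

print(manual_norm)   # [[5.       ] [1.4142135]]
print(builtin_norm)  # [[5.       ] [1.4142135]]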

tensor = numpy.random.random((2,5))
print(f'The tensor is of type: {type(tensor)}\n\nAnd looks like this:\n\n {tensor}')
The tensor is of type: <class 'numpy.ndarray'>

And looks like this:

 [[0.77132064 0.02075195 0.63364823 0.74880388 0.49850701]
 [0.22479665 0.19806286 0.76053071 0.16911084 0.08833981]]
norm_tensor = normalize(tensor)
print(f'The normalized tensor is of type: {type(norm_tensor)}\n\nAnd looks like this:\n\n {norm_tensor}')
WARNING:absl:No GPU/TPU found, falling back to CPU. (Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)


The normalized tensor is of type: <class 'jaxlib.xla_extension.DeviceArray'>

And looks like this:

 [[0.57393795 0.01544148 0.4714962  0.55718327 0.37093794]
 [0.26781026 0.23596111 0.9060541  0.20146926 0.10524315]]

Notice that the initial tensor was converted from a numpy array to a JAX array in the process, since Trax’s numpy runs on the JAX backend.
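As a quick check that the normalization worked, every row of norm_tensor should now have an L2 norm of 1:

# Each row of the normalized tensor should have unit L2 norm
print(np.linalg.norm(norm_tensor, axis=-1))  # expected: values very close to [1. 1.]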

Siamese Model

To create a Siamese model you will first need to create an LSTM model using the Serial combinator layer, and then use another combinator layer called Parallel to create the Siamese model. You should be familiar with the following layers (see the Trax documentation for details): Serial, Embedding, LSTM, Mean, Fn, and Parallel.

Putting everything together the Siamese model will look like this:

vocab_size = 500
model_dimension = 128

# Define the LSTM model
LSTM = tl.Serial(
        tl.Embedding(vocab_size=vocab_size, d_feature=model_dimension),
        tl.LSTM(model_dimension),
        tl.Mean(axis=1),
        tl.Fn('Normalize', lambda x: normalize(x))
    )

# Use the Parallel combinator to create a Siamese model out of the LSTM 
Siamese = tl.Parallel(LSTM, LSTM)
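
To see the two branches in action, here is a minimal sketch of pushing a pair of dummy token batches through the model. The dummy data and the init call are assumptions based on the usual Trax workflow; the original notebook only inspects the layers:

from trax import shapes

# Hypothetical dummy data: two batches of token ids (batch_size=2, seq_length=10)
q1 = numpy.random.randint(0, vocab_size, (2, 10))
q2 = numpy.random.randint(0, vocab_size, (2, 10))

# Initialize the model's weights from the input signature, then run it
Siamese.init(shapes.signature((q1, q2)))
v1, v2 = Siamese((q1, q2))
print(v1.shape, v2.shape)  # expected: (2, 128) (2, 128)

Note that both branches reference the same LSTM object, so the two towers share weights, which is exactly what a Siamese architecture calls for.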

Next is a helper function that prints information for every sublayer of a combinator (it works for both Serial and Parallel):

def show_layers(model, layer_prefix):
    print(f"Total layers: {len(model.sublayers)}\n")
    for i in range(len(model.sublayers)):
        print('========')
        print(f'{layer_prefix}_{i}: {model.sublayers[i]}\n')

print('Siamese model:\n')
show_layers(Siamese, 'Parallel.sublayers')

print('Detail of LSTM models:\n')
show_layers(LSTM, 'Serial.sublayers')
Siamese model:

Total layers: 2

========
Parallel.sublayers_0: Serial[
  Embedding_500_128
  LSTM_128
  Mean
  Normalize
]

========
Parallel.sublayers_1: Serial[
  Embedding_500_128
  LSTM_128
  Mean
  Normalize
]

Detail of LSTM models:

Total layers: 4

========
Serial.sublayers_0: Embedding_500_128

========
Serial.sublayers_1: LSTM_128

========
Serial.sublayers_2: Mean

========
Serial.sublayers_3: Normalize

Try changing the parameters defined before the Siamese model (vocab_size and model_dimension) and see how the layer summary changes!
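For instance, a hypothetical smaller configuration changes the names shown by show_layers:

# Hypothetical alternative configuration with a smaller vocabulary and dimension
small_LSTM = tl.Serial(
    tl.Embedding(vocab_size=100, d_feature=16),
    tl.LSTM(16),
    tl.Mean(axis=1),
    tl.Fn('Normalize', lambda x: normalize(x))
)

show_layers(small_LSTM, 'Serial.sublayers')  # prints Embedding_100_16, LSTM_16, Mean, Normalize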

You will actually train this model in this week’s assignment. For now you should be more familiar with creating Siamese models using Trax. Keep it up!