Custom loss functions with TensorFlow, an example:

```
import tensorflow as tf
import keras.backend as K

def test_loss(y_true, y_pred):
    # if the last dimension is too small, skip the computation
    if y_pred.shape[2] < 10:
        return 0.0
    else:
        ris = tf.constant(0.0)
        dim = K.shape(y_pred)
        bce = tf.keras.losses.BinaryCrossentropy(from_logits=False, label_smoothing=0, reduction="auto")
        # accumulate the BCE over the second dimension of the first sample
        for i in range(y_pred.shape[1]):
            aaa = K.reshape(y_true[0][i], (dim[2] * 1, 1))
            bbb = K.reshape(y_pred[0][i], (dim[2] * 1, 1))
            ris = tf.add(ris, bce(aaa, bbb))
        return ris / (y_pred.shape[1] + 1)
```

The code above is an example of an (advanced) custom loss built in TensorFlow/Keras. Let's analyze it together to learn how to build one from scratch. First of all, we have to follow the standard signature: the function must accept exactly two arguments, y_true and y_pred, which are respectively the ground-truth label tensor and the model output tensor.

Here’s a naive example:

```
import keras.backend as K

def custom_loss(y_true, y_pred):
    # element-wise average of the two tensors, reduced to a single scalar
    return K.mean((y_true + y_pred) / 2)
```
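
To see what this naive loss actually computes, we can evaluate it eagerly on a couple of toy tensors (the values here are made up purely for illustration):

```python
import tensorflow as tf
from tensorflow.keras import backend as K

def custom_loss(y_true, y_pred):
    # element-wise average of the two tensors, reduced to a scalar
    return K.mean((y_true + y_pred) / 2)

# hypothetical label and prediction tensors
y_true = tf.constant([[1.0, 0.0, 1.0]])
y_pred = tf.constant([[0.8, 0.2, 0.6]])

print(float(custom_loss(y_true, y_pred)))  # ≈ 0.6
```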

The above loss function computes the mean of the two input tensors. Normally, however, we are interested in doing more complex operations in our custom loss, such as the following:

```
import tensorflow as tf

# define the built-in losses we are going to reuse in our custom functions
huber = tf.keras.losses.Huber()
bce = tf.keras.losses.BinaryCrossentropy(from_logits=False)

# custom function
def custom_loss_01(y_true, y_pred):
    return bce(y_true, y_pred) / 2

# another custom loss function
def custom_loss_function(y_true, y_pred):
    squared_difference = tf.square(y_true - y_pred)
    return tf.reduce_mean(squared_difference, axis=-1)
```

The above script shows 2 examples of custom losses:

- the first returns half the standard BCE loss
- the second returns the mean of the squared differences along the last axis, i.e. the per-sample mean squared error
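
Note that the second function is just a hand-rolled version of the functional MSE that Keras already ships; a quick eager check on made-up values confirms they agree:

```python
import tensorflow as tf

def custom_loss_function(y_true, y_pred):
    squared_difference = tf.square(y_true - y_pred)
    return tf.reduce_mean(squared_difference, axis=-1)

# hypothetical targets and predictions, a batch of 2 samples
y_true = tf.constant([[1.0, 2.0], [3.0, 4.0]])
y_pred = tf.constant([[1.5, 2.0], [3.0, 3.0]])

ours = custom_loss_function(y_true, y_pred)
builtin = tf.keras.losses.mean_squared_error(y_true, y_pred)
print(ours.numpy(), builtin.numpy())  # per-sample losses, identical
```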

So basically we can build any kind of function as our loss. These were just two examples to show some possible operations. Many more operations are available; see the full list of tf.* operations in the official documentation (link).
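
Once the function is defined, it plugs into training exactly like a built-in loss: you pass the function object to model.compile. The tiny model and random data below are just placeholders to show the wiring:

```python
import numpy as np
import tensorflow as tf

def custom_loss_function(y_true, y_pred):
    return tf.reduce_mean(tf.square(y_true - y_pred), axis=-1)

# toy model, purely illustrative
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1),
])
# the custom loss is passed like any built-in one
model.compile(optimizer="adam", loss=custom_loss_function)

# random placeholder data
x = np.random.rand(32, 4).astype("float32")
y = np.random.rand(32, 1).astype("float32")
history = model.fit(x, y, epochs=1, verbose=0)
print(history.history["loss"])
```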

We're gonna see some more complex examples this weekend; I'm gonna link the Colab notebook in the red box below. 👋

Still working on it…