
Python TensorFlow custom loss function

Python TensorFlow Building and Training a Simple Model: Exercise-8 with Solution

Write a Python program that creates a custom loss function using TensorFlow that penalizes errors differently for positive and negative examples.

To create a custom loss function in TensorFlow that penalizes errors differently for positive and negative examples, we define a function that applies separate weights to the positive and negative terms of the binary cross-entropy loss.
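Written out, for a batch of N examples the solution below computes the weighted binary cross-entropy

L = -\frac{1}{N} \sum_{i=1}^{N} \left[ w_{+}\, y_i \log(\hat{y}_i) + w_{-}\, (1 - y_i) \log(1 - \hat{y}_i) \right]

where w_{+} and w_{-} are the weights for positive and negative examples, y_i is the true label and \hat{y}_i is the predicted probability for example i.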

Sample Solution:

Python Code:

import tensorflow as tf

# Custom loss function definition
def custom_loss(y_true, y_pred):
    # Define custom weights for positive and negative examples
    positive_weight = 3.0  # Weight applied to errors on positive examples (y_true = 1)
    negative_weight = 2.0  # Weight applied to errors on negative examples (y_true = 0)

    # Clip predictions away from exactly 0 and 1 so tf.math.log() stays finite
    y_pred = tf.clip_by_value(y_pred, 1e-7, 1.0 - 1e-7)

    # Calculate the weighted binary cross-entropy loss element-wise
    loss = -(positive_weight * y_true * tf.math.log(y_pred) +
             negative_weight * (1.0 - y_true) * tf.math.log(1.0 - y_pred))

    # Calculate the mean loss over the batch
    loss = tf.reduce_mean(loss)

    return loss

# Simulated ground truth and predicted values (for demonstration)
y_true = tf.constant([1.0, 0.0, 1.0, 0.0], dtype=tf.float32)
y_pred = tf.constant([0.8, 0.2, 0.7, 0.3], dtype=tf.float32)

# Calculate the custom loss using the defined function
loss = custom_loss(y_true, y_pred)

# Print the custom loss value
print("Custom Loss:", loss.numpy())

Output:

Custom Loss: 0.72477317
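As a quick sanity check, the same value can be reproduced in plain Python by applying the weighted formula element by element and averaging:

import math

y_true = [1.0, 0.0, 1.0, 0.0]
y_pred = [0.8, 0.2, 0.7, 0.3]

# Weighted binary cross-entropy, computed element by element
terms = [-(3.0 * t * math.log(p) + 2.0 * (1 - t) * math.log(1 - p))
         for t, p in zip(y_true, y_pred)]

print(sum(terms) / len(terms))  # ~0.724773, matching the TensorFlow result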

Explanation:

In the exercise above -

  • Import the TensorFlow module.
  • Define a custom loss function "custom_loss()" that takes two arguments: 'y_true' (ground-truth labels) and 'y_pred' (predicted probabilities).
  • Inside "custom_loss()", specify different weights for positive and negative examples ('positive_weight' and 'negative_weight').
  • Clip the predictions away from exactly 0 and 1 with 'tf.clip_by_value' so the logarithm never becomes infinite.
  • Calculate the weighted binary cross-entropy loss by applying the two weights to the positive and negative terms. The formula is applied element-wise to each pair of true labels ('y_true') and predicted probabilities ('y_pred').
  • Calculate the mean loss over the batch using 'tf.reduce_mean'.
  • Finally, return the calculated loss.
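Because "custom_loss()" follows the standard (y_true, y_pred) signature, it can be passed straight to 'model.compile()' like any built-in Keras loss. The tiny model below is only an illustrative sketch; the layer sizes and the 10-feature input shape are arbitrary choices, not part of the exercise:

# Hypothetical binary classifier; architecture chosen only for illustration
model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),
    tf.keras.layers.Dense(16, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid')
])

# Keras accepts any callable with the (y_true, y_pred) signature as a loss
model.compile(optimizer='adam', loss=custom_loss, metrics=['accuracy'])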



Previous: Implementing a categorical cross-entropy loss function in TensorFlow.
Next: Custom loss function in TensorFlow for positive and negative examples.
