政安晨: [Keras Machine Learning Examples] (14) — Zero-DCE for Low-Light Image Enhancement


Contents

Introduction
Download the LOL dataset
Create the TensorFlow dataset
The Zero-DCE framework
Understanding light-enhancement curves
DCE-Net
Loss functions
Color constancy loss
Exposure loss
Illumination smoothness loss
Spatial consistency loss
Deep curve estimation model
Training
Inference
Inference on test images



Goal of this post: implement Zero-Reference Deep Curve Estimation for low-light image enhancement.

Introduction


Zero-Reference Deep Curve Estimation (Zero-DCE) formulates low-light image enhancement as the task of estimating an image-specific tonal curve with a deep neural network.

In this example, we train a lightweight deep network, DCE-Net, to estimate pixel-wise and high-order tonal curves for dynamic range adjustment of a given image.

Zero-DCE takes a low-light image as input and produces high-order tonal curves as its output. These curves are then used for pixel-wise adjustment on the dynamic range of the input to obtain an enhanced image. The curve estimation process is done in such a way that it maintains the range of the enhanced image and preserves the contrast of neighboring pixels. This curve estimation is inspired by the curves adjustment used in photo-editing software such as Adobe Photoshop, where users can adjust points on an image's tonal range.

Zero-DCE is appealing because of its relaxed assumptions about reference images: it does not require any paired input/output images during training. This is achieved through a set of carefully formulated non-reference loss functions, which implicitly measure the enhancement quality and guide the training of the network.

Download the LOL dataset


The LoL dataset was created for low-light image enhancement. It provides 485 images for training and 15 for testing. Each image pair in the dataset consists of a low-light input image and its corresponding well-exposed reference image.

import os

os.environ["KERAS_BACKEND"] = "tensorflow"

import random
import numpy as np
from glob import glob
from PIL import Image, ImageOps
import matplotlib.pyplot as plt
import keras
from keras import layers
import tensorflow as tf

!wget https://huggingface.co/datasets/geekyrakshit/LoL-Dataset/resolve/main/lol_dataset.zip
!unzip -q lol_dataset.zip && rm lol_dataset.zip

Output:

--2023-11-20 20:01:50--  https://huggingface.co/datasets/geekyrakshit/LoL-Dataset/resolve/main/lol_dataset.zip
Resolving huggingface.co (huggingface.co)... 3.163.189.74, 3.163.189.90, 3.163.189.114, ...
Connecting to huggingface.co (huggingface.co)|3.163.189.74|:443... connected.
HTTP request sent, awaiting response... 302 Found
Location: https://cdn-lfs.huggingface.co/repos/d9/09/d909ef7668bb417b7065a311bd55a3084cc83a1f918e13cb41c5503328432db2/... [following]
--2023-11-20 20:01:50--  https://cdn-lfs.huggingface.co/repos/d9/09/d909ef7668bb417b7065a311bd55a3084cc83a1f918e13cb41c5503328432db2/...
Resolving cdn-lfs.huggingface.co (cdn-lfs.huggingface.co)... 108.138.94.122, 108.138.94.25, 108.138.94.14, ...
Connecting to cdn-lfs.huggingface.co (cdn-lfs.huggingface.co)|108.138.94.122|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 347171015 (331M) [application/zip]
Saving to: ‘lol_dataset.zip’

lol_dataset.zip     100%[===================>] 331.09M  37.4MB/s    in 9.5s

2023-11-20 20:02:00 (34.9 MB/s) - ‘lol_dataset.zip’ saved [347171015/347171015]

Create the TensorFlow dataset

We use the first 400 low-light images from the LoL training split for training, and the remaining 85 low-light images for validation (selected via MAX_TRAIN_IMAGES below). We resize the images to 256 x 256 for both training and validation. Note that we do not need the corresponding enhanced images for training DCE-Net.

IMAGE_SIZE = 256
BATCH_SIZE = 16
MAX_TRAIN_IMAGES = 400


def load_data(image_path):
    image = tf.io.read_file(image_path)
    image = tf.image.decode_png(image, channels=3)
    image = tf.image.resize(images=image, size=[IMAGE_SIZE, IMAGE_SIZE])
    image = image / 255.0
    return image


def data_generator(low_light_images):
    dataset = tf.data.Dataset.from_tensor_slices((low_light_images))
    dataset = dataset.map(load_data, num_parallel_calls=tf.data.AUTOTUNE)
    dataset = dataset.batch(BATCH_SIZE, drop_remainder=True)
    return dataset


train_low_light_images = sorted(glob("./lol_dataset/our485/low/*"))[:MAX_TRAIN_IMAGES]
val_low_light_images = sorted(glob("./lol_dataset/our485/low/*"))[MAX_TRAIN_IMAGES:]
test_low_light_images = sorted(glob("./lol_dataset/eval15/low/*"))

train_dataset = data_generator(train_low_light_images)
val_dataset = data_generator(val_low_light_images)

print("Train Dataset:", train_dataset)
print("Validation Dataset:", val_dataset)

Output:

Train Dataset: <_BatchDataset element_spec=TensorSpec(shape=(16, 256, 256, 3), dtype=tf.float32, name=None)>
Validation Dataset: <_BatchDataset element_spec=TensorSpec(shape=(16, 256, 256, 3), dtype=tf.float32, name=None)>

The Zero-DCE framework


The goal of DCE-Net is to estimate a set of best-fitting light-enhancement curves (LE-curves) for a given input image. The framework then maps all pixels of the input's RGB channels by applying the curves iteratively to obtain the final enhanced image.

Understanding light-enhancement curves


A light-enhancement curve is a kind of curve that can map a low-light image to its enhanced version automatically, with self-adaptive curve parameters that depend solely on the input image. Three objectives should be considered when designing such a curve:

- Each pixel value of the enhanced image should fall in the normalized range [0, 1], to avoid information loss caused by overflow truncation.
- It should be monotonic, to preserve the contrast between neighboring pixels.
- The shape of the curve should be as simple as possible, and the curve should be differentiable so that backpropagation is possible.

The light-enhancement curve is applied separately to the three RGB channels instead of solely to the illumination channel.

The three-channel adjustment can better preserve the inherent color and reduce the risk of over-saturation. A small sketch of the curve follows.
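To make the curve concrete, here is a minimal sketch (not part of the original example) of the quadratic LE-curve, written in the same form as the get_enhanced_image() method later in this post, x ← x + r·(x² − x). The function name, the single constant parameter r, and the iteration count used here are purely illustrative; Zero-DCE actually predicts one parameter map per pixel, per channel, for each of its 8 iterations.

# Minimal sketch (assumption: one constant curve parameter per iteration).
# Negative r brightens, positive r darkens; values stay inside [0, 1].
def toy_le_curve(pixel, r=-0.4, iterations=8):
    x = pixel
    for _ in range(iterations):
        x = x + r * (x**2 - x)  # quadratic light-enhancement curve
    return x

# Dark pixel values are progressively lifted toward brighter values.
print([round(toy_le_curve(v), 3) for v in (0.05, 0.2, 0.5)])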

    DCE-Net


DCE-Net is a lightweight deep neural network that learns the mapping between an input image and its best-fitting curve parameter maps. The input to DCE-Net is a low-light image, and the outputs are a set of pixel-wise curve parameter maps for the corresponding higher-order curves.

It is a plain CNN of seven convolutional layers with symmetrical concatenation (skip connections). Each layer consists of 32 convolutional kernels of size 3×3 and stride 1, followed by a ReLU activation. The last convolutional layer is followed by a Tanh activation, which produces 24 parameter maps for 8 iterations, where each iteration requires three curve parameter maps for the three channels.

def build_dce_net():
    input_img = keras.Input(shape=[None, None, 3])
    conv1 = layers.Conv2D(
        32, (3, 3), strides=(1, 1), activation="relu", padding="same"
    )(input_img)
    conv2 = layers.Conv2D(
        32, (3, 3), strides=(1, 1), activation="relu", padding="same"
    )(conv1)
    conv3 = layers.Conv2D(
        32, (3, 3), strides=(1, 1), activation="relu", padding="same"
    )(conv2)
    conv4 = layers.Conv2D(
        32, (3, 3), strides=(1, 1), activation="relu", padding="same"
    )(conv3)
    int_con1 = layers.Concatenate(axis=-1)([conv4, conv3])
    conv5 = layers.Conv2D(
        32, (3, 3), strides=(1, 1), activation="relu", padding="same"
    )(int_con1)
    int_con2 = layers.Concatenate(axis=-1)([conv5, conv2])
    conv6 = layers.Conv2D(
        32, (3, 3), strides=(1, 1), activation="relu", padding="same"
    )(int_con2)
    int_con3 = layers.Concatenate(axis=-1)([conv6, conv1])
    x_r = layers.Conv2D(24, (3, 3), strides=(1, 1), activation="tanh", padding="same")(
        int_con3
    )
    return keras.Model(inputs=input_img, outputs=x_r)
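As a quick sanity check (a sketch on top of the original example, not part of it), the network can be built and probed with a dummy batch to confirm that it outputs 24 curve-parameter maps, i.e. 8 iterations × 3 channels. The variable names below are arbitrary.

# Hedged sanity check: same spatial size in and out, 24 output channels.
dce_net = build_dce_net()
dummy_batch = tf.zeros((1, 256, 256, 3))
print(dce_net(dummy_batch).shape)  # expected: (1, 256, 256, 24)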

Loss functions


To enable zero-reference learning in DCE-Net, we use a set of differentiable zero-reference losses that allow us to evaluate the quality of enhanced images.

Color constancy loss


The color constancy loss is used to correct potential color deviations in the enhanced image.

def color_constancy_loss(x):
    mean_rgb = tf.reduce_mean(x, axis=(1, 2), keepdims=True)
    mr, mg, mb = (
        mean_rgb[:, :, :, 0],
        mean_rgb[:, :, :, 1],
        mean_rgb[:, :, :, 2],
    )
    d_rg = tf.square(mr - mg)
    d_rb = tf.square(mr - mb)
    d_gb = tf.square(mb - mg)
    return tf.sqrt(tf.square(d_rg) + tf.square(d_rb) + tf.square(d_gb))
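To see what this loss measures, here is a small hedged check (not in the original example) on synthetic tensors: a neutral gray image, whose per-channel means are identical, incurs zero loss, while an image with a strong red cast does not.

# Hedged sanity check: equal channel means -> zero loss; a color cast -> positive loss.
gray = tf.ones((1, 64, 64, 3)) * 0.5
red_cast = tf.concat(
    [tf.ones((1, 64, 64, 1)) * 0.8, tf.ones((1, 64, 64, 2)) * 0.4], axis=-1
)
print(float(tf.reduce_mean(color_constancy_loss(gray))))      # 0.0
print(float(tf.reduce_mean(color_constancy_loss(red_cast))))  # > 0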

Exposure loss


To restrain under-/over-exposed regions, we use the exposure control loss. It measures the distance between the average intensity value of a local region and a preset well-exposedness level (set to 0.6).

def exposure_loss(x, mean_val=0.6):
    x = tf.reduce_mean(x, axis=3, keepdims=True)
    mean = tf.nn.avg_pool2d(x, ksize=16, strides=16, padding="VALID")
    return tf.reduce_mean(tf.square(mean - mean_val))
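As a hedged sanity check (not part of the original example), an image whose local mean intensity already sits at the well-exposedness level of 0.6 produces zero exposure loss, while a uniformly dark image is penalized. The synthetic tensors below are only for illustration.

# Hedged sanity check on uniform synthetic images.
well_exposed = tf.ones((1, 256, 256, 3)) * 0.6
too_dark = tf.ones((1, 256, 256, 3)) * 0.1
print(float(exposure_loss(well_exposed)))  # 0.0
print(float(exposure_loss(too_dark)))      # (0.1 - 0.6)^2 = 0.25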

Illumination smoothness loss


To preserve the monotonicity relations between neighboring pixels, the illumination smoothness loss is added to each curve parameter map.

def illumination_smoothness_loss(x):
    batch_size = tf.shape(x)[0]
    h_x = tf.shape(x)[1]
    w_x = tf.shape(x)[2]
    count_h = (tf.shape(x)[2] - 1) * tf.shape(x)[3]
    count_w = tf.shape(x)[2] * (tf.shape(x)[3] - 1)
    h_tv = tf.reduce_sum(tf.square((x[:, 1:, :, :] - x[:, : h_x - 1, :, :])))
    w_tv = tf.reduce_sum(tf.square((x[:, :, 1:, :] - x[:, :, : w_x - 1, :])))
    batch_size = tf.cast(batch_size, dtype=tf.float32)
    count_h = tf.cast(count_h, dtype=tf.float32)
    count_w = tf.cast(count_w, dtype=tf.float32)
    return 2 * (h_tv / count_h + w_tv / count_w) / batch_size
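A hedged sanity check (not from the original example): a spatially constant parameter map has no variation between neighboring positions, so its smoothness loss is zero, whereas random maps are penalized. The tensor shapes below are arbitrary illustrative choices.

# Hedged sanity check: constant curve-parameter maps -> zero total variation -> zero loss.
flat_maps = tf.ones((1, 64, 64, 24)) * 0.3
noisy_maps = tf.random.uniform((1, 64, 64, 24))
print(float(illumination_smoothness_loss(flat_maps)))   # 0.0
print(float(illumination_smoothness_loss(noisy_maps)))  # > 0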

Spatial consistency loss


The spatial consistency loss encourages spatial coherence of the enhanced image by preserving the contrast between neighboring regions across the input image and its enhanced version.

class SpatialConsistencyLoss(keras.losses.Loss):
    def __init__(self, **kwargs):
        super().__init__(reduction="none")

        self.left_kernel = tf.constant(
            [[[[0, 0, 0]], [[-1, 1, 0]], [[0, 0, 0]]]], dtype=tf.float32
        )
        self.right_kernel = tf.constant(
            [[[[0, 0, 0]], [[0, 1, -1]], [[0, 0, 0]]]], dtype=tf.float32
        )
        self.up_kernel = tf.constant(
            [[[[0, -1, 0]], [[0, 1, 0]], [[0, 0, 0]]]], dtype=tf.float32
        )
        self.down_kernel = tf.constant(
            [[[[0, 0, 0]], [[0, 1, 0]], [[0, -1, 0]]]], dtype=tf.float32
        )

    def call(self, y_true, y_pred):
        original_mean = tf.reduce_mean(y_true, 3, keepdims=True)
        enhanced_mean = tf.reduce_mean(y_pred, 3, keepdims=True)
        original_pool = tf.nn.avg_pool2d(
            original_mean, ksize=4, strides=4, padding="VALID"
        )
        enhanced_pool = tf.nn.avg_pool2d(
            enhanced_mean, ksize=4, strides=4, padding="VALID"
        )

        d_original_left = tf.nn.conv2d(
            original_pool,
            self.left_kernel,
            strides=[1, 1, 1, 1],
            padding="SAME",
        )
        d_original_right = tf.nn.conv2d(
            original_pool,
            self.right_kernel,
            strides=[1, 1, 1, 1],
            padding="SAME",
        )
        d_original_up = tf.nn.conv2d(
            original_pool, self.up_kernel, strides=[1, 1, 1, 1], padding="SAME"
        )
        d_original_down = tf.nn.conv2d(
            original_pool,
            self.down_kernel,
            strides=[1, 1, 1, 1],
            padding="SAME",
        )

        d_enhanced_left = tf.nn.conv2d(
            enhanced_pool,
            self.left_kernel,
            strides=[1, 1, 1, 1],
            padding="SAME",
        )
        d_enhanced_right = tf.nn.conv2d(
            enhanced_pool,
            self.right_kernel,
            strides=[1, 1, 1, 1],
            padding="SAME",
        )
        d_enhanced_up = tf.nn.conv2d(
            enhanced_pool, self.up_kernel, strides=[1, 1, 1, 1], padding="SAME"
        )
        d_enhanced_down = tf.nn.conv2d(
            enhanced_pool,
            self.down_kernel,
            strides=[1, 1, 1, 1],
            padding="SAME",
        )

        d_left = tf.square(d_original_left - d_enhanced_left)
        d_right = tf.square(d_original_right - d_enhanced_right)
        d_up = tf.square(d_original_up - d_enhanced_up)
        d_down = tf.square(d_original_down - d_enhanced_down)
        return d_left + d_right + d_up + d_down
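Because the loss is constructed with reduction="none", calling it returns a per-region difference map rather than a scalar, which is why compute_losses() later wraps it in tf.reduce_mean(). A hedged shape check with synthetic tensors (not part of the original example, and the exact output rank depends on the Keras loss machinery):

# Hedged shape check: differences are computed on 4x4-pooled regions, so a
# 256x256 input should yield roughly a (1, 64, 64, 1) map instead of a scalar.
scl = SpatialConsistencyLoss(reduction="none")
y_true = tf.random.uniform((1, 256, 256, 3))
y_pred = tf.random.uniform((1, 256, 256, 3))
print(scl(y_true, y_pred).shape)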

Deep curve estimation model


We implement the Zero-DCE framework as a Keras subclassed model.

class ZeroDCE(keras.Model):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.dce_model = build_dce_net()

    def compile(self, learning_rate, **kwargs):
        super().compile(**kwargs)
        self.optimizer = keras.optimizers.Adam(learning_rate=learning_rate)
        self.spatial_constancy_loss = SpatialConsistencyLoss(reduction="none")
        self.total_loss_tracker = keras.metrics.Mean(name="total_loss")
        self.illumination_smoothness_loss_tracker = keras.metrics.Mean(
            name="illumination_smoothness_loss"
        )
        self.spatial_constancy_loss_tracker = keras.metrics.Mean(
            name="spatial_constancy_loss"
        )
        self.color_constancy_loss_tracker = keras.metrics.Mean(
            name="color_constancy_loss"
        )
        self.exposure_loss_tracker = keras.metrics.Mean(name="exposure_loss")

    @property
    def metrics(self):
        return [
            self.total_loss_tracker,
            self.illumination_smoothness_loss_tracker,
            self.spatial_constancy_loss_tracker,
            self.color_constancy_loss_tracker,
            self.exposure_loss_tracker,
        ]

    def get_enhanced_image(self, data, output):
        r1 = output[:, :, :, :3]
        r2 = output[:, :, :, 3:6]
        r3 = output[:, :, :, 6:9]
        r4 = output[:, :, :, 9:12]
        r5 = output[:, :, :, 12:15]
        r6 = output[:, :, :, 15:18]
        r7 = output[:, :, :, 18:21]
        r8 = output[:, :, :, 21:24]
        x = data + r1 * (tf.square(data) - data)
        x = x + r2 * (tf.square(x) - x)
        x = x + r3 * (tf.square(x) - x)
        enhanced_image = x + r4 * (tf.square(x) - x)
        x = enhanced_image + r5 * (tf.square(enhanced_image) - enhanced_image)
        x = x + r6 * (tf.square(x) - x)
        x = x + r7 * (tf.square(x) - x)
        enhanced_image = x + r8 * (tf.square(x) - x)
        return enhanced_image

    def call(self, data):
        dce_net_output = self.dce_model(data)
        return self.get_enhanced_image(data, dce_net_output)

    def compute_losses(self, data, output):
        enhanced_image = self.get_enhanced_image(data, output)
        loss_illumination = 200 * illumination_smoothness_loss(output)
        loss_spatial_constancy = tf.reduce_mean(
            self.spatial_constancy_loss(enhanced_image, data)
        )
        loss_color_constancy = 5 * tf.reduce_mean(color_constancy_loss(enhanced_image))
        loss_exposure = 10 * tf.reduce_mean(exposure_loss(enhanced_image))
        total_loss = (
            loss_illumination
            + loss_spatial_constancy
            + loss_color_constancy
            + loss_exposure
        )

        return {
            "total_loss": total_loss,
            "illumination_smoothness_loss": loss_illumination,
            "spatial_constancy_loss": loss_spatial_constancy,
            "color_constancy_loss": loss_color_constancy,
            "exposure_loss": loss_exposure,
        }

    def train_step(self, data):
        with tf.GradientTape() as tape:
            output = self.dce_model(data)
            losses = self.compute_losses(data, output)

        gradients = tape.gradient(
            losses["total_loss"], self.dce_model.trainable_weights
        )
        self.optimizer.apply_gradients(zip(gradients, self.dce_model.trainable_weights))

        self.total_loss_tracker.update_state(losses["total_loss"])
        self.illumination_smoothness_loss_tracker.update_state(
            losses["illumination_smoothness_loss"]
        )
        self.spatial_constancy_loss_tracker.update_state(
            losses["spatial_constancy_loss"]
        )
        self.color_constancy_loss_tracker.update_state(losses["color_constancy_loss"])
        self.exposure_loss_tracker.update_state(losses["exposure_loss"])

        return {metric.name: metric.result() for metric in self.metrics}

    def test_step(self, data):
        output = self.dce_model(data)
        losses = self.compute_losses(data, output)

        self.total_loss_tracker.update_state(losses["total_loss"])
        self.illumination_smoothness_loss_tracker.update_state(
            losses["illumination_smoothness_loss"]
        )
        self.spatial_constancy_loss_tracker.update_state(
            losses["spatial_constancy_loss"]
        )
        self.color_constancy_loss_tracker.update_state(losses["color_constancy_loss"])
        self.exposure_loss_tracker.update_state(losses["exposure_loss"])

        return {metric.name: metric.result() for metric in self.metrics}

    def save_weights(self, filepath, overwrite=True, save_format=None, options=None):
        """While saving the weights, we simply save the weights of the DCE-Net."""
        self.dce_model.save_weights(
            filepath,
            overwrite=overwrite,
            save_format=save_format,
            options=options,
        )

    def load_weights(self, filepath, by_name=False, skip_mismatch=False, options=None):
        """While loading the weights, we simply load the weights of the DCE-Net."""
        self.dce_model.load_weights(
            filepath=filepath,
            by_name=by_name,
            skip_mismatch=skip_mismatch,
            options=options,
        )

Training

zero_dce_model = ZeroDCE()
zero_dce_model.compile(learning_rate=1e-4)
history = zero_dce_model.fit(train_dataset, validation_data=val_dataset, epochs=100)


def plot_result(item):
    plt.plot(history.history[item], label=item)
    plt.plot(history.history["val_" + item], label="val_" + item)
    plt.xlabel("Epochs")
    plt.ylabel(item)
    plt.title("Train and Validation {} Over Epochs".format(item), fontsize=14)
    plt.legend()
    plt.grid()
    plt.show()


plot_result("total_loss")
plot_result("illumination_smoothness_loss")
plot_result("spatial_constancy_loss")
plot_result("color_constancy_loss")
plot_result("exposure_loss")

Output:

    1. Epoch 1/100
    2. 2/25 ━[37m━━━━━━━━━━━━━━━━━━━ 1s 85ms/step - color_constancy_loss: 0.0013 - exposure_loss: 3.0376 - illumination_smoothness_loss: 2.5211 - spatial_constancy_loss: 4.6834e-07 - total_loss: 5.5601
    3. WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
    4. I0000 00:00:1700510538.106578 3409375 device_compiler.h:187] Compiled cluster using XLA! This line is logged at most once for the lifetime of the process.
    5. 25/25 ━━━━━━━━━━━━━━━━━━━━ 16s 123ms/step - color_constancy_loss: 0.0029 - exposure_loss: 2.9968 - illumination_smoothness_loss: 2.1813 - spatial_constancy_loss: 1.8559e-06 - total_loss: 5.1810 - val_color_constancy_loss: 0.0023 - val_exposure_loss: 2.9489 - val_illumination_smoothness_loss: 2.7063 - val_spatial_constancy_loss: 5.0979e-06 - val_total_loss: 5.6575
    6. Epoch 2/100
    7. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0030 - exposure_loss: 2.9854 - illumination_smoothness_loss: 1.2876 - spatial_constancy_loss: 6.1811e-06 - total_loss: 4.2759 - val_color_constancy_loss: 0.0023 - val_exposure_loss: 2.9381 - val_illumination_smoothness_loss: 1.8299 - val_spatial_constancy_loss: 1.3742e-05 - val_total_loss: 4.7703
    8. Epoch 3/100
    9. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0031 - exposure_loss: 2.9746 - illumination_smoothness_loss: 0.8735 - spatial_constancy_loss: 1.6664e-05 - total_loss: 3.8512 - val_color_constancy_loss: 0.0024 - val_exposure_loss: 2.9255 - val_illumination_smoothness_loss: 1.3135 - val_spatial_constancy_loss: 3.1783e-05 - val_total_loss: 4.2414
    10. Epoch 4/100
    11. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0032 - exposure_loss: 2.9623 - illumination_smoothness_loss: 0.6259 - spatial_constancy_loss: 3.7938e-05 - total_loss: 3.5914 - val_color_constancy_loss: 0.0025 - val_exposure_loss: 2.9118 - val_illumination_smoothness_loss: 0.9835 - val_spatial_constancy_loss: 6.1902e-05 - val_total_loss: 3.8979
    12. Epoch 5/100
    13. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0033 - exposure_loss: 2.9493 - illumination_smoothness_loss: 0.4700 - spatial_constancy_loss: 7.2080e-05 - total_loss: 3.4226 - val_color_constancy_loss: 0.0026 - val_exposure_loss: 2.8976 - val_illumination_smoothness_loss: 0.7751 - val_spatial_constancy_loss: 1.0500e-04 - val_total_loss: 3.6754
    14. Epoch 6/100
    15. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0034 - exposure_loss: 2.9358 - illumination_smoothness_loss: 0.3693 - spatial_constancy_loss: 1.1878e-04 - total_loss: 3.3086 - val_color_constancy_loss: 0.0027 - val_exposure_loss: 2.8829 - val_illumination_smoothness_loss: 0.6316 - val_spatial_constancy_loss: 1.6075e-04 - val_total_loss: 3.5173
    16. Epoch 7/100
    17. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 65ms/step - color_constancy_loss: 0.0036 - exposure_loss: 2.9219 - illumination_smoothness_loss: 0.2996 - spatial_constancy_loss: 1.7723e-04 - total_loss: 3.2252 - val_color_constancy_loss: 0.0028 - val_exposure_loss: 2.8660 - val_illumination_smoothness_loss: 0.5261 - val_spatial_constancy_loss: 2.3790e-04 - val_total_loss: 3.3951
    18. Epoch 8/100
    19. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0037 - exposure_loss: 2.9056 - illumination_smoothness_loss: 0.2486 - spatial_constancy_loss: 2.5932e-04 - total_loss: 3.1582 - val_color_constancy_loss: 0.0029 - val_exposure_loss: 2.8466 - val_illumination_smoothness_loss: 0.4454 - val_spatial_constancy_loss: 3.4372e-04 - val_total_loss: 3.2952
    20. Epoch 9/100
    21. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0039 - exposure_loss: 2.8872 - illumination_smoothness_loss: 0.2110 - spatial_constancy_loss: 3.6800e-04 - total_loss: 3.1025 - val_color_constancy_loss: 0.0031 - val_exposure_loss: 2.8244 - val_illumination_smoothness_loss: 0.3853 - val_spatial_constancy_loss: 4.8290e-04 - val_total_loss: 3.2132
    22. Epoch 10/100
    23. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0041 - exposure_loss: 2.8665 - illumination_smoothness_loss: 0.1846 - spatial_constancy_loss: 5.0693e-04 - total_loss: 3.0558 - val_color_constancy_loss: 0.0033 - val_exposure_loss: 2.8002 - val_illumination_smoothness_loss: 0.3395 - val_spatial_constancy_loss: 6.5965e-04 - val_total_loss: 3.1436
    24. Epoch 11/100
    25. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0044 - exposure_loss: 2.8440 - illumination_smoothness_loss: 0.1654 - spatial_constancy_loss: 6.8036e-04 - total_loss: 3.0145 - val_color_constancy_loss: 0.0035 - val_exposure_loss: 2.7749 - val_illumination_smoothness_loss: 0.3031 - val_spatial_constancy_loss: 8.6824e-04 - val_total_loss: 3.0824
    26. Epoch 12/100
    27. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0047 - exposure_loss: 2.8198 - illumination_smoothness_loss: 0.1512 - spatial_constancy_loss: 8.9387e-04 - total_loss: 2.9765 - val_color_constancy_loss: 0.0038 - val_exposure_loss: 2.7463 - val_illumination_smoothness_loss: 0.2753 - val_spatial_constancy_loss: 0.0011 - val_total_loss: 3.0265
    28. Epoch 13/100
    29. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0050 - exposure_loss: 2.7928 - illumination_smoothness_loss: 0.1408 - spatial_constancy_loss: 0.0012 - total_loss: 2.9398 - val_color_constancy_loss: 0.0041 - val_exposure_loss: 2.7132 - val_illumination_smoothness_loss: 0.2537 - val_spatial_constancy_loss: 0.0015 - val_total_loss: 2.9724
    30. Epoch 14/100
    31. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0054 - exposure_loss: 2.7600 - illumination_smoothness_loss: 0.1340 - spatial_constancy_loss: 0.0016 - total_loss: 2.9009 - val_color_constancy_loss: 0.0045 - val_exposure_loss: 2.6673 - val_illumination_smoothness_loss: 0.2389 - val_spatial_constancy_loss: 0.0021 - val_total_loss: 2.9129
    32. Epoch 15/100
    33. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0060 - exposure_loss: 2.7115 - illumination_smoothness_loss: 0.1314 - spatial_constancy_loss: 0.0022 - total_loss: 2.8512 - val_color_constancy_loss: 0.0055 - val_exposure_loss: 2.5820 - val_illumination_smoothness_loss: 0.2374 - val_spatial_constancy_loss: 0.0035 - val_total_loss: 2.8284
    34. Epoch 16/100
    35. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0075 - exposure_loss: 2.6129 - illumination_smoothness_loss: 0.1414 - spatial_constancy_loss: 0.0041 - total_loss: 2.7660 - val_color_constancy_loss: 0.0081 - val_exposure_loss: 2.3797 - val_illumination_smoothness_loss: 0.2453 - val_spatial_constancy_loss: 0.0083 - val_total_loss: 2.6414
    36. Epoch 17/100
    37. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0128 - exposure_loss: 2.3149 - illumination_smoothness_loss: 0.1766 - spatial_constancy_loss: 0.0148 - total_loss: 2.5190 - val_color_constancy_loss: 0.0286 - val_exposure_loss: 1.5060 - val_illumination_smoothness_loss: 0.3288 - val_spatial_constancy_loss: 0.0648 - val_total_loss: 1.9282
    38. Epoch 18/100
    39. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0505 - exposure_loss: 1.3386 - illumination_smoothness_loss: 0.2606 - spatial_constancy_loss: 0.1196 - total_loss: 1.7693 - val_color_constancy_loss: 0.0827 - val_exposure_loss: 0.6645 - val_illumination_smoothness_loss: 0.2964 - val_spatial_constancy_loss: 0.2687 - val_total_loss: 1.3123
    40. Epoch 19/100
    41. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0873 - exposure_loss: 0.8174 - illumination_smoothness_loss: 0.2378 - spatial_constancy_loss: 0.2577 - total_loss: 1.4002 - val_color_constancy_loss: 0.0861 - val_exposure_loss: 0.6856 - val_illumination_smoothness_loss: 0.2464 - val_spatial_constancy_loss: 0.2539 - val_total_loss: 1.2719
    42. Epoch 20/100
    43. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0753 - exposure_loss: 0.8584 - illumination_smoothness_loss: 0.1858 - spatial_constancy_loss: 0.2394 - total_loss: 1.3589 - val_color_constancy_loss: 0.0882 - val_exposure_loss: 0.6714 - val_illumination_smoothness_loss: 0.2195 - val_spatial_constancy_loss: 0.2620 - val_total_loss: 1.2410
    44. Epoch 21/100
    45. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0779 - exposure_loss: 0.8382 - illumination_smoothness_loss: 0.1706 - spatial_constancy_loss: 0.2486 - total_loss: 1.3354 - val_color_constancy_loss: 0.0886 - val_exposure_loss: 0.6648 - val_illumination_smoothness_loss: 0.2072 - val_spatial_constancy_loss: 0.2643 - val_total_loss: 1.2249
    46. Epoch 22/100
    47. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0784 - exposure_loss: 0.8337 - illumination_smoothness_loss: 0.1590 - spatial_constancy_loss: 0.2502 - total_loss: 1.3212 - val_color_constancy_loss: 0.0889 - val_exposure_loss: 0.6647 - val_illumination_smoothness_loss: 0.1934 - val_spatial_constancy_loss: 0.2653 - val_total_loss: 1.2122
    48. Epoch 23/100
    49. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0783 - exposure_loss: 0.8329 - illumination_smoothness_loss: 0.1498 - spatial_constancy_loss: 0.2508 - total_loss: 1.3118 - val_color_constancy_loss: 0.0897 - val_exposure_loss: 0.6602 - val_illumination_smoothness_loss: 0.1834 - val_spatial_constancy_loss: 0.2671 - val_total_loss: 1.2003
    50. Epoch 24/100
    51. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0787 - exposure_loss: 0.8283 - illumination_smoothness_loss: 0.1426 - spatial_constancy_loss: 0.2529 - total_loss: 1.3025 - val_color_constancy_loss: 0.0897 - val_exposure_loss: 0.6601 - val_illumination_smoothness_loss: 0.1754 - val_spatial_constancy_loss: 0.2671 - val_total_loss: 1.1923
    52. Epoch 25/100
    53. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0785 - exposure_loss: 0.8294 - illumination_smoothness_loss: 0.1365 - spatial_constancy_loss: 0.2524 - total_loss: 1.2968 - val_color_constancy_loss: 0.0902 - val_exposure_loss: 0.6562 - val_illumination_smoothness_loss: 0.1672 - val_spatial_constancy_loss: 0.2692 - val_total_loss: 1.1828
    54. Epoch 26/100
    55. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0793 - exposure_loss: 0.8229 - illumination_smoothness_loss: 0.1316 - spatial_constancy_loss: 0.2554 - total_loss: 1.2892 - val_color_constancy_loss: 0.0896 - val_exposure_loss: 0.6567 - val_illumination_smoothness_loss: 0.1606 - val_spatial_constancy_loss: 0.2699 - val_total_loss: 1.1768
    56. Epoch 27/100
    57. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 65ms/step - color_constancy_loss: 0.0788 - exposure_loss: 0.8285 - illumination_smoothness_loss: 0.1238 - spatial_constancy_loss: 0.2534 - total_loss: 1.2845 - val_color_constancy_loss: 0.0906 - val_exposure_loss: 0.6519 - val_illumination_smoothness_loss: 0.1574 - val_spatial_constancy_loss: 0.2725 - val_total_loss: 1.1724
    58. Epoch 28/100
    59. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0794 - exposure_loss: 0.8247 - illumination_smoothness_loss: 0.1194 - spatial_constancy_loss: 0.2550 - total_loss: 1.2785 - val_color_constancy_loss: 0.0914 - val_exposure_loss: 0.6451 - val_illumination_smoothness_loss: 0.1542 - val_spatial_constancy_loss: 0.2783 - val_total_loss: 1.1689
    60. Epoch 29/100
    61. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0797 - exposure_loss: 0.8203 - illumination_smoothness_loss: 0.1139 - spatial_constancy_loss: 0.2577 - total_loss: 1.2715 - val_color_constancy_loss: 0.0914 - val_exposure_loss: 0.6468 - val_illumination_smoothness_loss: 0.1435 - val_spatial_constancy_loss: 0.2775 - val_total_loss: 1.1592
    62. Epoch 30/100
    63. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0795 - exposure_loss: 0.8199 - illumination_smoothness_loss: 0.1083 - spatial_constancy_loss: 0.2581 - total_loss: 1.2659 - val_color_constancy_loss: 0.0911 - val_exposure_loss: 0.6483 - val_illumination_smoothness_loss: 0.1336 - val_spatial_constancy_loss: 0.2768 - val_total_loss: 1.1498
    64. Epoch 31/100
    65. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0797 - exposure_loss: 0.8194 - illumination_smoothness_loss: 0.1037 - spatial_constancy_loss: 0.2589 - total_loss: 1.2617 - val_color_constancy_loss: 0.0912 - val_exposure_loss: 0.6483 - val_illumination_smoothness_loss: 0.1289 - val_spatial_constancy_loss: 0.2772 - val_total_loss: 1.1456
    66. Epoch 32/100
    67. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0794 - exposure_loss: 0.8226 - illumination_smoothness_loss: 0.0982 - spatial_constancy_loss: 0.2578 - total_loss: 1.2580 - val_color_constancy_loss: 0.0923 - val_exposure_loss: 0.6421 - val_illumination_smoothness_loss: 0.1251 - val_spatial_constancy_loss: 0.2814 - val_total_loss: 1.1409
    68. Epoch 33/100
    69. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0801 - exposure_loss: 0.8188 - illumination_smoothness_loss: 0.0939 - spatial_constancy_loss: 0.2601 - total_loss: 1.2529 - val_color_constancy_loss: 0.0934 - val_exposure_loss: 0.6367 - val_illumination_smoothness_loss: 0.1261 - val_spatial_constancy_loss: 0.2853 - val_total_loss: 1.1416
    70. Epoch 34/100
    71. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0802 - exposure_loss: 0.8173 - illumination_smoothness_loss: 0.0889 - spatial_constancy_loss: 0.2611 - total_loss: 1.2475 - val_color_constancy_loss: 0.0941 - val_exposure_loss: 0.6326 - val_illumination_smoothness_loss: 0.1227 - val_spatial_constancy_loss: 0.2883 - val_total_loss: 1.1378
    72. Epoch 35/100
    73. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 65ms/step - color_constancy_loss: 0.0807 - exposure_loss: 0.8134 - illumination_smoothness_loss: 0.0844 - spatial_constancy_loss: 0.2632 - total_loss: 1.2418 - val_color_constancy_loss: 0.0946 - val_exposure_loss: 0.6312 - val_illumination_smoothness_loss: 0.1180 - val_spatial_constancy_loss: 0.2893 - val_total_loss: 1.1330
    74. Epoch 36/100
    75. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0808 - exposure_loss: 0.8119 - illumination_smoothness_loss: 0.0798 - spatial_constancy_loss: 0.2644 - total_loss: 1.2368 - val_color_constancy_loss: 0.0941 - val_exposure_loss: 0.6351 - val_illumination_smoothness_loss: 0.1096 - val_spatial_constancy_loss: 0.2865 - val_total_loss: 1.1253
    76. Epoch 37/100
    77. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0807 - exposure_loss: 0.8127 - illumination_smoothness_loss: 0.0759 - spatial_constancy_loss: 0.2637 - total_loss: 1.2330 - val_color_constancy_loss: 0.0949 - val_exposure_loss: 0.6295 - val_illumination_smoothness_loss: 0.1088 - val_spatial_constancy_loss: 0.2904 - val_total_loss: 1.1237
    78. Epoch 38/100
    79. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0812 - exposure_loss: 0.8091 - illumination_smoothness_loss: 0.0732 - spatial_constancy_loss: 0.2658 - total_loss: 1.2293 - val_color_constancy_loss: 0.0946 - val_exposure_loss: 0.6313 - val_illumination_smoothness_loss: 0.1022 - val_spatial_constancy_loss: 0.2893 - val_total_loss: 1.1174
    80. Epoch 39/100
    81. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0810 - exposure_loss: 0.8100 - illumination_smoothness_loss: 0.0694 - spatial_constancy_loss: 0.2655 - total_loss: 1.2259 - val_color_constancy_loss: 0.0953 - val_exposure_loss: 0.6278 - val_illumination_smoothness_loss: 0.1015 - val_spatial_constancy_loss: 0.2918 - val_total_loss: 1.1164
    82. Epoch 40/100
    83. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0813 - exposure_loss: 0.8077 - illumination_smoothness_loss: 0.0668 - spatial_constancy_loss: 0.2668 - total_loss: 1.2226 - val_color_constancy_loss: 0.0951 - val_exposure_loss: 0.6294 - val_illumination_smoothness_loss: 0.0950 - val_spatial_constancy_loss: 0.2907 - val_total_loss: 1.1103
    84. Epoch 41/100
    85. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0814 - exposure_loss: 0.8074 - illumination_smoothness_loss: 0.0639 - spatial_constancy_loss: 0.2669 - total_loss: 1.2195 - val_color_constancy_loss: 0.0955 - val_exposure_loss: 0.6263 - val_illumination_smoothness_loss: 0.0946 - val_spatial_constancy_loss: 0.2930 - val_total_loss: 1.1093
    86. Epoch 42/100
    87. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0816 - exposure_loss: 0.8056 - illumination_smoothness_loss: 0.0613 - spatial_constancy_loss: 0.2684 - total_loss: 1.2168 - val_color_constancy_loss: 0.0950 - val_exposure_loss: 0.6304 - val_illumination_smoothness_loss: 0.0876 - val_spatial_constancy_loss: 0.2900 - val_total_loss: 1.1031
    88. Epoch 43/100
    89. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0813 - exposure_loss: 0.8074 - illumination_smoothness_loss: 0.0582 - spatial_constancy_loss: 0.2671 - total_loss: 1.2140 - val_color_constancy_loss: 0.0953 - val_exposure_loss: 0.6271 - val_illumination_smoothness_loss: 0.0859 - val_spatial_constancy_loss: 0.2925 - val_total_loss: 1.1008
    90. Epoch 44/100
    91. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0816 - exposure_loss: 0.8048 - illumination_smoothness_loss: 0.0564 - spatial_constancy_loss: 0.2687 - total_loss: 1.2115 - val_color_constancy_loss: 0.0956 - val_exposure_loss: 0.6266 - val_illumination_smoothness_loss: 0.0837 - val_spatial_constancy_loss: 0.2930 - val_total_loss: 1.0988
    92. Epoch 45/100
    93. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0816 - exposure_loss: 0.8045 - illumination_smoothness_loss: 0.0541 - spatial_constancy_loss: 0.2690 - total_loss: 1.2093 - val_color_constancy_loss: 0.0955 - val_exposure_loss: 0.6275 - val_illumination_smoothness_loss: 0.0796 - val_spatial_constancy_loss: 0.2923 - val_total_loss: 1.0949
    94. Epoch 46/100
    95. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0816 - exposure_loss: 0.8043 - illumination_smoothness_loss: 0.0517 - spatial_constancy_loss: 0.2691 - total_loss: 1.2067 - val_color_constancy_loss: 0.0959 - val_exposure_loss: 0.6245 - val_illumination_smoothness_loss: 0.0790 - val_spatial_constancy_loss: 0.2945 - val_total_loss: 1.0939
    96. Epoch 47/100
    97. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0819 - exposure_loss: 0.8025 - illumination_smoothness_loss: 0.0505 - spatial_constancy_loss: 0.2701 - total_loss: 1.2050 - val_color_constancy_loss: 0.0960 - val_exposure_loss: 0.6242 - val_illumination_smoothness_loss: 0.0764 - val_spatial_constancy_loss: 0.2949 - val_total_loss: 1.0914
    98. Epoch 48/100
    99. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0819 - exposure_loss: 0.8021 - illumination_smoothness_loss: 0.0482 - spatial_constancy_loss: 0.2706 - total_loss: 1.2027 - val_color_constancy_loss: 0.0957 - val_exposure_loss: 0.6262 - val_illumination_smoothness_loss: 0.0721 - val_spatial_constancy_loss: 0.2934 - val_total_loss: 1.0874
    100. Epoch 49/100
    101. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0818 - exposure_loss: 0.8027 - illumination_smoothness_loss: 0.0463 - spatial_constancy_loss: 0.2702 - total_loss: 1.2010 - val_color_constancy_loss: 0.0959 - val_exposure_loss: 0.6244 - val_illumination_smoothness_loss: 0.0712 - val_spatial_constancy_loss: 0.2947 - val_total_loss: 1.0863
    102. Epoch 50/100
    103. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0820 - exposure_loss: 0.8015 - illumination_smoothness_loss: 0.0446 - spatial_constancy_loss: 0.2711 - total_loss: 1.1992 - val_color_constancy_loss: 0.0959 - val_exposure_loss: 0.6248 - val_illumination_smoothness_loss: 0.0688 - val_spatial_constancy_loss: 0.2945 - val_total_loss: 1.0839
    104. Epoch 51/100
    105. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0819 - exposure_loss: 0.8019 - illumination_smoothness_loss: 0.0429 - spatial_constancy_loss: 0.2707 - total_loss: 1.1974 - val_color_constancy_loss: 0.0964 - val_exposure_loss: 0.6224 - val_illumination_smoothness_loss: 0.0677 - val_spatial_constancy_loss: 0.2964 - val_total_loss: 1.0829
    106. Epoch 52/100
    107. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0823 - exposure_loss: 0.7996 - illumination_smoothness_loss: 0.0416 - spatial_constancy_loss: 0.2721 - total_loss: 1.1955 - val_color_constancy_loss: 0.0958 - val_exposure_loss: 0.6240 - val_illumination_smoothness_loss: 0.0644 - val_spatial_constancy_loss: 0.2951 - val_total_loss: 1.0793
    108. Epoch 53/100
    109. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0822 - exposure_loss: 0.8004 - illumination_smoothness_loss: 0.0399 - spatial_constancy_loss: 0.2717 - total_loss: 1.1941 - val_color_constancy_loss: 0.0960 - val_exposure_loss: 0.6234 - val_illumination_smoothness_loss: 0.0633 - val_spatial_constancy_loss: 0.2957 - val_total_loss: 1.0785
    110. Epoch 54/100
    111. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0823 - exposure_loss: 0.7997 - illumination_smoothness_loss: 0.0382 - spatial_constancy_loss: 0.2723 - total_loss: 1.1924 - val_color_constancy_loss: 0.0959 - val_exposure_loss: 0.6242 - val_illumination_smoothness_loss: 0.0591 - val_spatial_constancy_loss: 0.2951 - val_total_loss: 1.0744
    112. Epoch 55/100
    113. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0822 - exposure_loss: 0.7999 - illumination_smoothness_loss: 0.0362 - spatial_constancy_loss: 0.2721 - total_loss: 1.1904 - val_color_constancy_loss: 0.0965 - val_exposure_loss: 0.6211 - val_illumination_smoothness_loss: 0.0603 - val_spatial_constancy_loss: 0.2974 - val_total_loss: 1.0754
    114. Epoch 56/100
    115. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0825 - exposure_loss: 0.7983 - illumination_smoothness_loss: 0.0351 - spatial_constancy_loss: 0.2732 - total_loss: 1.1890 - val_color_constancy_loss: 0.0960 - val_exposure_loss: 0.6237 - val_illumination_smoothness_loss: 0.0547 - val_spatial_constancy_loss: 0.2955 - val_total_loss: 1.0699
    116. Epoch 57/100
    117. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0823 - exposure_loss: 0.7987 - illumination_smoothness_loss: 0.0331 - spatial_constancy_loss: 0.2730 - total_loss: 1.1871 - val_color_constancy_loss: 0.0963 - val_exposure_loss: 0.6236 - val_illumination_smoothness_loss: 0.0540 - val_spatial_constancy_loss: 0.2956 - val_total_loss: 1.0694
    118. Epoch 58/100
    119. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0823 - exposure_loss: 0.7990 - illumination_smoothness_loss: 0.0319 - spatial_constancy_loss: 0.2727 - total_loss: 1.1859 - val_color_constancy_loss: 0.0965 - val_exposure_loss: 0.6210 - val_illumination_smoothness_loss: 0.0537 - val_spatial_constancy_loss: 0.2976 - val_total_loss: 1.0688
    120. Epoch 59/100
    121. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0826 - exposure_loss: 0.7969 - illumination_smoothness_loss: 0.0315 - spatial_constancy_loss: 0.2740 - total_loss: 1.1850 - val_color_constancy_loss: 0.0966 - val_exposure_loss: 0.6208 - val_illumination_smoothness_loss: 0.0530 - val_spatial_constancy_loss: 0.2978 - val_total_loss: 1.0682
    122. Epoch 60/100
    123. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0824 - exposure_loss: 0.7971 - illumination_smoothness_loss: 0.0304 - spatial_constancy_loss: 0.2740 - total_loss: 1.1840 - val_color_constancy_loss: 0.0966 - val_exposure_loss: 0.6206 - val_illumination_smoothness_loss: 0.0516 - val_spatial_constancy_loss: 0.2979 - val_total_loss: 1.0667
    124. Epoch 61/100
    125. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0825 - exposure_loss: 0.7969 - illumination_smoothness_loss: 0.0295 - spatial_constancy_loss: 0.2741 - total_loss: 1.1829 - val_color_constancy_loss: 0.0969 - val_exposure_loss: 0.6194 - val_illumination_smoothness_loss: 0.0506 - val_spatial_constancy_loss: 0.2988 - val_total_loss: 1.0657
    126. Epoch 62/100
    127. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0827 - exposure_loss: 0.7954 - illumination_smoothness_loss: 0.0287 - spatial_constancy_loss: 0.2749 - total_loss: 1.1817 - val_color_constancy_loss: 0.0967 - val_exposure_loss: 0.6203 - val_illumination_smoothness_loss: 0.0494 - val_spatial_constancy_loss: 0.2981 - val_total_loss: 1.0644
    128. Epoch 63/100
    129. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0825 - exposure_loss: 0.7966 - illumination_smoothness_loss: 0.0278 - spatial_constancy_loss: 0.2742 - total_loss: 1.1810 - val_color_constancy_loss: 0.0971 - val_exposure_loss: 0.6184 - val_illumination_smoothness_loss: 0.0491 - val_spatial_constancy_loss: 0.2996 - val_total_loss: 1.0642
    130. Epoch 64/100
    131. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 67ms/step - color_constancy_loss: 0.0827 - exposure_loss: 0.7949 - illumination_smoothness_loss: 0.0268 - spatial_constancy_loss: 0.2753 - total_loss: 1.1797 - val_color_constancy_loss: 0.0969 - val_exposure_loss: 0.6199 - val_illumination_smoothness_loss: 0.0460 - val_spatial_constancy_loss: 0.2984 - val_total_loss: 1.0611
    132. Epoch 65/100
    133. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0826 - exposure_loss: 0.7957 - illumination_smoothness_loss: 0.0254 - spatial_constancy_loss: 0.2748 - total_loss: 1.1785 - val_color_constancy_loss: 0.0976 - val_exposure_loss: 0.6180 - val_illumination_smoothness_loss: 0.0464 - val_spatial_constancy_loss: 0.2998 - val_total_loss: 1.0618
    134. Epoch 66/100
    135. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0827 - exposure_loss: 0.7948 - illumination_smoothness_loss: 0.0249 - spatial_constancy_loss: 0.2753 - total_loss: 1.1777 - val_color_constancy_loss: 0.0975 - val_exposure_loss: 0.6189 - val_illumination_smoothness_loss: 0.0448 - val_spatial_constancy_loss: 0.2991 - val_total_loss: 1.0602
    136. Epoch 67/100
    137. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0825 - exposure_loss: 0.7954 - illumination_smoothness_loss: 0.0241 - spatial_constancy_loss: 0.2750 - total_loss: 1.1770 - val_color_constancy_loss: 0.0977 - val_exposure_loss: 0.6179 - val_illumination_smoothness_loss: 0.0441 - val_spatial_constancy_loss: 0.2998 - val_total_loss: 1.0595
    138. Epoch 68/100
    139. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0827 - exposure_loss: 0.7946 - illumination_smoothness_loss: 0.0231 - spatial_constancy_loss: 0.2757 - total_loss: 1.1761 - val_color_constancy_loss: 0.0973 - val_exposure_loss: 0.6198 - val_illumination_smoothness_loss: 0.0410 - val_spatial_constancy_loss: 0.2980 - val_total_loss: 1.0562
    140. Epoch 69/100
    141. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0826 - exposure_loss: 0.7947 - illumination_smoothness_loss: 0.0226 - spatial_constancy_loss: 0.2752 - total_loss: 1.1752 - val_color_constancy_loss: 0.0979 - val_exposure_loss: 0.6170 - val_illumination_smoothness_loss: 0.0435 - val_spatial_constancy_loss: 0.3003 - val_total_loss: 1.0587
    142. Epoch 70/100
    143. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0828 - exposure_loss: 0.7940 - illumination_smoothness_loss: 0.0224 - spatial_constancy_loss: 0.2758 - total_loss: 1.1749 - val_color_constancy_loss: 0.0976 - val_exposure_loss: 0.6182 - val_illumination_smoothness_loss: 0.0414 - val_spatial_constancy_loss: 0.2994 - val_total_loss: 1.0566
    144. Epoch 71/100
    145. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0827 - exposure_loss: 0.7941 - illumination_smoothness_loss: 0.0216 - spatial_constancy_loss: 0.2758 - total_loss: 1.1742 - val_color_constancy_loss: 0.0974 - val_exposure_loss: 0.6189 - val_illumination_smoothness_loss: 0.0389 - val_spatial_constancy_loss: 0.2986 - val_total_loss: 1.0538
    146. Epoch 72/100
    147. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0827 - exposure_loss: 0.7941 - illumination_smoothness_loss: 0.0211 - spatial_constancy_loss: 0.2755 - total_loss: 1.1734 - val_color_constancy_loss: 0.0979 - val_exposure_loss: 0.6166 - val_illumination_smoothness_loss: 0.0420 - val_spatial_constancy_loss: 0.3005 - val_total_loss: 1.0571
    148. Epoch 73/100
    149. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0828 - exposure_loss: 0.7935 - illumination_smoothness_loss: 0.0214 - spatial_constancy_loss: 0.2759 - total_loss: 1.1735 - val_color_constancy_loss: 0.0977 - val_exposure_loss: 0.6172 - val_illumination_smoothness_loss: 0.0401 - val_spatial_constancy_loss: 0.3001 - val_total_loss: 1.0551
    150. Epoch 74/100
    151. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0828 - exposure_loss: 0.7935 - illumination_smoothness_loss: 0.0205 - spatial_constancy_loss: 0.2760 - total_loss: 1.1727 - val_color_constancy_loss: 0.0978 - val_exposure_loss: 0.6168 - val_illumination_smoothness_loss: 0.0395 - val_spatial_constancy_loss: 0.3005 - val_total_loss: 1.0546
    152. Epoch 75/100
    153. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0828 - exposure_loss: 0.7924 - illumination_smoothness_loss: 0.0204 - spatial_constancy_loss: 0.2764 - total_loss: 1.1721 - val_color_constancy_loss: 0.0977 - val_exposure_loss: 0.6176 - val_illumination_smoothness_loss: 0.0385 - val_spatial_constancy_loss: 0.2997 - val_total_loss: 1.0536
    154. Epoch 76/100
    155. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0827 - exposure_loss: 0.7933 - illumination_smoothness_loss: 0.0198 - spatial_constancy_loss: 0.2760 - total_loss: 1.1718 - val_color_constancy_loss: 0.0979 - val_exposure_loss: 0.6166 - val_illumination_smoothness_loss: 0.0376 - val_spatial_constancy_loss: 0.3002 - val_total_loss: 1.0524
    156. Epoch 77/100
    157. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0828 - exposure_loss: 0.7925 - illumination_smoothness_loss: 0.0195 - spatial_constancy_loss: 0.2763 - total_loss: 1.1710 - val_color_constancy_loss: 0.0979 - val_exposure_loss: 0.6170 - val_illumination_smoothness_loss: 0.0384 - val_spatial_constancy_loss: 0.2999 - val_total_loss: 1.0532
    158. Epoch 78/100
    159. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0827 - exposure_loss: 0.7929 - illumination_smoothness_loss: 0.0196 - spatial_constancy_loss: 0.2761 - total_loss: 1.1713 - val_color_constancy_loss: 0.0979 - val_exposure_loss: 0.6170 - val_illumination_smoothness_loss: 0.0369 - val_spatial_constancy_loss: 0.3000 - val_total_loss: 1.0518
    160. Epoch 79/100
    161. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0828 - exposure_loss: 0.7922 - illumination_smoothness_loss: 0.0192 - spatial_constancy_loss: 0.2763 - total_loss: 1.1704 - val_color_constancy_loss: 0.0981 - val_exposure_loss: 0.6157 - val_illumination_smoothness_loss: 0.0380 - val_spatial_constancy_loss: 0.3009 - val_total_loss: 1.0527
    162. Epoch 80/100
    163. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0828 - exposure_loss: 0.7918 - illumination_smoothness_loss: 0.0191 - spatial_constancy_loss: 0.2766 - total_loss: 1.1703 - val_color_constancy_loss: 0.0980 - val_exposure_loss: 0.6159 - val_illumination_smoothness_loss: 0.0373 - val_spatial_constancy_loss: 0.3004 - val_total_loss: 1.0516
    164. Epoch 81/100
    165. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0828 - exposure_loss: 0.7917 - illumination_smoothness_loss: 0.0190 - spatial_constancy_loss: 0.2764 - total_loss: 1.1699 - val_color_constancy_loss: 0.0981 - val_exposure_loss: 0.6153 - val_illumination_smoothness_loss: 0.0373 - val_spatial_constancy_loss: 0.3009 - val_total_loss: 1.0516
    166. Epoch 82/100
    167. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 66ms/step - color_constancy_loss: 0.0829 - exposure_loss: 0.7915 - illumination_smoothness_loss: 0.0187 - spatial_constancy_loss: 0.2766 - total_loss: 1.1697 - val_color_constancy_loss: 0.0979 - val_exposure_loss: 0.6170 - val_illumination_smoothness_loss: 0.0348 - val_spatial_constancy_loss: 0.2996 - val_total_loss: 1.0493
    168. Epoch 83/100
    169. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 65ms/step - color_constancy_loss: 0.0828 - exposure_loss: 0.7918 - illumination_smoothness_loss: 0.0182 - spatial_constancy_loss: 0.2763 - total_loss: 1.1691 - val_color_constancy_loss: 0.0980 - val_exposure_loss: 0.6158 - val_illumination_smoothness_loss: 0.0358 - val_spatial_constancy_loss: 0.3004 - val_total_loss: 1.0500
    170. Epoch 84/100
    171. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 65ms/step - color_constancy_loss: 0.0829 - exposure_loss: 0.7911 - illumination_smoothness_loss: 0.0184 - spatial_constancy_loss: 0.2766 - total_loss: 1.1689 - val_color_constancy_loss: 0.0982 - val_exposure_loss: 0.6146 - val_illumination_smoothness_loss: 0.0366 - val_spatial_constancy_loss: 0.3010 - val_total_loss: 1.0505
    172. Epoch 85/100
    173. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0829 - exposure_loss: 0.7907 - illumination_smoothness_loss: 0.0185 - spatial_constancy_loss: 0.2767 - total_loss: 1.1687 - val_color_constancy_loss: 0.0980 - val_exposure_loss: 0.6154 - val_illumination_smoothness_loss: 0.0361 - val_spatial_constancy_loss: 0.3006 - val_total_loss: 1.0501
    174. Epoch 86/100
    175. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 65ms/step - color_constancy_loss: 0.0828 - exposure_loss: 0.7910 - illumination_smoothness_loss: 0.0182 - spatial_constancy_loss: 0.2765 - total_loss: 1.1685 - val_color_constancy_loss: 0.0982 - val_exposure_loss: 0.6145 - val_illumination_smoothness_loss: 0.0356 - val_spatial_constancy_loss: 0.3009 - val_total_loss: 1.0492
    176. Epoch 87/100
    177. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0829 - exposure_loss: 0.7902 - illumination_smoothness_loss: 0.0181 - spatial_constancy_loss: 0.2767 - total_loss: 1.1680 - val_color_constancy_loss: 0.0981 - val_exposure_loss: 0.6149 - val_illumination_smoothness_loss: 0.0357 - val_spatial_constancy_loss: 0.3007 - val_total_loss: 1.0494
    178. Epoch 88/100
    179. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0829 - exposure_loss: 0.7904 - illumination_smoothness_loss: 0.0180 - spatial_constancy_loss: 0.2766 - total_loss: 1.1679 - val_color_constancy_loss: 0.0983 - val_exposure_loss: 0.6133 - val_illumination_smoothness_loss: 0.0359 - val_spatial_constancy_loss: 0.3015 - val_total_loss: 1.0491
    180. Epoch 89/100
    181. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0830 - exposure_loss: 0.7893 - illumination_smoothness_loss: 0.0181 - spatial_constancy_loss: 0.2770 - total_loss: 1.1674 - val_color_constancy_loss: 0.0981 - val_exposure_loss: 0.6148 - val_illumination_smoothness_loss: 0.0350 - val_spatial_constancy_loss: 0.3006 - val_total_loss: 1.0484
    182. Epoch 90/100
    183. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0829 - exposure_loss: 0.7901 - illumination_smoothness_loss: 0.0178 - spatial_constancy_loss: 0.2765 - total_loss: 1.1673 - val_color_constancy_loss: 0.0984 - val_exposure_loss: 0.6128 - val_illumination_smoothness_loss: 0.0358 - val_spatial_constancy_loss: 0.3017 - val_total_loss: 1.0487
    184. Epoch 91/100
    185. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0831 - exposure_loss: 0.7886 - illumination_smoothness_loss: 0.0181 - spatial_constancy_loss: 0.2771 - total_loss: 1.1669 - val_color_constancy_loss: 0.0981 - val_exposure_loss: 0.6142 - val_illumination_smoothness_loss: 0.0351 - val_spatial_constancy_loss: 0.3007 - val_total_loss: 1.0481
    186. Epoch 92/100
    187. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0829 - exposure_loss: 0.7895 - illumination_smoothness_loss: 0.0177 - spatial_constancy_loss: 0.2766 - total_loss: 1.1668 - val_color_constancy_loss: 0.0983 - val_exposure_loss: 0.6133 - val_illumination_smoothness_loss: 0.0349 - val_spatial_constancy_loss: 0.3011 - val_total_loss: 1.0476
    188. Epoch 93/100
    189. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0831 - exposure_loss: 0.7884 - illumination_smoothness_loss: 0.0179 - spatial_constancy_loss: 0.2770 - total_loss: 1.1664 - val_color_constancy_loss: 0.0984 - val_exposure_loss: 0.6125 - val_illumination_smoothness_loss: 0.0355 - val_spatial_constancy_loss: 0.3014 - val_total_loss: 1.0478
    190. Epoch 94/100
    191. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 65ms/step - color_constancy_loss: 0.0831 - exposure_loss: 0.7882 - illumination_smoothness_loss: 0.0181 - spatial_constancy_loss: 0.2769 - total_loss: 1.1663 - val_color_constancy_loss: 0.0983 - val_exposure_loss: 0.6128 - val_illumination_smoothness_loss: 0.0349 - val_spatial_constancy_loss: 0.3012 - val_total_loss: 1.0473
    192. Epoch 95/100
    193. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0831 - exposure_loss: 0.7881 - illumination_smoothness_loss: 0.0179 - spatial_constancy_loss: 0.2770 - total_loss: 1.1660 - val_color_constancy_loss: 0.0983 - val_exposure_loss: 0.6130 - val_illumination_smoothness_loss: 0.0341 - val_spatial_constancy_loss: 0.3009 - val_total_loss: 1.0462
    194. Epoch 96/100
    195. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0832 - exposure_loss: 0.7874 - illumination_smoothness_loss: 0.0179 - spatial_constancy_loss: 0.2771 - total_loss: 1.1656 - val_color_constancy_loss: 0.0983 - val_exposure_loss: 0.6125 - val_illumination_smoothness_loss: 0.0353 - val_spatial_constancy_loss: 0.3010 - val_total_loss: 1.0471
    196. Epoch 97/100
    197. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0830 - exposure_loss: 0.7882 - illumination_smoothness_loss: 0.0181 - spatial_constancy_loss: 0.2765 - total_loss: 1.1658 - val_color_constancy_loss: 0.0984 - val_exposure_loss: 0.6120 - val_illumination_smoothness_loss: 0.0346 - val_spatial_constancy_loss: 0.3014 - val_total_loss: 1.0464
    198. Epoch 98/100
    199. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 63ms/step - color_constancy_loss: 0.0832 - exposure_loss: 0.7869 - illumination_smoothness_loss: 0.0180 - spatial_constancy_loss: 0.2772 - total_loss: 1.1653 - val_color_constancy_loss: 0.0984 - val_exposure_loss: 0.6118 - val_illumination_smoothness_loss: 0.0344 - val_spatial_constancy_loss: 0.3012 - val_total_loss: 1.0458
    200. Epoch 99/100
    201. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0832 - exposure_loss: 0.7863 - illumination_smoothness_loss: 0.0182 - spatial_constancy_loss: 0.2772 - total_loss: 1.1650 - val_color_constancy_loss: 0.0983 - val_exposure_loss: 0.6120 - val_illumination_smoothness_loss: 0.0343 - val_spatial_constancy_loss: 0.3007 - val_total_loss: 1.0453
    202. Epoch 100/100
    203. 25/25 ━━━━━━━━━━━━━━━━━━━━ 2s 64ms/step - color_constancy_loss: 0.0831 - exposure_loss: 0.7873 - illumination_smoothness_loss: 0.0180 - spatial_constancy_loss: 0.2765 - total_loss: 1.1649 - val_color_constancy_loss: 0.0984 - val_exposure_loss: 0.6115 - val_illumination_smoothness_loss: 0.0341 - val_spatial_constancy_loss: 0.3011 - val_total_loss: 1.0451

Inference

def plot_results(images, titles, figure_size=(12, 12)):
    fig = plt.figure(figsize=figure_size)
    for i in range(len(images)):
        fig.add_subplot(1, len(images), i + 1).set_title(titles[i])
        _ = plt.imshow(images[i])
        plt.axis("off")
    plt.show()


def infer(original_image):
    image = keras.utils.img_to_array(original_image)
    image = image.astype("float32") / 255.0
    image = np.expand_dims(image, axis=0)
    output_image = zero_dce_model(image)
    output_image = tf.cast((output_image[0, :, :, :] * 255), dtype=np.uint8)
    output_image = Image.fromarray(output_image.numpy())
    return output_image

Inference on test images


We compare the test images from the LoL dataset enhanced by Zero-DCE with the same images enhanced via the PIL.ImageOps.autocontrast() function.

You can use the trained model hosted on Hugging Face Hub and try out a demo on Hugging Face Spaces.

for val_image_file in test_low_light_images:
    original_image = Image.open(val_image_file)
    enhanced_image = infer(original_image)
    plot_results(
        [original_image, ImageOps.autocontrast(original_image), enhanced_image],
        ["Original", "PIL Autocontrast", "Enhanced"],
        (20, 12),
    )
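If you also want to keep the enhanced results on disk rather than only plotting them, a small hedged variant of the loop above could look like the sketch below; the output directory name is purely an illustrative choice and is not part of the original example.

# Optional sketch: write the Zero-DCE-enhanced test images to disk.
output_dir = "enhanced_results"  # hypothetical output directory
os.makedirs(output_dir, exist_ok=True)
for val_image_file in test_low_light_images:
    enhanced = infer(Image.open(val_image_file))
    enhanced.save(os.path.join(output_dir, os.path.basename(val_image_file)))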


Original post: https://blog.csdn.net/snowdenkeke/article/details/138144156