remat : The custom_gradient decorator currently supports keywords arguments only when eager execution is enabled. #21207

Open
innat opened this issue Apr 24, 2025 · 0 comments

innat commented Apr 24, 2025

This error occurs while training a model built under RematScope (remat).

import keras
from keras import layers
import tensorflow as tf
import numpy as np

from keras import RematScope


def with_remat(mode):
    with RematScope(mode=mode):

        base_model = keras.applications.DenseNet121(
            weights='imagenet',  
            input_shape=(224,224,3),
            include_top=False
        )  
        inputs = keras.Input(shape=(224,224,3))
        x = base_model(inputs)
        x = keras.layers.GlobalAveragePooling2D()(x)
        outputs = keras.layers.Dense(10, activation='softmax')(x)
        custom_model = keras.Model(inputs, outputs)

    # compile the model
    custom_model.compile(
        optimizer=keras.optimizers.Adam(),
        loss=keras.losses.CategoricalCrossentropy(),
        metrics=[
            keras.metrics.TopKCategoricalAccuracy(k=3, name='acc_top3'),
            keras.metrics.TopKCategoricalAccuracy(k=1, name='acc_top1')
            ]
        )



    # data 
    (x_train, y_train), (_, _) = keras.datasets.mnist.load_data()
    x_train, y_train = x_train[:5000], y_train[:5000]

    x_train = np.expand_dims(x_train, axis=-1)
    x_train = np.repeat(x_train, 3, axis=-1)
    x_train = x_train.astype('float32') / 255
    x_train = tf.image.resize(x_train, [224,224])
    y_train = tf.one_hot(y_train, depth=10)
    custom_model.fit(x_train, y_train, batch_size=6, epochs=10, verbose=1)


with_remat(mode='full')
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-10-839581e84b79> in <cell line: 0>()
----> 1 with_remat(mode='full')

4 frames
/usr/local/lib/python3.11/dist-packages/keras/src/utils/traceback_utils.py in error_handler(*args, **kwargs)
    122             raise e.with_traceback(filtered_tb) from None
    123         finally:
--> 124             del filtered_tb
    125 
    126     return error_handler

ValueError: Exception encountered when calling Functional.call().

The custom_gradient decorator currently supports keywords arguments only when eager execution is enabled.

Arguments received by Functional.call():
  • inputs=tf.Tensor(shape=(None, 224, 224, 3), dtype=float32)
  • training=True
  • mask=None
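The failure can be isolated from Keras entirely: `tf.custom_gradient` rejects keyword arguments whenever the decorated function is traced in graph mode (e.g. inside `tf.function`, which `Model.fit` uses by default), but accepts them eagerly. A minimal sketch of that restriction (the `scale` function and its `factor` argument are made up for illustration):

```python
import tensorflow as tf

@tf.custom_gradient
def scale(x, factor=3.0):
    # Toy op: multiply by a constant, with a hand-written gradient.
    def grad(dy):
        return dy * factor
    return x * factor, grad

# Eager execution: the keyword argument is accepted.
y = scale(tf.constant(2.0), factor=3.0)  # -> 6.0

# Graph mode: tracing the same call raises
#   ValueError: The custom_gradient decorator currently supports
#   keywords arguments only when eager execution is enabled.
graph_scale = tf.function(lambda t: scale(t, factor=3.0))
```

Since remat is implemented on top of `tf.custom_gradient` on the TensorFlow backend, the same restriction surfaces once `fit` compiles the train step into a graph.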

Additional questions:

  1. What is the future of the keras.applications API?
  2. Does RematScope also work in a data-parallel or model-parallel setup?