
Clippy_adagrad raised an error: NotImplementedError: _resource_apply_dense must be implemented in subclasses. #751

Open
ArachisTong opened this issue Jan 16, 2025 · 0 comments


ArachisTong commented Jan 16, 2025

Environment:
gcc 9.3.0
Ubuntu 20.04.4 LTS
Python 3.10.16
tensorflow 2.9.0
tensorflow-estimator 2.9.0
tensorflow-io 0.37.1
tensorflow-io-gcs-filesystem 0.37.1
tensorflow-recommenders 0.7.3
keras 2.9.0

Traceback:

  File "/data1/env/miniconda3/envs/tf2.18/lib/python3.10/site-packages/keras/optimizers/optimizer_v2/optimizer_v2.py", line 539, in minimize
    return self.apply_gradients(grads_and_vars, name=name)
  File "/data1/env/miniconda3/envs/tf2.18/lib/python3.10/site-packages/keras/optimizers/optimizer_v2/optimizer_v2.py", line 678, in apply_gradients
    return tf.__internal__.distribute.interim.maybe_merge_call(
  File "/data1/env/miniconda3/envs/tf2.18/lib/python3.10/site-packages/keras/optimizers/optimizer_v2/optimizer_v2.py", line 723, in _distributed_apply
    update_op = distribution.extended.update(
  File "/data1/env/miniconda3/envs/tf2.18/lib/python3.10/site-packages/keras/optimizers/optimizer_v2/optimizer_v2.py", line 706, in apply_grad_to_update_var  **
    update_op = self._resource_apply_dense(grad, var, **apply_kwargs)
  File "/data1/env/miniconda3/envs/tf2.18/lib/python3.10/site-packages/keras/optimizers/optimizer_v2/optimizer_v2.py", line 1298, in _resource_apply_dense
    raise NotImplementedError("_resource_apply_dense must be implemented in "

NotImplementedError: _resource_apply_dense must be implemented in subclasses.
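For context on the error: the `OptimizerV2` base class in Keras raises this `NotImplementedError` from a stub whenever `apply_gradients` reaches an optimizer subclass that has not overridden `_resource_apply_dense`, which is what the traceback shows happening for `Clippy_adagrad`. Below is a minimal sketch (hypothetical names, not the `Clippy_adagrad` source) of an `OptimizerV2` subclass that does implement the dense update path and therefore does not hit this stub:

```python
import tensorflow as tf
from keras.optimizers.optimizer_v2 import optimizer_v2  # same module as in the traceback


class SketchAdagrad(optimizer_v2.OptimizerV2):
    """Toy Adagrad-style optimizer implementing only the dense update path."""

    def __init__(self, learning_rate=0.1, name="SketchAdagrad", **kwargs):
        super().__init__(name, **kwargs)
        self._set_hyper("learning_rate", learning_rate)

    def _create_slots(self, var_list):
        # One accumulator slot per variable, as in Adagrad.
        for var in var_list:
            self.add_slot(var, "accumulator", initializer="ones")

    def _resource_apply_dense(self, grad, var, apply_state=None):
        # This override is what the traceback says is missing; without it,
        # apply_gradients falls through to the base-class stub at
        # optimizer_v2.py line 1298. Sparse gradients would additionally
        # need _resource_apply_sparse.
        lr = self._decayed_lr(var.dtype)
        acc = self.get_slot(var, "accumulator")
        acc.assign_add(tf.square(grad))
        return var.assign_sub(lr * grad / tf.sqrt(acc))

    def get_config(self):
        config = super().get_config()
        config["learning_rate"] = self._serialize_hyperparameter("learning_rate")
        return config


# Quick check that minimize() no longer reaches the NotImplementedError stub.
v = tf.Variable([1.0, 2.0])
SketchAdagrad().minimize(lambda: tf.reduce_sum(tf.square(v)), var_list=[v])
```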
