OverflowError: cannot serialize a bytes object larger than 4 GiB (reduction.py) #80
Comments
Man, I stumbled upon the exact same error today. I am trying to deploy a processing container agent on AWS ECR. Do you have a link that prompted you to make this change? It could be related to a particular version of the multiprocessing package.
I found it on some random forum, but I think this is the source: https://docs.python.org/3/library/pickle.html "Protocol version 4 was added in Python 3.4. It adds support for very large objects, pickling more kinds of objects, and some data format optimizations. It is the default protocol starting with Python 3.8. Refer to PEP 3154 for information about improvements brought by protocol 4."
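To make the protocol difference concrete, here is a minimal sketch (the `payload` dict is just a hypothetical stand-in for a large training blob): protocols 3 and below use 32-bit length prefixes for bytes objects, which is where the 4 GiB cap comes from, while protocol 4 uses 64-bit framing.

```python
import pickle

payload = {"weights": b"\x00" * 1024}   # stand-in for a much larger blob

# Protocol 4 (Python 3.4+) frames large bytes objects with 64-bit lengths,
# lifting the 4 GiB cap of protocols <= 3; it is the default from 3.8 on.
blob = pickle.dumps(payload, protocol=4)

assert blob[:2] == b"\x80\x04"          # protocol-4 streams begin with this opcode
assert pickle.loads(blob) == payload    # round-trips intact
```

On Python 3.8+ the explicit `protocol=4` is unnecessary, since it is already the default.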
Ok. I will try upgrading to Python 3.8 and see if that fixes it.
My workaround doesn't work and gives an invalid syntax error. Should I try with Python 3.8?
Yes, I have found this to go away with a Docker image running Python 3.8 on AWS.
I am not sure if this is related, but I've been getting the following error at Epoch 1/2. I tried reducing the batch size and increasing the number of steps per epoch, but to no avail.

Traceback (most recent call last):

Can anyone help me out with this?
How much RAM do you have on your machine?
Hi Jerome, I have 46 GB of RAM.
Turn off multi_processing. When using multiprocessing, it can duplicate your generator in each worker process. This can result in larger memory use (depending on the generator you use). This is an option of the training call.
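The duplication can be seen with a minimal standard-library sketch (assuming a POSIX `fork` start method; `DATA` is a hypothetical stand-in for a generator's backing data). Each worker process gets its own copy, so mutations in a worker never reach the parent, and memory use scales with the number of workers.

```python
import multiprocessing as mp

DATA = list(range(5))  # stand-in for a generator's in-memory backing data

def measure(_):
    DATA.append(99)          # mutates the worker's private copy only
    return len(DATA)

ctx = mp.get_context("fork")     # fork copies the parent's memory into each worker
with ctx.Pool(2) as pool:
    worker_lengths = pool.map(measure, range(2))

print(DATA)  # parent copy unchanged: [0, 1, 2, 3, 4]
```

On Windows, which only supports the `spawn` start method, the data is pickled and sent to each worker instead, which is exactly where the 4 GiB serialization limit of this issue bites.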
That seems to have solved the problem. Thank you!
When starting training on Windows 10 with TensorFlow 2.4.4, the error

OverflowError: cannot serialize a bytes object larger than 4 GiB

appears in multiprocessing/reduction.py (line 60). Not really knowing what I am doing, but the error goes away if I change line 58 to:

def dump(obj, file, protocol=4):
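Editing the installed reduction.py works but is fragile (the change is lost on every Python upgrade or new environment). A sketch of the same workaround applied at runtime, assuming CPython's `multiprocessing.reduction.dump` helper; note it only covers code paths that call `reduction.dump`, so upgrading to Python 3.8, where protocol 4 is the default, is the cleaner fix:

```python
import io
import pickle
import multiprocessing.reduction as reduction

_original_dump = reduction.dump

def dump_protocol_4(obj, file, protocol=None):
    # Ignore the caller's protocol and force 4, which supports > 4 GiB objects
    _original_dump(obj, file, protocol=4)

reduction.dump = dump_protocol_4

# Quick check: the patched dump now emits a protocol-4 stream
buf = io.BytesIO()
reduction.dump({"a": 1}, buf)
assert buf.getvalue()[:2] == b"\x80\x04"
assert pickle.loads(buf.getvalue()) == {"a": 1}
```

Apply the patch before any pools or worker processes are created, since workers spawned earlier keep the original behavior.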