Out of memory when loading a large JSON file. #18

beoran opened this issue Nov 28, 2019 · 4 comments

beoran commented Nov 28, 2019

I have a 941MB JSON file I would like to import into EliasDB, but I get an out-of-memory crash. This is because the importer tries to load the data wholesale instead of incrementally. A way to load a single JSON file incrementally would be great.

krotik (Owner) commented Dec 8, 2019

You are correct. Let me think about that ...

beoran (Author) commented Dec 9, 2019

I think you could switch to a streaming parser for JSON, maybe something like this, if you don't mind the dependency:

https://github.com/francoispqt/gojay

krotik self-assigned this Dec 11, 2019
mladkau (Contributor) commented May 19, 2021

I think what would also work here is newline-delimited JSON (NDJSON), like BigQuery uses ...
