I know it's been over a year, but any luck with this? I think I'm running into a similar issue in my own project, even though I load the file into a buffer manually before calling json_load() on it. Parsing my JSON file consumes about 2.8 GB, and even though I free the buffer and call json_decref() on the JSON root once I'm done (which succeeds), most of the memory is still unaccounted for; I presume it's being held by the load function.
Hi Team,
I'm seeing some runtime leaks (not visible in valgrind) when parsing a large JSON file with 10K-20K entries in an array.
I'm using jansson version 2.14.
Sample code:
The json file looks something like this:
When checked with other memory-debugging tools, leaks like the following were reported:
Thanks,