Hi,
A crash caused by stack exhaustion while processing a JSON file was found. It affects at least version 1.5 as well as the latest git revision. To reproduce:
$ gdb -tty=/dev/null --args jq . qcufnzxcnp.json.4167733746247029131
...
Program received signal SIGSEGV, Segmentation fault.
0x00007ffff47fa7c2 in _IO_new_file_overflow (f=0x7ffff4b3f400 <_IO_2_1_stdout_>, ch=-1) at fileops.c:824
824 fileops.c: No such file or directory.
(gdb) bt 10
#0 0x00007ffff47fa7c2 in _IO_new_file_overflow (f=0x7ffff4b3f400 <_IO_2_1_stdout_>, ch=-1) at fileops.c:824
#1 0x00007ffff47f96a1 in _IO_new_file_xsputn (f=0x7ffff4b3f400 <_IO_2_1_stdout_>, data=<optimized out>, n=1) at fileops.c:1332
#2 0x00007ffff47eee6d in __GI__IO_fwrite (buf=<optimized out>, size=1, count=1, fp=0x7ffff4b3f400 <_IO_2_1_stdout_>) at iofwrite.c:43
#3 0x0000000000428943 in put_buf (s=0x7fffff7ff0cc " \177", len=1, fout=0x7ffff4b3f400 <_IO_2_1_stdout_>, strout=0x0, is_tty=0) at src/jv_print.c:41
#4 0x000000000042897c in put_char (c=32 ' ', fout=0x7ffff4b3f400 <_IO_2_1_stdout_>, strout=0x0, T=0) at src/jv_print.c:47
#5 0x0000000000428ab1 in put_indent (n=29145, flags=513, fout=0x7ffff4b3f400 <_IO_2_1_stdout_>, strout=0x0, T=0) at src/jv_print.c:61
#6 0x000000000042983e in jv_dump_term (C=0x7fffffffdf60, x=..., flags=513, indent=16374, F=0x7ffff4b3f400 <_IO_2_1_stdout_>, S=0x0)
at src/jv_print.c:204
#7 0x0000000000429977 in jv_dump_term (C=0x7fffffffdf60, x=..., flags=513, indent=16373, F=0x7ffff4b3f400 <_IO_2_1_stdout_>, S=0x0)
at src/jv_print.c:215
#8 0x0000000000429977 in jv_dump_term (C=0x7fffffffdf60, x=..., flags=513, indent=16372, F=0x7ffff4b3f400 <_IO_2_1_stdout_>, S=0x0)
at src/jv_print.c:215
#9 0x0000000000429977 in jv_dump_term (C=0x7fffffffdf60, x=..., flags=513, indent=16371, F=0x7ffff4b3f400 <_IO_2_1_stdout_>, S=0x0)
at src/jv_print.c:215
(More stack frames follow...)
A JSON file to reproduce the crash is attached here.
Regards,
Gustavo.
Actually, this does not appear to be a parsing error, which I think is good news:
$ jq length qcufnzxcnp.json
31
This is simply running out of stack space in jv_dump_term(). I was able to increase the recursion depth reached by 11,000 frames or so by dynamically allocating variables instead of using stack space, but at the end of the day you can work around this (for this test case) by doing:
# ulimit -s 32768
Of course, a much larger (deeper) JSON file will still make it crash. You can thus make it happen far sooner by doing:
# ulimit -s 256
The crash does not seem to happen due to any buffer overflows or memory corruption; it's simply the kernel saying "you're out of stack space, please die". One simple way to avoid the crash itself is to do a depth check of the tree, compare it to a value derived from the stack size returned by getrlimit(), and produce an error if the values are way out of whack.
A more complicated solution would involve removing recursion from jv_dump_term(), which looks like it would not be a trivial fix.
This issue is referenced as CVE-2016-4074.
If I understand correctly, this was fixed by 83e2cf6. Suggest closing.
Good point. Closed.
By all indications this problem is not solved. It still appears on the Common Vulnerabilities and Exposures (CVE) list.
@taneishamitchell - how do you figure it's not solved? Just tested this with the provided JSON above and it didn't crash.
bash-5.0# jq . qcufnzxcnp.json.4167733746247029131
parse error: Exceeds depth limit for parsing at line 7, column 257
bash-5.0# jq --version
jq-master-v20200428-28-g864c859e9d