Using integer values above (2^53)-1 in JSON is not robust.
Where we have model fields corresponding to big integers, we should ensure that the corresponding serializer classes have appropriate maximum values. - Revised.
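To make the hazard concrete, here is a small sketch. Python's float is an IEEE-754 double, the same representation a JavaScript client uses for every JSON number, so the precision loss can be demonstrated directly:

```python
# JS parses all JSON numbers as IEEE-754 doubles; Python's float
# uses the same representation, so we can show the collapse here.
SAFE_MAX = 2**53 - 1

assert float(SAFE_MAX) == SAFE_MAX       # exact at the boundary
assert float(2**53 + 1) == float(2**53)  # two distinct ints, one double
```

Any id above `SAFE_MAX` can therefore silently change value when round-tripped through a JS client.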
That's fine, as long as you also remove (or adjust) the hardcoded mapping between AutoField and integers in rest_framework.schemas.get_path_fields.
Aside: Presumably this is based on a real case that you've hit, with more than 9 million billion records?
It's less obvious what we'd do with the path fields. They would work just fine; it's just that the id fields in the resulting data would lose precision (assuming a JS client).
Revised option (this makes even more sense on reflection): make no changes in the serializers, but ensure that the JSON encoder checks that integers are within the interop range specified in https://tools.ietf.org/html/rfc7159, and raises a hard error when passed integer values outside that range.
Note that when such software is used, numbers that are integers and are in the range [-(2**53)+1, (2**53)-1] are interoperable in the sense that implementations will agree exactly on their numeric values.
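A minimal sketch of what that hard-error check could look like, assuming a pre-serialization walk over the data. The function names are illustrative, not DRF's actual encoder API:

```python
import json

# RFC 7159 interop range for integers.
MAX_SAFE = 2**53 - 1
MIN_SAFE = -(2**53) + 1

def check_interop(obj):
    """Recursively reject integers outside the interop range."""
    if isinstance(obj, bool):       # bool is an int subclass; skip it
        return
    if isinstance(obj, int) and not (MIN_SAFE <= obj <= MAX_SAFE):
        raise ValueError(f"Integer {obj} exceeds the JSON interop range")
    if isinstance(obj, dict):
        for value in obj.values():
            check_interop(value)
    elif isinstance(obj, (list, tuple)):
        for value in obj:
            check_interop(value)

def dumps_strict(obj):
    """json.dumps that fails hard instead of emitting unsafe integers."""
    check_interop(obj)
    return json.dumps(obj)
```

With this in place, an out-of-range id raises a `ValueError` at serialization time rather than corrupting data in the client.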
Background: I have to deal with a database with big integer ids, and I have to serialize them as strings so they can be handled correctly in JavaScript.
Generally the problem is not in the framework and serializers, where I can adjust the serialization process to my needs, but in the API documentation (maybe I haven't said it clearly before). #5011 solves the problem and I can't figure out another solution.
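The strings-for-big-ids workaround described above can be sketched as follows; the record shape and field names here are purely illustrative:

```python
import json

def encode_big_ids(record, fields=("id",)):
    """Render selected integer fields as strings so a JS client
    keeps full precision when parsing the JSON payload."""
    return {
        key: (str(value) if key in fields and isinstance(value, int) else value)
        for key, value in record.items()
    }

payload = json.dumps(encode_big_ids({"id": 2**60, "name": "x"}))
```

The client then treats the id as an opaque string token, which survives the round trip exactly.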
We should also consider putting a min_value/max_value on allowable integer ranges in schemas.
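As a standalone illustration of such bounds (DRF's IntegerField accepts similar `min_value`/`max_value` keyword arguments, but this sketch avoids any framework dependency; the function name is hypothetical):

```python
# Default bounds match the RFC 7159 interop range.
JSON_MAX = 2**53 - 1
JSON_MIN = -(2**53) + 1

def validate_bounded_int(value, min_value=JSON_MIN, max_value=JSON_MAX):
    """Reject integers outside the allowed schema range."""
    if not (min_value <= value <= max_value):
        raise ValueError(f"{value} is outside [{min_value}, {max_value}]")
    return value
```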
Closing as per #6718