When returning a single JSON string of about 3.5 MB from a controller action (return type `byte[]`), the `System.ArgumentException` shown below is thrown in my ASP.NET Core 3.0 (Preview 8) API project.
Looking through the docs, there seems to be no option to adjust any limit regarding this issue.
```c#
[HttpPost]
public async Task<ActionResult<byte[]>> Post()
{
    var largeArray = new byte[(int)(3.5 * 1024 * 1024)];
    return largeArray;
}
```
Exception:
```
System.ArgumentException
  HResult=0x80070057
  Message=The JSON value of length 3770846 is too large and not supported.
  Source=System.Text.Json
  StackTrace:
   at System.Text.Json.ThrowHelper.ThrowArgumentException_ValueTooLarge(Int32 tokenLength)
```
Call stack:
```
System.Text.Json.dll!System.Text.Json.ThrowHelper.ThrowArgumentException_ValueTooLarge(int tokenLength)
System.Text.Json.dll!System.Text.Json.Utf8JsonWriter.WriteBase64StringValue(System.ReadOnlySpan<byte> bytes)
System.Text.Json.dll!System.Text.Json.Serialization.Converters.JsonConverterByteArray.Write(System.Text.Json.Utf8JsonWriter writer, byte[] value, System.Text.Json.JsonSerializerOptions options)
System.Text.Json.dll!System.Text.Json.JsonPropertyInfoNotNullable
System.Text.Json.dll!System.Text.Json.JsonPropertyInfo.Write(ref System.Text.Json.WriteStack state, System.Text.Json.Utf8JsonWriter writer)
System.Text.Json.dll!System.Text.Json.JsonSerializer.Write(System.Text.Json.Utf8JsonWriter writer, int originalWriterDepth, int flushThreshold, System.Text.Json.JsonSerializerOptions options, ref System.Text.Json.WriteStack state)
System.Text.Json.dll!System.Text.Json.JsonSerializer.WriteAsyncCore(System.IO.Stream utf8Json, object value, System.Type type, System.Text.Json.JsonSerializerOptions options, System.Threading.CancellationToken cancellationToken)
System.Text.Json.dll!System.Text.Json.JsonSerializer.SerializeAsync(System.IO.Stream utf8Json, object value, System.Type type, System.Text.Json.JsonSerializerOptions options, System.Threading.CancellationToken cancellationToken)
Microsoft.AspNetCore.Mvc.Core.dll!Microsoft.AspNetCore.Mvc.Formatters.SystemTextJsonOutputFormatter.WriteResponseBodyAsync(Microsoft.AspNetCore.Mvc.Formatters.OutputFormatterWriteContext context, System.Text.Encoding selectedEncoding)
Microsoft.AspNetCore.Mvc.Core.dll!Microsoft.AspNetCore.Mvc.Formatters.TextOutputFormatter.WriteAsync(Microsoft.AspNetCore.Mvc.Formatters.OutputFormatterWriteContext context)
```
There seems to be a fixed max length constant `JsonConstants.MaxBase46ValueTokenSize` (btw, typo in the constant's name), set to 125 KB, which `JsonWriterHelper.ValidateBytes()` checks against.
Of course I can change the response type to `text/plain` and return it as such. Is that the intention of the hard-coded 125 KB limit, or what's the reason for that?
This should probably be made configurable.
The limit is based on a maximum buffer size of 1_000_000_000, with an assumed max escaping expansion factor of 6x and a max Base64 expansion factor of 4/3 (≈1.33x), for a total max factor of 8x; so a value of 125_000_000 characters could expand into a buffer of 1_000_000_000.
However, there is a bug with the const here. It should resolve to 125_000_000, but instead it is only 2_604_166, which causes the error you see with your value of length 3_770_846.
The const is defined as
```c#
public const int MaxBase46ValueTokenSize = (MaxEscapedTokenSize >> 2 * 3) / MaxExpansionFactorWhileEscaping;
```
but should be
```c#
public const int MaxBase64ValueTokenSize = ((MaxEscapedTokenSize >> 2) * 3) / MaxExpansionFactorWhileEscaping;
```
or
```c#
public const int MaxBase64ValueTokenSize = (MaxEscapedTokenSize / 4 * 3) / MaxExpansionFactorWhileEscaping;
```
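For clarity, the root cause is operator precedence: `*` binds tighter than `>>`, so `MaxEscapedTokenSize >> 2 * 3` parses as `MaxEscapedTokenSize >> 6` rather than `(MaxEscapedTokenSize >> 2) * 3`. A minimal sketch of the arithmetic, using the constant values quoted in this thread (assumed for illustration, not copied from the actual source):
```c#
// Values as quoted in this thread (assumptions for illustration).
const int MaxEscapedTokenSize = 1_000_000_000;
const int MaxExpansionFactorWhileEscaping = 6;

// Buggy: ">> 2 * 3" is ">> (2 * 3)" because * binds tighter than >>.
const int Buggy = (MaxEscapedTokenSize >> 2 * 3) / MaxExpansionFactorWhileEscaping;
// = (1_000_000_000 >> 6) / 6 = 15_625_000 / 6 = 2_604_166

// Intended: shift by 2 (divide by 4), then multiply by 3 (the 3/4 Base64 input-to-output ratio).
const int Intended = ((MaxEscapedTokenSize >> 2) * 3) / MaxExpansionFactorWhileEscaping;
// = (250_000_000 * 3) / 6 = 750_000_000 / 6 = 125_000_000
```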
Simple repro:
```c#
using System.Text;
using System.Text.Json;

var buffer = new System.Buffers.ArrayBufferWriter<byte>();
var writer = new Utf8JsonWriter(buffer);
writer.WriteStartArray();
byte[] bytes = Encoding.UTF8.GetBytes(new String('a', 2_604_167));
writer.WriteBase64StringValue(bytes); // throws ArgumentException: value too large
```
cc @ahsonkhan
Of course I can change the response type to `text/plain` and return it as such. Is that the intention of the hard-coded 125 KB limit, or what's the reason for that?
Based on how we currently write tokens, there is a reason for a hard-coded upper limit. However, as @steveharter pointed out, the limit was set much lower than intended, and it should be in the ~100s of MB for a single token (which no one should really be hitting up against - at least not in the common scenarios I have seen).
This should probably be made configurable.
Since we effectively allow any reasonable size to go through, I don't know if an option to configure this is necessary. I haven't seen or heard any feedback on trying to control max token size (unlike some of the other configurations we provide).
I agree that the correct size of 125 MB is much more reasonable. If you're returning a single JSON string larger than that, you should probably consider other options anyway because of size and encoding/decoding overhead.
but should be
```c#
public const int MaxBase64ValueTokenSize = ((MaxEscapedTokenSize >> 2) * 3) / MaxExpansionFactorWhileEscaping;
```
Yep, that's exactly the issue. Good catch!
@lauxjpn - is it feasible for your scenario to return `Task<ActionResult<string>>`, where the string is the Base64-encoded value of the `byte[]`, as a workaround to this issue (for 3.0)?
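A minimal sketch of that workaround, assuming a controller shaped like the one in the original post (the controller and action names are illustrative, not from the thread):
```c#
using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("[controller]")]
public class PayloadController : ControllerBase
{
    [HttpPost]
    public Task<ActionResult<string>> Post()
    {
        var largeArray = new byte[(int)(3.5 * 1024 * 1024)]; // stand-in for the real payload
        // Base64-encode up front, so System.Text.Json only has to write an ordinary string token.
        return Task.FromResult<ActionResult<string>>(Convert.ToBase64String(largeArray));
    }
}
```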
This should probably be made configurable.
Since we effectively allow any reasonable size to go through, I don't know if an option to configure this is necessary. I haven't seen or heard any feedback on trying to control max token size (unlike some of the other configurations we provide).
Agreed, 125 MB is more than enough for anything sane :). It wasn't clear that 125 KB was unintentional.
@ahsonkhan That is actually how I ended up implementing it for now (with MIME type `text/plain`). It is a cleaner approach for a medium-sized single value anyway, so for me personally, it just forced me to implement a better design.
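For reference, a minimal sketch of that kind of `text/plain` response (an action inside the same kind of controller; the names are illustrative), using `ControllerBase.Content()` to bypass the JSON output formatter:
```c#
[HttpPost]
public ContentResult PostAsText()
{
    var largeArray = new byte[(int)(3.5 * 1024 * 1024)]; // stand-in for the real payload
    // Returning the Base64 text as text/plain avoids the System.Text.Json writer entirely.
    return Content(Convert.ToBase64String(largeArray), "text/plain");
}
```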
I was just surprised by such a small size limit in a public API that is intended to encode `byte[]` arrays.
Though I guess this scenario is not uncommon and will become an issue once `System.Text.Json` is released to the public as a performance-optimized replacement for JSON.NET.
Re-opening until it's fixed in 3.0 - depending on approval.
I think 125MB can be quite tight. One has to be careful calling such cases insane. What exactly is insane about that? Sometimes, bulk data returns are a quick and easy solution that totally works. No need to invest work into devising some streaming solution, complicated format or out-of-band storage.
I think 125MB can be quite tight.
For what use case?
Sometimes, bulk data returns are a quick and easy solution that totally works. No need to invest work into devising some streaming solution, complicated format or out-of-band storage.
@GSPP - this issue primarily highlighted a bug in which the intended limit was calculated incorrectly, due to a typo, which we fixed. The way the `Utf8JsonWriter` is implemented today ends up imposing that limit, mainly to optimize the common cases people hit, and JSON token sizes > 125 MB are quite uncommon (I haven't heard any requests for supporting that so far). I can imagine a theoretical use case for needing to write large data as JSON, but it would be good to have an actual usage pattern to motivate changing it. If you have a concrete/real-world scenario that would be blocked by this limit, please feel free to file a new issue and we can re-evaluate how the `Utf8JsonWriter` currently writes data to the underlying output sink and consider the perf trade-offs for enabling that scenario.
This issue was fixed in 3.0 by https://github.com/dotnet/corefx/pull/40796.
Hence, closing.
It might be a good idea to update the docs (at least for Utf8JsonWriter.WriteBase64String) to mention the 125 MB limit. At the moment, the exception is listed as below:
The specified property name is too large.
Also, the message is a bit misleading and could be made clearer:
The value of the specified property name is too large.
The limit itself could be mentioned in the remarks section, so people have a basic idea about when this method could throw later on in production.
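As an illustration of when this could bite in production, a caller today effectively has to guard the payload size themselves; in this sketch the threshold constant is just the intended limit discussed above (an assumption for illustration, not an officially documented API value):
```c#
using System;
using System.Buffers;
using System.Text.Json;

// Intended limit per the discussion above (assumed; not an officially documented constant).
const int AssumedMaxBase64InputBytes = 125_000_000;

byte[] payload = new byte[3_770_846]; // same size as in the original exception message

var buffer = new ArrayBufferWriter<byte>();
using var writer = new Utf8JsonWriter(buffer);

writer.WriteStartArray();
if (payload.Length <= AssumedMaxBase64InputBytes)
{
    writer.WriteBase64StringValue(payload);
}
else
{
    // Too large for a single Base64 token; fall back to another transport (e.g. text/plain or chunking).
    throw new InvalidOperationException("Payload too large for a single JSON Base64 value.");
}
writer.WriteEndArray();
writer.Flush();
```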
That's a great idea @lauxjpn - would you be willing to submit a PR to update the docs?
For instance, to update the WriteBase64String APIs from https://docs.microsoft.com/en-us/dotnet/api/system.text.json.utf8jsonwriter.writebase64string?view=netcore-3.0, submit the change to the following (along with the overloads):
https://github.com/dotnet/dotnet-api-docs/blob/8d7d21ebf242c7cd64ebd9a83b9c49ff8c72aabf/xml/System.Text.Json/Utf8JsonWriter.xml#L430-L465
The value of the specified property name is too large.
That is partially correct. We would throw `ArgumentException` in both cases: when the specified property name is too large, and when the specified value is too large.
We should update the docs to mention both cases, similar to the WriteString() APIs:
https://docs.microsoft.com/en-us/dotnet/api/system.text.json.utf8jsonwriter.writestring?view=netcore-3.0#System_Text_Json_Utf8JsonWriter_WriteString_System_String_System_String_
The specified property name or value is too large.
I'll get on to it.
This restriction is not implemented in Json.Net (Newtonsoft), so why should it be here?