Aspnetcore: HTTP/2: Rare data corruption when stress testing against Kestrel

Created on 23 Aug 2019 · 15 comments · Source: dotnet/aspnetcore

So we found one instance of data corruption and it's definitely occurring on the server (Kestrel). Our current working hypothesis is that there is an off-by-one error when leasing memory out in our ConcurrentPipeWriter. What we see is a request with a large response getting its first byte corrupted with a value that was previously written. One of the running tests sends 1-byte data frames from the client to the server and vice versa. Because these tests run in parallel, we believe one byte is corrupting the memory of the other response before we write it to the StreamPipeWriter.
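To make the hypothesis concrete, here is a purely illustrative sketch (not Kestrel's actual code) of how an off-by-one in buffer-offset accounting while leasing memory could let a 1-byte write from one stream land on the first byte of another stream's response:

```csharp
using System;

// Toy "leaser" with the kind of off-by-one bug we suspect; purely illustrative.
class BuggyLeaser
{
    private readonly byte[] _buffer = new byte[256];
    private int _offset;

    public Memory<byte> Lease(int size)
    {
        var lease = _buffer.AsMemory(_offset, size);
        _offset += size - 1; // off-by-one: the next lease overlaps the last byte of this one
        return lease;
    }
}

class Program
{
    static void Main()
    {
        var leaser = new BuggyLeaser();

        Memory<byte> responseA = leaser.Lease(4);
        Memory<byte> responseB = leaser.Lease(4);

        responseB.Span.Fill((byte)'8');   // stream B writes its response, first byte '8'...
        responseA.Span[3] = (byte)':';    // ...then stream A's 1-byte write lands on the same slot

        // B's first byte is now ':' instead of '8', mirroring the corruption seen in the logs.
        Console.WriteLine((char)responseB.Span[0]);
    }
}
```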

I'm bashing my head against this code trying to figure out how this can happen. I may try writing a test that has two streams writing to the response at the same time, verifying their responses stay in order. However, considering this repros only once every ~3 hours of running a stress suite, I doubt that will really help. My current plan is to custom-build Kestrel and add more logging to the ConcurrentPipeWriter.
cc @halter73 @Tratcher

area-servers

All 15 comments

Might be unrelated but

ConcurrentPipeWriter calls .GetAwaiter().GetResult() on a ValueTask and then returns that same ValueTask for something else to await:
https://github.com/aspnet/AspNetCore/blob/f0029227cf5ed4f13e2b5ce41c44cbfa763f79a5/src/Servers/Kestrel/Core/src/Internal/Infrastructure/PipeWriterHelpers/ConcurrentPipeWriter.cs#L119-L124

Not sure that's valid?
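For context, a ValueTask may be backed by a pooled IValueTaskSource and only allows a single consumption: await it once, or call .GetAwaiter().GetResult() on it once, but not both. A minimal sketch of the problematic shape (hypothetical names, not the actual Kestrel code):

```csharp
using System.IO.Pipelines;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical shape of the concern, not the actual Kestrel code.
static class ValueTaskMisuseSketch
{
    public static async ValueTask<FlushResult> FlushAsyncBroken(PipeWriter inner, CancellationToken token)
    {
        ValueTask<FlushResult> flushTask = inner.FlushAsync(token);

        // First consumption: synchronously drains the ValueTask
        // (itself only safe if the ValueTask has already completed).
        _ = flushTask.GetAwaiter().GetResult();

        // Second consumption: awaiting the same ValueTask again. With a pooled
        // IValueTaskSource this can observe a recycled token and throw, or read
        // state that now belongs to a different operation.
        return await flushTask;
    }
}
```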

Http2OutputProducer stores a ValueTask<FlushResult>
https://github.com/aspnet/AspNetCore/blob/f0029227cf5ed4f13e2b5ce41c44cbfa763f79a5/src/Servers/Kestrel/Core/src/Internal/Http2/Http2OutputProducer.cs#L35

which is created from the async method async ValueTask<FlushResult> ProcessDataWrites()
https://github.com/aspnet/AspNetCore/blob/f0029227cf5ed4f13e2b5ce41c44cbfa763f79a5/src/Servers/Kestrel/Core/src/Internal/Http2/Http2OutputProducer.cs#L67

and then returned by WriteStreamSuffixAsync
https://github.com/aspnet/AspNetCore/blob/f0029227cf5ed4f13e2b5ce41c44cbfa763f79a5/src/Servers/Kestrel/Core/src/Internal/Http2/Http2OutputProducer.cs#L204-L219

Is this method called once and only once per instance, and is its result awaited only once?

/cc @halter73 @stephentoub

Added PR for the Http2OutputProducer part https://github.com/aspnet/AspNetCore/pull/13398

Added PR for the ConcurrentPipeWriter part https://github.com/aspnet/AspNetCore/pull/13399

Also, do the XxxUnsynchronized methods in ConcurrentPipeWriter.FlushAsync need to be in a lock?

Not sure that's valid?

It's not. Depending on the implementation of _innerPipeWriter.FlushAsync, it might happen to work, but it's not guaranteed to.
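For illustration, one way to keep a synchronous fast path while still consuming the ValueTask only once is to extract the result and hand callers a fresh ValueTask built from it. This is only a sketch of the general pattern, not necessarily what the PR does:

```csharp
using System.IO.Pipelines;
using System.Threading;
using System.Threading.Tasks;

static class SingleConsumptionSketch
{
    public static ValueTask<FlushResult> FlushOnce(PipeWriter inner, CancellationToken token)
    {
        ValueTask<FlushResult> flushTask = inner.FlushAsync(token);

        if (flushTask.IsCompletedSuccessfully)
        {
            // Single consumption via GetResult(), then wrap the value in a new ValueTask.
            FlushResult result = flushTask.GetAwaiter().GetResult();
            return new ValueTask<FlushResult>(result);
        }

        // Not completed yet: let the caller await the original ValueTask
        // (that await is its one and only consumption).
        return flushTask;
    }
}
```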

Closed the ConcurrentPipeWriter PR as I'm not entirely sure what it's doing.

These two identical _currentFlushTcs != null checks are a bit weird, though. Are they guarding against some data race?

https://github.com/aspnet/AspNetCore/blob/f0029227cf5ed4f13e2b5ce41c44cbfa763f79a5/src/Servers/Kestrel/Core/src/Internal/Infrastructure/PipeWriterHelpers/ConcurrentPipeWriter.cs#L105-L119

Let me just brain-dump what we are seeing, since the more visibility the better. I'm currently running the HTTP/2 stress tests in the CoreFx repo https://github.com/dotnet/corefx/tree/master/src/System.Net.Http/tests/StressTests/HttpStress, running every test in parallel on a Linux VM. Eventually, I hit a failure in the GET Parameters test. This failure only repros when multiple streams are active at the same time.

Expected response content "8wDZb6Ozh6yPTLO9oVW4lTmQtTcfNSJ5YO6bnoA0QASsUIte6QmJDoOe9nf7TibEntr5EmMzLWdb1tvbayZxf0sk0SRz92XhYD5bRnPJ3nxgAmcFr4XV9eZm064mKNqIzv2ooK9mYFts2BZDpC75V1YudIaKjdzgwfhEjoRXulxtkhkpXqx8FlmxN8AWkp3uGDvMTLT1laAk4fAwXknsgqQBskrbdun5h2cWlNMBMwNfa2shxEyz1AEPnjUjUsNqm5222dA9ObVkTkbps9dPvn3XnCj2tdhUYWtU0pfK3obQfUw27erKEyuy2VEqu8Glu0Bwr7p2UJL7CcrJGguFmMFacuRVboQuYpWeJzB0ooqvcnmQdvaDE0zXkbKxgKQftRYM6C0GVuQxVKxp81rgECfXqythfEJUOB2AEPJQuJhKZ21QdVofSkQgl0V7JIaHawZKImaihRluULyowT4APe1dlLPDtB7cjQkdSh7qu5x1zVBQ2hBpEQsvvvZwNMpVJ9B4bLLFMDFJhzdFAzBC0INJXohX8ASYGSiqK6ohLvT0AYEvIJmOBdU1lsWmdo6Bkg2FHnv3ATq5oN8zalFoN4Ndxg5Ra24WoIFTWzeUOjEhmPCNkXBrZIngJPHbiTS7PJGW7eSb7kQeqfqXThh2qCxWUZ96h3muG6IS9gF54ajpr9qeZj70mQHCMj1YUMu4xWNkMNOnRbxGT8fLjVXxvl7qo66QehaiffrKl6HBLKZAS0jE3gpypw8BOXvGGy8k0F", got ":wDZb6Ozh6yPTLO9oVW4lTmQtTcfNSJ5YO6bnoA0QASsUIte6QmJDoOe9nf7TibEntr5EmMzLWdb1tvbayZxf0sk0SRz92XhYD5bRnPJ3nxgAmcFr4XV9eZm064mKNqIzv2ooK9mYFts2BZDpC75V1YudIaKjdzgwfhEjoRXulxtkhkpXqx8FlmxN8AWkp3uGDvMTLT1laAk4fAwXknsgqQBskrbdun5h2cWlNMBMwNfa2shxEyz1AEPnjUjUsNqm5222dA9ObVkTkbps9dPvn3XnCj2tdhUYWtU0pfK3obQfUw27erKEyuy2VEqu8Glu0Bwr7p2UJL7CcrJGguFmMFacuRVboQuYpWeJzB0ooqvcnmQdvaDE0zXkbKxgKQftRYM6C0GVuQxVKxp81rgECfXqythfEJUOB2AEPJQuJhKZ21QdVofSkQgl0V7JIaHawZKImaihRluULyowT4APe1dlLPDtB7cjQkdSh7qu5x1zVBQ2hBpEQsvvvZwNMpVJ9B4bLLFMDFJhzdFAzBC0INJXohX8ASYGSiqK6ohLvT0AYEvIJmOBdU1lsWmdo6Bkg2FHnv3ATq5oN8zalFoN4Ndxg5Ra24WoIFTWzeUOjEhmPCNkXBrZIngJPHbiTS7PJGW7eSb7kQeqfqXThh2qCxWUZ96h3muG6IS9gF54ajpr9qeZj70mQHCMj1YUMu4xWNkMNOnRbxGT8fLjVXxvl7qo66QehaiffrKl6HBLKZAS0jE3gpypw8BOXvGGy8k0F".
 Diverging at index 0. Uri: /variables?Var0=8wDZb6Ozh6yPTLO9oVW4lTmQtTcfNSJ5YO6bnoA0QASsUIte6QmJDoOe9nf7TibEntr5EmMzLWdb1tvbayZxf0sk0SRz92XhYD5bRnPJ3nxgAmcFr4XV9eZm064mKNqIzv2ooK9mYFts2BZDpC75V1YudIaKjdzgwfhEjoRXulxtkhkpXqx8FlmxN8AWkp3uGDvMTLT1laAk4fAwXknsgqQBskrbdun5h2cWlNMBMwNfa2shxEyz1AEPnjUjUsNqm5222dA9ObVkTkbps9dPvn3XnCj2tdhUYWtU0pfK3obQfUw27erKEyuy2VEqu8Glu0Bwr7p2UJL7CcrJGguFmMFacuRVboQuYpWeJzB0ooqvcnmQdvaDE0zXkbKxgKQftRYM6C0GVuQxVKxp81rgECfXqythfEJUOB2AEPJQuJhKZ21QdVofSkQgl0V7JIaHawZKImaihRluULyowT4APe1dlLPDtB7cjQkdSh7qu5x1zVBQ2hBpEQsvvvZwNMpVJ9B4bLLFMDFJhzdFAzBC0INJXohX8ASYGSiqK6ohLvT0AYEvIJmOBdU1lsWmdo6Bkg2FHnv3ATq5oN8zalFoN4Ndxg5Ra24WoIFTWzeUOjEhmPCNkXBrZIngJPHbiTS7PJGW7eSb7kQeqfqXThh2qCxWUZ96h3muG6IS9gF54ajpr9qeZj70mQHCMj1YUMu4xWNkMNOnRbxGT8fLjVXxvl7qo66QehaiffrKl6HBLKZAS0jE3gpypw8BOXvGGy8k0F

Afterward, I enabled server-side connection logging, which eventually showed that the corruption is occurring on the server side.
Server starting request:

dbug: Microsoft.AspNetCore.Server.Kestrel.Core.Internal.LoggingConnectionMiddleware[0]
      ReadAsync[817] 00 03 28 01 05 00 08 D0 6F 82 86 01 0E 31 32 37 2E 30 2E 30 2E 31 3A 35 30 30 31 04 7F 93 05 2F 76 61 72 69 61 62 6C 65 73 3F 56 61 72 30 3D 38 77 44 5A 62 36 4F 7A 68 36 79 50 54 4C 4F 39 6F 56 57 34 6C 54 6D 51 74 54 63 66 4E 53 4A 35 59 4F 36 62 6E 6F 41 30 51 41 53 73 55 49 74 65 36 51 6D 4A 44 6F 4F 65 39 6E 66 37 54 69 62 45 6E 74 72 35 45 6D 4D 7A 4C 57 64 62 31 74 76 62 61 79 5A 78 66 30 73 6B 30 53 52 7A 39 32 58 68 59 44 35 62 52 6E 50 4A 33 6E 78 67 41 6D 63 46 72 34 58 56 39 65 5A 6D 30 36 34 6D 4B 4E 71 49 7A 76 32 6F 6F 4B 39 6D 59 46 74 73 32 42 5A 44 70 43 37 35 56 31 59 75 64 49 61 4B 6A 64 7A 67 77 66 68 45 6A 6F 52 58 75 6C 78 74 6B 68 6B 70 58 71 78 38 46 6C 6D 78 4E 38 41 57 6B 70 33 75 47 44 76 4D 54 4C 54 31 6C 61 41 6B 34 66 41 77 58 6B 6E 73 67 71 51 42 73 6B 72 62 64 75 6E 35 68 32 63 57 6C 4E 4D 42 4D 77 4E 66 61 32 73 68 78 45 79 7A 31 41 45 50 6E 6A 55 6A 55 73 4E 71 6D 35 32 32 32 64 41 39 4F 62 56 6B 54 6B 62 70 73 39 64 50 76 6E 33 58 6E 43 6A 32 74 64 68 55 59 57 74 55 30 70 66 4B 33 6F 62 51 66 55 77 32 37 65 72 4B 45 79 75 79 32 56 45 71 75 38 47 6C 75 30 42 77 72 37 70 32 55 4A 4C 37 43 63 72 4A 47 67 75 46 6D 4D 46 61 63 75 52 56 62 6F 51 75 59 70 57 65 4A 7A 42 30 6F 6F 71 76 63 6E 6D 51 64 76 61 44 45 30 7A 58 6B 62 4B 78 67 4B 51 66 74 52 59 4D 36 43 30 47 56 75 51 78 56 4B 78 70 38 31 72 67 45 43 66 58 71 79 74 68 66 45 4A 55 4F 42 32 41 45 50 4A 51 75 4A 68 4B 5A 32 31 51 64 56 6F 66 53 6B 51 67 6C 30 56 37 4A 49 61 48 61 77 5A 4B 49 6D 61 69 68 52 6C 75 55 4C 79 6F 77 54 34 41 50 65 31 64 6C 4C 50 44 74 42 37 63 6A 51 6B 64 53 68 37 71 75 35 78 31 7A 56 42 51 32 68 42 70 45 51 73 76 76 76 5A 77 4E 4D 70 56 4A 39 42 34 62 4C 4C 46 4D 44 46 4A 68 7A 64 46 41 7A 42 43 30 49 4E 4A 58 6F 68 58 38 41 53 59 47 53 69 71 4B 36 6F 68 4C 76 54 30 41 59 45 76 49 4A 6D 4F 42 64 55 31 6C 73 57 6D 64 6F 36 42 6B 67 32 46 48 6E 76 33 41 54 71 35 6F 4E 38 7A 61 6C 46 6F 4E 34 4E 64 78 67 35 52 61 32 34 57 6F 49 46 54 57 7A 65 55 4F 6A 45 68 6D 50 43 4E 6B 58 42 72 5A 49 6E 67 4A 50 48 62 69 54 53 37 50 4A 47 57 37 65 53 62 37 6B 51 65 71 66 71 58 54 68 68 32 71 43 78 57 55 5A 39 36 68 33 6D 75 47 36 49 53 39 67 46 35 34 61 6A 70 72 39 71 65 5A 6A 37 30 6D 51 48 43 4D 6A 31 59 55 4D 75 34 78 57 4E 6B 4D 4E 4F 6E 52 62 78 47 54 38 66 4C 6A 56 58 78 76 6C 37 71 6F 36 36 51 65 68 61 69 66 66 72 4B 6C 36 48 42 4C 4B 5A 41 53 30 6A 45 33 67 70 79 70 77 38 42 4F 58 76 47 47 79 38 6B 30 46
      \x00\x03(\x01\x05\x00\x08脨o\x82\x86\x01\x0E127.0.0.1:5001\x04\x7F\x93\x05/variables?Var0=8wDZb6Ozh6yPTLO9oVW4lTmQtTcfNSJ5YO6bnoA0QASsUIte6QmJDoOe9nf7TibEntr5EmMzLWdb1tvbayZxf0sk0SRz92XhYD5bRnPJ3nxgAmcFr4XV9eZm064mKNqIzv2ooK9mYFts2BZDpC75V1YudIaKjdzgwfhEjoRXulxtkhkpXqx8FlmxN8AWkp3uGDvMTLT1laAk4fAwXknsgqQBskrbdun5h2cWlNMBMwNfa2shxEyz1AEPnjUjUsNqm5222dA9ObVkTkbps9dPvn3XnCj2tdhUYWtU0pfK3obQfUw27erKEyuy2VEqu8Glu0Bwr7p2UJL7CcrJGguFmMFacuRVboQuYpWeJzB0ooqvcnmQdvaDE0zXkbKxgKQftRYM6C0GVuQxVKxp81rgECfXqythfEJUOB2AEPJQuJhKZ21QdVofSkQgl0V7JIaHawZKImaihRluULyowT4APe1dlLPDtB7cjQkdSh7qu5x1zVBQ2hBpEQsvvvZwNMpVJ9B4bLLFMDFJhzdFAzBC0INJXohX8ASYGSiqK6ohLvT0AYEvIJmOBdU1lsWmdo6Bkg2FHnv3ATq5oN8zalFoN4Ndxg5Ra24WoIFTWzeUOjEhmPCNkXBrZIngJPHbiTS7PJGW7eSb7kQeqfqXThh2qCxWUZ96h3muG6IS9gF54ajpr9qeZj70mQHCMj1YUMu4xWNkMNOnRbxGT8fLjVXxvl7qo66QehaiffrKl6HBLKZAS0jE3gpypw8BOXvGGy8k0F
info: Microsoft.AspNetCore.Hosting.Diagnostics[1]
      Request starting HTTP/2 GET http://127.0.0.1:5001/variables?Var0=8wDZb6Ozh6yPTLO9oVW4lTmQtTcfNSJ5YO6bnoA0QASsUIte6QmJDoOe9nf7TibEntr5EmMzLWdb1tvbayZxf0sk0SRz92XhYD5bRnPJ3nxgAmcFr4XV9eZm064mKNqIzv2ooK9mYFts2BZDpC75V1YudIaKjdzgwfhEjoRXulxtkhkpXqx8FlmxN8AWkp3uGDvMTLT1laAk4fAwXknsgqQBskrbdun5h2cWlNMBMwNfa2shxEyz1AEPnjUjUsNqm5222dA9ObVkTkbps9dPvn3XnCj2tdhUYWtU0pfK3obQfUw27erKEyuy2VEqu8Glu0Bwr7p2UJL7CcrJGguFmMFacuRVboQuYpWeJzB0ooqvcnmQdvaDE0zXkbKxgKQftRYM6C0GVuQxVKxp81rgECfXqythfEJUOB2AEPJQuJhKZ21QdVofSkQgl0V7JIaHawZKImaihRluULyowT4APe1dlLPDtB7cjQkdSh7qu5x1zVBQ2hBpEQsvvvZwNMpVJ9B4bLLFMDFJhzdFAzBC0INJXohX8ASYGSiqK6ohLvT0AYEvIJmOBdU1lsWmdo6Bkg2FHnv3ATq5oN8zalFoN4Ndxg5Ra24WoIFTWzeUOjEhmPCNkXBrZIngJPHbiTS7PJGW7eSb7kQeqfqXThh2qCxWUZ96h3muG6IS9gF54ajpr9qeZj70mQHCMj1YUMu4xWNkMNOnRbxGT8fLjVXxvl7qo66QehaiffrKl6HBLKZAS0jE3gpypw8BOXvGGy8k0F

which isn't corrupted, while:

dbug: Microsoft.AspNetCore.Server.Kestrel.Core.Internal.LoggingConnectionMiddleware[0]
      WriteAsync[779] 00 03 02 00 00 00 08 D0 6F 3A 77 44 5A 62 36 4F 7A 68 36 79 50 54 4C 4F 39 6F 56 57 34 6C 54 6D 51 74 54 63 66 4E 53 4A 35 59 4F 36 62 6E 6F 41 30 51 41 53 73 55 49 74 65 36 51 6D 4A 44 6F 4F 65 39 6E 66 37 54 69 62 45 6E 74 72 35 45 6D 4D 7A 4C 57 64 62 31 74 76 62 61 79 5A 78 66 30 73 6B 30 53 52 7A 39 32 58 68 59 44 35 62 52 6E 50 4A 33 6E 78 67 41 6D 63 46 72 34 58 56 39 65 5A 6D 30 36 34 6D 4B 4E 71 49 7A 76 32 6F 6F 4B 39 6D 59 46 74 73 32 42 5A 44 70 43 37 35 56 31 59 75 64 49 61 4B 6A 64 7A 67 77 66 68 45 6A 6F 52 58 75 6C 78 74 6B 68 6B 70 58 71 78 38 46 6C 6D 78 4E 38 41 57 6B 70 33 75 47 44 76 4D 54 4C 54 31 6C 61 41 6B 34 66 41 77 58 6B 6E 73 67 71 51 42 73 6B 72 62 64 75 6E 35 68 32 63 57 6C 4E 4D 42 4D 77 4E 66 61 32 73 68 78 45 79 7A 31 41 45 50 6E 6A 55 6A 55 73 4E 71 6D 35 32 32 32 64 41 39 4F 62 56 6B 54 6B 62 70 73 39 64 50 76 6E 33 58 6E 43 6A 32 74 64 68 55 59 57 74 55 30 70 66 4B 33 6F 62 51 66 55 77 32 37 65 72 4B 45 79 75 79 32 56 45 71 75 38 47 6C 75 30 42 77 72 37 70 32 55 4A 4C 37 43 63 72 4A 47 67 75 46 6D 4D 46 61 63 75 52 56 62 6F 51 75 59 70 57 65 4A 7A 42 30 6F 6F 71 76 63 6E 6D 51 64 76 61 44 45 30 7A 58 6B 62 4B 78 67 4B 51 66 74 52 59 4D 36 43 30 47 56 75 51 78 56 4B 78 70 38 31 72 67 45 43 66 58 71 79 74 68 66 45 4A 55 4F 42 32 41 45 50 4A 51 75 4A 68 4B 5A 32 31 51 64 56 6F 66 53 6B 51 67 6C 30 56 37 4A 49 61 48 61 77 5A 4B 49 6D 61 69 68 52 6C 75 55 4C 79 6F 77 54 34 41 50 65 31 64 6C 4C 50 44 74 42 37 63 6A 51 6B 64 53 68 37 71 75 35 78 31 7A 56 42 51 32 68 42 70 45 51 73 76 76 76 5A 77 4E 4D 70 56 4A 39 42 34 62 4C 4C 46 4D 44 46 4A 68 7A 64 46 41 7A 42 43 30 49 4E 4A 58 6F 68 58 38 41 53 59 47 53 69 71 4B 36 6F 68 4C 76 54 30 41 59 45 76 49 4A 6D 4F 42 64 55 31 6C 73 57 6D 64 6F 36 42 6B 67 32 46 48 6E 76 33 41 54 71 35 6F 4E 38 7A 61 6C 46 6F 4E 34 4E 64 78 67 35 52 61 32 34 57 6F 49 46 54 57 7A 65 55 4F 6A 45 68 6D 50 43 4E 6B 58 42 72 5A 49 6E 67 4A 50 48 62 69 54 53 37 50 4A 47 57 37 65 53 62 37 6B 51 65 71 66 71 58 54 68 68 32 71 43 78 57 55 5A 39 36 68 33 6D 75 47 36 49 53 39 67 46 35 34 61 6A 70 72 39 71 65 5A 6A 37 30 6D 51 48 43 4D 6A 31 59 55 4D 75 34 78 57 4E 6B 4D 4E 4F 6E 52 62 78 47 54 38 66 4C 6A 56 58 78 76 6C 37 71 6F 36 36 51 65 68 61 69 66 66 72 4B 6C 36 48 42 4C 4B 5A 41 53 30 6A 45 33 67 70 79 70 77 38 42 4F 58 76 47 47 79 38 6B 30 46
      \x00\x03\x02\x00\x00\x00\x08脨o:wDZb6Ozh6yPTLO9oVW4lTmQtTcfNSJ5YO6bnoA0QASsUIte6QmJDoOe9nf7TibEntr5EmMzLWdb1tvbayZxf0sk0SRz92XhYD5bRnPJ3nxgAmcFr4XV9eZm064mKNqIzv2ooK9mYFts2BZDpC75V1YudIaKjdzgwfhEjoRXulxtkhkpXqx8FlmxN8AWkp3uGDvMTLT1laAk4fAwXknsgqQBskrbdun5h2cWlNMBMwNfa2shxEyz1AEPnjUjUsNqm5222dA9ObVkTkbps9dPvn3XnCj2tdhUYWtU0pfK3obQfUw27erKEyuy2VEqu8Glu0Bwr7p2UJL7CcrJGguFmMFacuRVboQuYpWeJzB0ooqvcnmQdvaDE0zXkbKxgKQftRYM6C0GVuQxVKxp81rgECfXqythfEJUOB2AEPJQuJhKZ21QdVofSkQgl0V7JIaHawZKImaihRluULyowT4APe1dlLPDtB7cjQkdSh7qu5x1zVBQ2hBpEQsvvvZwNMpVJ9B4bLLFMDFJhzdFAzBC0INJXohX8ASYGSiqK6ohLvT0AYEvIJmOBdU1lsWmdo6Bkg2FHnv3ATq5oN8zalFoN4Ndxg5Ra24WoIFTWzeUOjEhmPCNkXBrZIngJPHbiTS7PJGW7eSb7kQeqfqXThh2qCxWUZ96h3muG6IS9gF54ajpr9qeZj70mQHCMj1YUMu4xWNkMNOnRbxGT8fLjVXxvl7qo66QehaiffrKl6HBLKZAS0jE3gpypw8BOXvGGy8k0F

is corrupted. Specifically, the first character is corrupted.

There is a test running that sends data frames a single byte at a time (Post Duplex Slow). My current thought is that one of the bytes written there is corrupting the response in some way. What I'm going to do next is the following:

  • Try changing the size of the single data frames (e.g., make them 10 bytes) and see if the corruption is now 10 bytes long.
  • Add additional logging to the server (specifically in the ConcurrentPipeWriter) to capture more details about what is happening; a rough sketch of that wrapper follows this list.
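For the second bullet, the extra diagnostics would roughly take the shape of a wrapper around the inner PipeWriter that records every lease, advance, and flush. This is a sketch assuming a custom Kestrel build; the type and its names are hypothetical, not existing Kestrel code:

```csharp
using System;
using System.IO.Pipelines;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;

// Hypothetical diagnostic wrapper for a custom build; logs every lease/advance/flush.
class LoggingPipeWriter : PipeWriter
{
    private readonly PipeWriter _inner;
    private readonly ILogger _logger;

    public LoggingPipeWriter(PipeWriter inner, ILogger logger)
    {
        _inner = inner;
        _logger = logger;
    }

    public override Memory<byte> GetMemory(int sizeHint = 0)
    {
        Memory<byte> memory = _inner.GetMemory(sizeHint);
        _logger.LogDebug("GetMemory(sizeHint: {SizeHint}) -> length {Length}", sizeHint, memory.Length);
        return memory;
    }

    public override Span<byte> GetSpan(int sizeHint = 0) => _inner.GetSpan(sizeHint);

    public override void Advance(int bytes)
    {
        _logger.LogDebug("Advance({Bytes})", bytes);
        _inner.Advance(bytes);
    }

    public override ValueTask<FlushResult> FlushAsync(CancellationToken cancellationToken = default)
    {
        _logger.LogDebug("FlushAsync");
        return _inner.FlushAsync(cancellationToken);
    }

    public override void CancelPendingFlush() => _inner.CancelPendingFlush();

    public override void Complete(Exception exception = null) => _inner.Complete(exception);
}
```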

cc @eiriktsarpalis

Specifically, the first character is corrupted.

@jkotalik, that sounds like https://github.com/dotnet/corefx/issues/40459. I believe @eiriktsarpalis separated these because he was able to hit them independently.

that sounds like dotnet/corefx#40459

Ah, no, I misread what you wrote (I thought you were talking about a byte from the client rather than a byte from the server). Ignore me :)

cc @Pilchie @halter73 @Tratcher @davidfowl I think I'm getting closer with this.

There is a test that is running which sends data frames a single byte at a time (Post Duplex Slow)

FWIW I'm able to reproduce data corruptions in the absence of "Slow" operations:

$ dotnet run -c Release -- -xops 4 8 9 13 14 -cancelRate 0 -serverUri http://localhost:5001

 0: GET                       Success: 15,538,202       Canceled: 0     Fail: 0
 1: GET Partial               Success: 15,538,203       Canceled: 0     Fail: 0
 2: GET Headers               Success: 15,538,186       Canceled: 1     Fail: 17
 3: GET Parameters            Success: 15,538,201       Canceled: 1     Fail: 2
 5: POST                      Success: 15,538,191       Canceled: 0     Fail: 12
 6: POST Multipart Data       Success: 15,538,193       Canceled: 0     Fail: 10
 7: POST Duplex               Success: 15,538,198       Canceled: 0     Fail: 4
10: POST ExpectContinue       Success: 15,538,199       Canceled: 1     Fail: 2
11: HEAD                      Success: 15,538,201       Canceled: 0     Fail: 0
12: PUT                       Success: 15,538,201       Canceled: 0     Fail: 0

In this case, I'm seeing corruptions in the POST, POST Duplex, and POST Multipart Data operations. The corruption always impacts the first byte in those cases. Note that these operations don't do checksum validation, so I'm rerunning the suite with checksums added to diagnose where the corruption happens (sketched below). I also want to investigate whether the presence of Duplex operations is contributing to the corruptions.
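For reference, the checksum validation amounts to hashing the payload that was sent and the payload that came back and comparing the two, so even a single flipped byte fails the operation. A sketch with hypothetical helper names, not the actual stress-suite code:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

// Hypothetical helpers sketching the added checksum validation.
static class ChecksumValidation
{
    public static string Checksum(string content)
    {
        using var sha = SHA256.Create();
        byte[] hash = sha.ComputeHash(Encoding.UTF8.GetBytes(content));
        return BitConverter.ToString(hash).Replace("-", "");
    }

    // Client side: compare what was sent against what the server echoed back,
    // so a single flipped byte is caught even when the payload isn't otherwise inspected.
    public static void Validate(string sentContent, string echoedContent)
    {
        string expected = Checksum(sentContent);
        string actual = Checksum(echoedContent);
        if (expected != actual)
        {
            throw new InvalidOperationException(
                $"Response corrupted: expected checksum {expected}, got {actual}");
        }
    }
}
```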

EDIT: I can confirm one instance of POST Duplex where the corruption happens on server writes.

@jkotalik Should be closed now, right?

