A lot of features in cryptography and X.509 certificates rely on the ability to read or write ASN.1-structured data. Not only would a good API for ASN.1 reading/writing simplify some of our existing code, it's very handy when someone wants to try adding support for something we don't have inbox, like reading a new certificate extension.
There are a couple of different aspects to the problem:
One of the harder parts to remember to get right is rejecting unexpected data. For example, from https://tools.ietf.org/html/rfc3280#section-4.1.1.2:
```
AlgorithmIdentifier ::= SEQUENCE {
    algorithm           OBJECT IDENTIFIER,
    parameters          ANY DEFINED BY algorithm OPTIONAL }
```
The AlgorithmIdentifier structure defines two, and only two, fields. If a third field is present the data is invalid. (In our current internal implementation this means you need `if (reader.HasData) throw new CryptographicException();` at the end of reading it.)
This means that we really don't want to have to resort to a ReadNext() type API, since it's easy to do it poorly.
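To make the pitfall concrete, here is a minimal sketch in Python (purely illustrative, not the proposed C# API; the function names are invented for this sketch) of reading the AlgorithmIdentifier above with an explicit trailing-data check — the hand-rolled equivalent of the `reader.HasData` guard:

```python
def parse_der_tlv(data):
    """Parse one definite-length DER TLV; return (tag, content, bytes_consumed)."""
    tag = data[0]
    length = data[1]
    if length & 0x80:  # long form: low bits give the number of length octets
        n = length & 0x7F
        length = int.from_bytes(data[2:2 + n], "big")
        header = 2 + n
    else:
        header = 2
    return tag, data[header:header + length], header + length

def parse_algorithm_identifier(encoded):
    """Read SEQUENCE { OID, optional parameters } and reject any trailing field."""
    tag, content, consumed = parse_der_tlv(encoded)
    if tag != 0x30 or consumed != len(encoded):
        raise ValueError("bad outer SEQUENCE")
    _, oid, used = parse_der_tlv(content)
    rest = content[used:]
    params = None
    if rest:
        _, params, used2 = parse_der_tlv(rest)
        rest = rest[used2:]
    if rest:  # the equivalent of `if (reader.HasData) throw ...`
        raise ValueError("unexpected extra field in AlgorithmIdentifier")
    return oid, params
```

The whole point is the final "anything left over?" check, which is exactly the step that's easy to forget with a bare ReadNext()-style API.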
One strawman is to build on the model of existing serializers, so one would do something like
```C#
[StructLayout(LayoutKind.Sequential)]
internal struct AlgorithmIdentifier
{
    internal ObjectIdentifier Algorithm;

    [Optional]
    internal ??? Parameters;
}
```
The fact that the first example drew a need for a base class might be telling that it needs to be a Sequential-layout class instead of a struct. Or maybe ASN-ANY has to be a byte[] that gets reinterpreted by the caller (but the serializer validates that it was legally encoded).
But sequential ordering on the type seems like a requirement.
C-based libraries generally seem to solve this problem with macro-based template languages. So maybe a template is the way to go. But, maybe not.
Needs to consider:
* OPTIONAL (DER says never to write down ASN-NULL for an OPTIONAL value)
* DEFAULT (DER says to never write down a default value)
* Sometimes a default is on a complex type (like saying that a default AlgorithmIdentifier is SHA-2-256).
* ANY
* Context-specific tagging
* And Application-specific, and Private, I suppose.
* That tags *technically* are not constrained to one-byte values.
* Collections (arrays/etc) can be SEQUENCE OF (write them down in the order I gave you) or SET OF (write them down in a prescribed order)
* This *could* be skipped in the encoder and declared to be a problem for the data provider; but since the sort order for SET OF frequently comes down to the data length (before data values) it is sort of hard for them to do it right.
* ASN INTEGER is C# BigInteger. But also supporting Int32/Int64/etc has virtue.
* There are lots of string encodings in ASN
* I'm sure floating point is unpleasant.
* DateTime values can be pretty weird. X.509 adds more restrictions than DER does. Is that an encoding? An attribute? A post-deserialization validation?
* Any other quirks that exist when combining ITU-T X.680 (ASN), ITU-T X.690 (BER/CER/DER) and C#.
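On the SET OF point above: DER (ITU-T X.690 11.6) orders the elements by their complete encodings compared as octet strings, padding the shorter with trailing zeros, which is why the sort frequently comes down to the length octets before any data values are compared. A language-neutral illustration in Python (a sketch, not the proposed API):

```python
def der_sort_set_of(encodings):
    """Sort complete element encodings into DER SET OF order:
    compare as octet strings, padding the shorter ones with
    trailing 0-octets (X.690 11.6)."""
    width = max(len(e) for e in encodings)
    return sorted(encodings, key=lambda e: e + b"\x00" * (width - len(e)))
```

Since the tag and length octets come first, INTEGER 127 (`02 01 7F`) sorts before INTEGER 128 (`02 02 00 80`) purely because of its shorter length octet — the content values are never even compared.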
## 2) What encodings are supported?
ITU-T X.690 defines 3 different encoding rulesets:
* BER (Basic Encoding Rules)
* CER (Canonical Encoding Rules)
* DER (Distinguished Encoding Rules)
Generally speaking, BER defines all the possibilities for representing data, CER describes a set of restrictions on BER, and DER describes even more restrictions. Occasionally CER and DER give conflicting recommendations, though.
* Read-as-BER can read anything.
* Read-as-CER (maybe) should reject an octet-string with a primitive encoding in excess of 1000 octets (ITU-T REC X.690 2015-08 section 9.2)
* (Maybe there needs to be CER-strict and CER-flexible?)
* Read-as-DER should (definitely) reject any indefinite length encoding (ITU-T REC X.690 2015-08 section 10.1).
Almost all things in cryptography use DER, making DER a minimum bar. But, since [CMS / S/MIME / PKCS#7](https://tools.ietf.org/html/rfc2315) says that it's designed to allow BER indefinite length encodings, there are clearly cases where readers need to be more flexible. (Windows and OpenSSL both write all CMS objects with DER; macOS uses indefinite length encodings)
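The definite/indefinite distinction lives entirely in the length octets, so the DER rule is cheap to enforce at read time. A small Python sketch of BER length decoding with the DER restriction from X.690 10.1 (the string-valued `ruleset` parameter is just this sketch's convention):

```python
def read_length(data, ruleset):
    """Decode BER length octets starting at data[0].
    Returns (length, octets_consumed); length is None for indefinite."""
    first = data[0]
    if first == 0x80:               # indefinite: content runs to an 00 00 end-of-contents
        if ruleset == "DER":        # X.690 10.1: DER requires definite-length encoding
            raise ValueError("indefinite length not allowed in DER")
        return None, 1
    if first & 0x80:                # long form: low bits count the length octets
        n = first & 0x7F
        return int.from_bytes(data[1:1 + n], "big"), 1 + n
    return first, 1                 # short form
```

A read-as-BER caller accepts the `None` result and scans for the end-of-contents octets; a read-as-DER caller never sees it.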
## 3) What does an API look like?
Largely this comes out of the inputs/outputs section, but encodings need to be taken into account.
```C#
public static class AsnEncoding
{
    public static ???? Encode<T>(T value, AsnEncodingType type, Span<byte> output);
    public static ???? Decode<T>(ReadOnlySpan<byte> input, AsnEncodingType type);
}
```
If there's a base class then there could be an OnDeserialize()-style validation callback per hydrated member. If it's a convention, that ends up being reflection, which isn't super fun. So maybe it's just up to the user... but that's somewhat annoying for embedded structures. Particularly through OPTIONAL.
It probably makes sense to try to design the API, and use it internally, and then eventually expose it (essentially letting internal consumption be the proving ground).
I'm interested to see what you come up with so I can compare it to my fast forward-only ASN.1 reader.
Pretty excited to hear this being considered. I was working on a TSP client a while back and wrote my own generic parser for it. There's naturally BouncyCastle but the API looks quite overcomplicated and alloc-heavy.
Let's just say the task is very very far from trivial and since it's usually security related there's a very low margin for error. Sometimes you may want BER+DER in the same input stream or mixed implicit+explicit tagging.
Not sure about the attribute/serializer approach. I ditched this pretty early on as it was too complex to implement and not flexible enough. Using an XmlReader-like approach (with peek instead of ReadNext and some sequence scoping) worked quite well though. I could then avoid reflection entirely and add app-specific logic to the deserialization like inline-decoding the context-specific content of an EncapsulatedContentInfo.
This could also serve as a basis for a higher level abstraction if you still want to go attribute-based.
Yeah, the idea is that someone should be able to, for example, look at https://tools.ietf.org/html/rfc5652#section-9.1 and quickly (and correctly) be able to write a structured reader to read the AuthenticatedData object structure. (And, one supposes, write it as well).
While it's possible to do that with a reader-style API it's really easy to forget to check for unknown values, which could be there as part of a length-extension (or other) payload attack.
If it helps with the consideration and planning at all, I consider this feature important enough to me that I am willing to spend a significant amount of time helping, if you're open to it.
@vcsjones The core feature work is on the feature-set that I've given up my management chain. There's a chance that we won't want to mark the API as public during the 2.1 timeframe; it depends on how polished we think it is.
My official suggestions were that we'd need to start on this first, since it's useful in a number of things I'd like us to be doing, and when we're happy with the API flip it to public. I also decided that the Reader-style API is important in addition to the template/serialization API, for different reasons... so maybe what happens for 2.1 is a good primitive reader, but the serialization part isn't ready.
And, of course, there's a chance that all "my" funding gets cut and nothing at all happens here. But I was told I can mark my proposals as being in the 2.1 milestone :smile:.
@bartonjs estimates 1 week remaining
Awesome! Reader only or does that also include serialization?
@ebekker The "one week remaining" is for the reader+writer and the serializer+deserializer, all as internal API only. That is, I think there are between 20 and 40 bartonjs-hours remaining to move the "We are here" marker down one-and-a-half points, at which point a reader+writer and a serializer+deserializer will be checked in, non-public.
There's still a fair bit to do before "Make it public", which means it's probably not going to be public in the 2.1 release. But it will be in the release, getting tested by being the internals powering a couple of things.
Wow, thanks for the update.
I was hoping to use this ASN.1 reader/writer in .NET 4.x projects. @bartonjs do I understand it correctly that it will be available only for .NET Core?
@jariq When the ASN.1 types become public API they will become public only for netcoreapp; and then would appear in .NET Framework at some later point; but it won't be applicable retroactively to existing .NET Framework versions.
The code for the reader/writer/serializer/deserializer should work on .NET Framework (since it was built using only netstandard references) if you want to copy it into a project locally.
FWIW, NuGet seems to have had to implement quite a bit of ASN.1 parsing/writing as part of getting signatures working: https://github.com/NuGet/NuGet.Client/tree/dev/src/NuGet.Core/NuGet.Packaging/Signing/Signatures
They even have their own DerEncoder.
It sure would be nice to get something public here so we can all stop re-implementing this :)
When the ASN.1 types become public API they will become public only for netcoreapp; and then would appear in .NET Framework at some later point; but it won't be applicable retroactively to existing .NET Framework versions.
Why does this have to be the case? Can they be added to a new standalone .NET Standard library that can be used everywhere rather than baked into the platform components itself?
For ASN.1 parsing, that seems to be the kind of thing that can be standalone.
FWIW, NuGet seems to have had to implement quite a bit of ASN.1 parsing/writing as part of getting signatures working
Sort of. They just copied the first draft of this.
Can they be added to a new standalone .NET Standard library that can be used everywhere rather than baked into the platform components itself?
No, because then the platform can't use it. (It's particularly a problem for the ability to go back to .NET Framework)
Can this be done like the ValueTuple nuget? Separate package for older platforms, built-in for newer platforms.
@bartonjs can I get a "It works on bartonjs machine" seal of "it should work but I'm not gonna give you any guarantees\support" approval, so that I can copy the code into my own project? :)
I'd rather use yours than use my own "just write those bytes to disk, should be fine" pem reader\writer code :D
@bartonjs so I was copying the code and noticed this (yes, I couldn't wait):
I think you are missing an error message, because I only see an SR.Format but no SR.UnhelpFullErrorMessage:
https://github.com/dotnet/corefx/blob/07e9caf00ea0f1893d4c25a5ee287000903fbbe2/src/Common/src/System/Security/Cryptography/Asn1V2.Serializer.cs#L1066-L1070
Can I get a "It works on bartonjs machine" seal of "it should work but I'm not gonna give you any guarantees\support" approval
In .NET Core 2.1 these classes are powering SignedCms, and the Unix implementation of EnvelopedCms.
I think you are missing a error message
Huh, yep. Good thing that's a design-time exception, since it means no one will get a field name as an exception message with CMS :smile:.
@bartonjs I must say that the current AsnWriter is making it a breeze to manually write PKCS files.
Much, much better than the garbage code I built.
@KLuuKer, are you just copying all the code locally and changing all the internal classes to public? Just looking to get an idea of the best approach to experiment with this myself. I would like to support exporting private keys (RSA, EC) using as much in-the-box components (or what will hopefully become) as possible.
@ebekker I straight up copy the code into an "IBorrowedThisGetRidOfItWhenPublicReleased" folder and remove the calls to the internal SR.xxxxx resource strings, then grab more files when it complains about them; rinse and repeat.
You could mark the classes as public; I am making helper methods in that project anyway, so I left them internal.
Don't forget to enable language version 7.2 (`<LangVersion>7.2</LangVersion>`) and allow unsafe code (`<AllowUnsafeBlocks>true</AllowUnsafeBlocks>`).
@bartonjs so I am running into what I think is a security check, from https://github.com/dotnet/corefx/blob/07e9caf00ea0f1893d4c25a5ee287000903fbbe2/src/Common/src/System/Security/Cryptography/AsnWriter.cs#L390, because I am trying to write this value:
```C#
var asnWriter = new AsnWriter(AsnEncodingRules.DER);
asnWriter.WriteInteger(new ReadOnlySpan<byte>(new byte[] { 0xff, 0xcf }));
```
It's the first two bytes of an RSAParameters field, found by brute-force writing encodings of new RSACryptoServiceProvider(512) keys.
Am I supposed to pre-encode it somehow?
That's an illegal encoding for -49, or 65487. -49 would be { 0xCF }; 65487 would be { 0x00, 0xFF, 0xCF }.
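That minimal-length rule (X.690 8.3.2: the first nine bits of the content may not be all ones or all zeros) is what the writer's check enforces. A Python sketch (illustration only, not the C# writer's code) that produces the minimal signed encoding, reproducing the byte strings above:

```python
def der_integer_content(value):
    """Content octets of an ASN.1 INTEGER: signed, big-endian,
    minimum number of bytes (X.690 8.3)."""
    n = 1
    while True:
        try:
            return value.to_bytes(n, "big", signed=True)
        except OverflowError:
            n += 1  # grow until the value fits as a signed integer
```

Under this rule { 0xFF, 0xCF } is never valid content: read as signed it means -49, whose minimal form is the single byte 0xCF, so a strict writer has to reject it.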
So the real question is: who is supposed to fix this, the consumer or the writer?
Do I just need to add a 0x00 in front of my byte[] if it tests positive for that check?
The same goes for reading it back in again: do I need to check for it again before I put it back in my RSAParameters?
AsnWriter.WriteInteger writes ASN.1 BER/CER/DER integers. That is: signed, big-endian integers using a minimum number of bytes.
RSAParameters.Exponent (and RSAParameters.Modulus), on the other hand, is an unsigned, big-endian integer using a minimum number of bytes.
RSAParameters.P (and Q, DP, DQ, InverseQ) is an unsigned, big-endian, fixed-width integer whose byte[].Length value must be exactly ((RSAParameters.Modulus.Length + 1) / 2). (And RSAParameters.D must have the same length as RSAParameters.Modulus.)
The responsibility of translating between the two formats would belong in the code which speaks both formats (in this case, the code you are writing). Inserting an extra 0x00 when writing isn't always correct (because it can violate the "minimum number of bytes" rule), and removing it during read isn't always correct (because the properties other than Modulus and Exponent have specific required lengths).
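A sketch of that translation layer in Python (byte-level only; a C# version would do the same work on the RSAParameters arrays, and the helper names here are invented):

```python
def unsigned_to_der_integer(raw):
    """RSAParameters field (unsigned, big-endian) -> ASN.1 INTEGER content
    (signed): strip extra leading zeros, then restore one zero byte if the
    value would otherwise read as negative."""
    raw = raw.lstrip(b"\x00") or b"\x00"
    if raw[0] & 0x80:
        raw = b"\x00" + raw
    return raw

def der_integer_to_fixed_width(content, width):
    """ASN.1 INTEGER content -> fixed-width unsigned field (e.g. P, Q, DP):
    drop any sign-padding zeros, then left-pad to the required length."""
    content = content.lstrip(b"\x00") or b"\x00"
    if len(content) > width:
        raise ValueError("value too large for field")
    return content.rjust(width, b"\x00")
```

Writing `b"\xff\xcf"` through the first helper yields `b"\x00\xff\xcf"`, matching the correction above, and reading `b"\x00\xff\xcf"` back into a 2-byte field restores `b"\xff\xcf"` — neither direction is a blind add-or-strip of a single 0x00.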
@bartonjs I hate to ask this but do you have some code that does this correctly?
I have tried building it myself but it's kinda hard 😅
I have tried some stuff that works when I import it back in using my own code, but it fails to import using BouncyCastle:
```C#
var pem = "-----BEGIN RSA PRIVATE KEY-----\n" + Convert.ToBase64String(der) + "-----END RSA PRIVATE KEY-----";
var pr = new PemReader(new StringReader(pem));
var keyPair = (AsymmetricCipherKeyPair)pr.ReadObject();
var rsaParams = DotNetUtilities.ToRSAParameters((RsaPrivateCrtKeyParameters)keyPair.Private);
```
This is how I export mine:
```C#
var asnWriter = new AsnWriter(AsnEncodingRules.DER);
asnWriter.PushSequence();
asnWriter.WriteInteger(0);
asnWriter.WriteInteger(parameters.Modulus);
asnWriter.WriteInteger(parameters.Exponent);
asnWriter.WriteInteger(FixValues(parameters.D));
asnWriter.WriteInteger(FixValues(parameters.P));
asnWriter.WriteInteger(FixValues(parameters.Q));
asnWriter.WriteInteger(FixValues(parameters.DP));
asnWriter.WriteInteger(FixValues(parameters.DQ));
asnWriter.WriteInteger(FixValues(parameters.InverseQ));
asnWriter.PopSequence();
return asnWriter.Encode();
```
p.s. FixValues is my testing helper to add and remove 0x00 from the byte array
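For comparison, the same RSAPrivateKey export can be sketched end-to-end in Python (illustration only; dummy single-byte values stand in for real key material), showing both the unsigned-to-signed fix-up and why a DER writer can't emit the outer length octet until all the content is known:

```python
def der_tlv(tag, content):
    """Definite-length DER TLV: tag byte + length octets + content."""
    if len(content) < 0x80:
        length = bytes([len(content)])                                # short form
    else:
        n = (len(content).bit_length() + 7) // 8
        length = bytes([0x80 | n]) + len(content).to_bytes(n, "big")  # long form
    return bytes([tag]) + length + content

def der_unsigned_integer(raw):
    """INTEGER (tag 0x02) from unsigned big-endian bytes, with the
    leading-zero handling discussed above."""
    raw = raw.lstrip(b"\x00") or b"\x00"
    if raw[0] & 0x80:
        raw = b"\x00" + raw
    return der_tlv(0x02, raw)

def rsa_private_key_der(fields):
    """SEQUENCE { version = 0, modulus, publicExponent, d, p, q, dp, dq, qinv }
    -- the same shape as the AsnWriter calls above."""
    body = der_tlv(0x02, b"\x00")  # version INTEGER 0
    for field in fields:
        body += der_unsigned_integer(field)
    return der_tlv(0x30, body)     # outer SEQUENCE: length known only at the end
```

The outer SEQUENCE is the last thing framed, which is the "you can't write the second byte until you know what the last one is" property mentioned later in this thread.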
We'll also most probably need ASN.1 parsing and writing soon. Our use case is not related to x.509 certs, but different data which happens to be serialized in ASN.1.
We're aiming for a stable toolchain, if possible including a compiler. So far, we found https://github.com/drasil/BinaryNotes (with its NetStandard fork https://github.com/matizk144/BinaryNotes), which looks promising, but the compiler is in Java, and the project itself had its last updates in 2017, so it's questionable whether it's still maintained.
I went through the API and tried to switch AsnReader for DerSequenceReader in some places. I propose the following API additions as a result of that experiment:
```C#
public long ReadInt64();
public long ReadInt64(Asn1Tag expectedTag);
public ulong ReadUInt64();
public ulong ReadUInt64(Asn1Tag expectedTag);
public int ReadInt32();
public int ReadInt32(Asn1Tag expectedTag);
public uint ReadUInt32();
public uint ReadUInt32(Asn1Tag expectedTag);
public short ReadInt16();
public short ReadInt16(Asn1Tag expectedTag);
public ushort ReadUInt16();
public ushort ReadUInt16(Asn1Tag expectedTag);
public sbyte ReadInt8();
public sbyte ReadInt8(Asn1Tag expectedTag);
public byte ReadUInt8();
public byte ReadUInt8(Asn1Tag expectedTag);
```
They would act exactly as their Try<name> counterparts, except that instead of returning false they would throw new CryptographicException(SR.Cryptography_Der_Invalid_Encoding).
The rationale is that all the current usages of the TryRead[U]Int methods do that already. It would move the common pattern into AsnReader, and the caller would need one less resource string and one less thing to worry about.
Hmm - those could be implemented as Extension Methods, right?
Yes, they could be implemented as extension methods, but the API is not public yet and there's basically no reason not to add them as first-class methods. You could argue that the parameter-less TryRead methods could be extension methods too. One part of the rationale is that you don't have to copy the Cryptography_Der_Invalid_Encoding resource string once the API goes public (as part of System.Security.Cryptography.Encoding, I guess).
Even as extension methods, they can still be defined within the same assembly, in the same namespace, with the same visibility level. :-)
This way, when the original class is published, the extension methods can be published at the same time.
The main advantage of extension methods is that they don't bloat the original class, helping to keep components small, self-contained and maintainable.
Class bloat can grow into a real mess with bad maintainability, etc. I know a codebase with some classes having around 20k to 50k lines... Extension methods could have helped there, but they were not available when the codebase was started. :-(
So as long as there's no "hard" advantage of making them instance methods, I'll favour extension methods...
If you are concerned about file size you can always use partial classes. I don't think extension methods are warranted unless the API contract is already set in stone.
I'm more concerned about class size, than file size... A huge file can always be split into a set of smaller files, but a class by itself cannot be split (although the source of the class can be split using partial...).
The AsnSerializer currently doesn't have a way to serialize large unsigned integers (via the underlying AsnWriter.WriteIntegerUnsigned method). It is possible to work around it by prepending a zero to the integer when necessary, but there are some legitimate use cases that could warrant direct support in the serializer (e.g. the TBSCertificate SerialNumber field).
Update: After thinking about it more, it's really just an insufficient definition to use
```C#
[Integer]
public ReadOnlyMemory<byte> SerialNumber;
```
It is used all over the place in CoreFX and it is assigned the normalized serial number (e.g. the X509Certificate.GetSerialNumber result). However, when the serial number is generated it can contain leading zeros (which should be stripped in DER) or a leading 0x80 (which needs a zero byte prefixed to keep the integer unsigned). So maybe it should just be defined as BigInteger, but the existing code doesn't do that.
TBSCertificate.SerialNumber isn't unsigned. Or, rather, tools exist which write negative numbers. I know, because I recently had to fix the serialization from normalizing it back to a positive number in the Pkcs library. (The RFCs say that a _conforming_ CA will not use negative serial numbers, but the ASN.1 structure doesn't prevent them.)
BigInteger is a supported field type. I just didn't want to use it there, because I didn't need any math on a serial number... and ReadOnlyMemory is cheaper than BigInteger.
There's not a way to say "this is a BigInteger and it must not be negative", but that's just an application problem... the same as how, if a field is INTEGER (1..999) OPTIONAL, saying ushort leaves tens of thousands of "application problem" values.
So, anywhere that I used [Integer] ReadOnlyMemory<byte> was because I wanted to.
Stumbled upon this thread and wanted to mention that I have created a DER / ASN.1 encoder/decoder package which can be used with .NET Framework: https://github.com/huysentruitw/pem-utils
Is there any specific reason why the Asn1Attribute(s) cannot target properties as well as fields?
I was thinking of submitting a PR that adds property support because it seems to fit the pattern of serializer attributes throughout the rest of .NET Core.
Is there any specific reason why the Asn1Attribute(s) cannot target properties as well as fields?
1) Property setters can intercept and reject assignments. That makes it really unclear what happens when it rejects an assignment.
2) Property setters can run custom code. Custom code + serializer = eventual security bug.
3) Property ordering is even less well-defined than field ordering.
I was thinking of submitting a PR that adds property support
Because of point (2) this isn't something that I'd want in the ASN deserializer, since it potentially introduces deserialization vulnerabilities into components which are supposed to make trust decisions.
The structures involved in ASN (de)serialization should be minimalistic, and probably not exposed as public API types.
Good point and makes sense. Thank you.
@KLuuKer If you don't mind me asking, how did you resolve ByReference, GetRawSzArrayData and _pointer in Asn1V2.cs when backporting this to your .NET Framework project? Did you track down the .NET Core implementations of these and copy them as well?
I'm currently working on an LDAP library (RFC 4511) using version 35320e086b5f159d74737558f4e3806c53dfaf8c (straight copied as @KLuuKer suggested).
I think this could serve as a real-world test of the API; I could also PR a feature making use of it.
Please let me know if feedback would be helpful at this point or if someone else tried this before.
This is pretty early, but I think I already found some edge cases which might be worth looking at (e.g. LDAPString vs. UTF8String, having a generator for ASN.1, etc.).
@zivillian I'd be happy if you give some feedback. I converted all the internal DER/ASN code to use the new APIs in order to provide some validation of the API design. I'm generally happy with it now except for few issues that are tracked here or in other issues.
First of all, thanks for your great work.
I'm not sure whether this issue is the correct place to track all my possibly unrelated issues and have created a new repo at zivillian/ldap.net.
Until now, I've found the following:
Please let me know if those are already tracked somewhere else, or if I should open another issue over here.
I may add some more while I'm trying to implement the remaining data types.
Another pain point is that I would not be able to implement the protocol without a debugger. Whenever I have an error in my type definitions I get a CryptographicException which is rooted at "ASN1 corrupted data".
I will definitely look into the code generator mentioned in 28683, but this may not help others.
I know that the exception might be security related, but maybe there's a way to set a static property or detect a debugger to get more helpful exceptions during debugging or in unit tests (just skip the catch in https://github.com/dotnet/corefx/blob/35320e086b5f159d74737558f4e3806c53dfaf8c/src/Common/src/System/Security/Cryptography/Asn1V2.Serializer.cs#L367-L375).
Definitely look at the code generator, it solves the issue with useless exceptions. I will look at the other issues in detail.
The original exception should be accessible in the InnerException...
After trying to migrate to the XML generator I found a blocking issue (zivillian/ldap.net#5) regarding the tag class.
With attributes it is possible to specify the tagClass and tagValue:
```C#
[ExpectedTag(TagClass.Application, 10)]
[OctetString]
public ReadOnlyMemory<byte>? DelRequest;
```
as specified in RFC 4511 - 4.8
```
DelRequest ::= [APPLICATION 10] LDAPDN

LDAPDN ::= LDAPString -- Constrained to <distinguishedName> [RFC4514]

LDAPString ::= OCTET STRING -- UTF-8 encoded, -- [ISO10646] characters
```
I wasn't able to find a way to set the tagClass in XML, since there is no attribute for it. Using
```xml
<asn:OctetString name="DelRequest" implicitTag="10"/>
```
results in a CryptographicException because the generated code is checking for TagClass.ContextSpecific:
```C#
if (tag.HasSameClassAndValue(new Asn1Tag(TagClass.ContextSpecific, 10)))
{
    if (reader.TryGetPrimitiveOctetStringBytes(new Asn1Tag(TagClass.ContextSpecific, 10), out ReadOnlyMemory<byte> tmpDelRequest))
    {
        decoded.DelRequest = tmpDelRequest;
    }
    else
    {
        decoded.DelRequest = reader.ReadOctetString(new Asn1Tag(TagClass.ContextSpecific, 10));
    }
}
else
{
    throw new CryptographicException();
}
```
@zivillian You could add an attribute in your copy of asn.xsd and process it in asn.xslt to change the class https://github.com/dotnet/corefx/blob/master/src/Common/src/System/Security/Cryptography/Asn1/asn.xslt#L882. We simply haven't had a need for it yet (nothing that we've built used the APPLICATION or PRIVATE classes)
@bartonjs thanks for your hint, works like a charm https://github.com/zivillian/ldap.net/issues/5#issuecomment-422920372
@bartonjs What are the odds of getting these types made public for .NET Core 3?
@onovotny Public: not great ("temporal mechanics"). But I've closed the gap a bit by copying the reader and writer (and support types) to corefxlab.
The API there is different than the one here, because it got an extra pass of me looking at the shape, and starting to write down how I'd present it for API review, and then me making all the fixes that I'd bring up during that meeting if it wasn't my feature :). (The good old "let it sit for a couple months and see if you still like it" strategy)
I understand that Application/Private/ContextSpecific tag classes might be out of scope here, but it would be great to not leave them out absolutely, as they are widely used in smart cards industry, for example, including banking (see EMV specifications). And it's all mostly related to cryptography in those areas too.
The major complexities I see here are:
* the two layers (TLV framing and value encoding) currently combined in the Asn1Reader and Asn1Writer classes
* making Asn1Writer work with streams, especially with constructed tags

What if we actually try to break the two layers mentioned before apart in the code and introduce one more to control how the actual ASN.1 elements are serialized, say a TlvSerializer class that holds a collection of TlvConverters, one of which would be an Asn1Converter containing all those methods that serialize/deserialize TLV types right now in Asn1Reader and Asn1Writer.
This way we give users the ability to extend this implementation with their own converters for their Application/Private tag classes. Or even provide extension libraries to deserialize TLV/ASN.1 data into POCO.
@lil-Toady I'm not sure what the context of your comment is...
I understand that Application/Private/ContextSpecific tag classes might be out of scope here
The AsnReader and AsnWriter types, both the corefx-internal versions and the corefxlab-public ones, handle Application, Private, Context-Specific, and Universal tags.
The asn.xslt code generator in corefx doesn't handle Application or Private, but it isn't really a public aspect.
X-690-201508 BER/DER/CER encodings not only control how the "framing layer" is done - TLV, but also limit the application level in Universal tags.
I don't follow. AsnReader and AsnWriter understand that UNIVERSAL 2 means Integer, so they won't let you read or write an Integer as UNIVERSAL 3. But you can write an Integer as PRIVATE 900, or APPLICATION 3, or CONTEXT-SPECIFIC 0... it can't say it's not true, so it lets it happen.
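For a concrete picture of the four classes on the wire: the identifier octet for low tag numbers packs the class into bits 8-7 and the primitive/constructed flag into bit 6 (X.690 8.1.2). A small Python sketch (illustration only; the dict and function names are invented):

```python
CLASS_BITS = {"Universal": 0x00, "Application": 0x40,
              "ContextSpecific": 0x80, "Private": 0xC0}

def identifier_octet(tag_class, number, constructed=False):
    """First tag byte for tag numbers 0..30; bit 6 is the
    primitive/constructed flag, bits 8-7 are the class."""
    if number > 30:
        raise NotImplementedError("high tag numbers use the multi-byte form")
    return CLASS_BITS[tag_class] | (0x20 if constructed else 0) | number
```

So the [APPLICATION 10] from the earlier DelRequest example is 0x4A, and a constructed UNIVERSAL 16 (SEQUENCE) is the familiar 0x30.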
... introduce one more to control how the actual Asn.1 elements are serialized, say TlvSerializer ...

I'm not sure how this would help write to a Stream... for a DER encoded object you can't write the second byte until you know what the last one is.
I'm also not sure where you're suggesting it would be / how it would be utilized. The interpretation I'm coming up with is codifying the Encode/Decode contract from what asn.xslt is doing:
```C#
public abstract class AsnSerializer<T>
{
    public T Decode(AsnReader reader) { throw null; }
    public void Decode(AsnReader reader, out T value) { throw null; }
    protected abstract void DecodeCore(AsnReader reader, out T value);

    public void Encode(in T value, AsnWriter writer) { throw null; }
    protected abstract void EncodeCore(in T value, AsnWriter writer);
}
```
but that still seems to require explicit calls to reader.ThrowIfNotEmpty() (making that implicit was my main goal of a serializer). I'm not sure what it buys other than pattern-building and composition (granted, those are useful things). But at the same time it sounds like you were intending it within the reader and writer, instead of utilizing the reader and writer, so I might not be following.
I'm definitely not understanding what going down to something called Tlv instead of Asn gains; if the tag is an ASN.1 class and number (and P/C bit) then it's still "Asn".
There's now a spec/feature proposal at https://github.com/dotnet/designs/pull/93.
Fixed by #36729.