Efcore: Char support broken in 3.0.0-preview1

Created on 13 Dec 2018 · 12 comments · Source: dotnet/efcore

I'm finalizing porting the Npgsql provider to the recently-released 3.0.0-preview1, and I have one outstanding problem. The Npgsql provider has a built-in mapping which maps CLR char to PostgreSQL char(1) - this is supported at the ADO level. This used to work fine in 2.2, but the relevant test in BuiltInDataTypes now breaks.

I've seen #9447 and am aware that the C# compiler emits a conversion expression from char to int, and I can indeed see that happening in the expression tree in the pipeline. However, I can confirm that in 2.2 the ADO layer got a parameter with a value of type char, whereas in 3.0 the parameter is populated with an int value instead.
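The compiler-inserted conversion is easy to observe outside EF Core entirely. This is a minimal standalone sketch (not EF Core code) showing that a `char` comparison in an expression tree carries `Convert` nodes to `int`:

```c#
using System;
using System.Linq.Expressions;

class CharConvertDemo
{
    static void Main()
    {
        char value = 'a';

        // C# compares chars via their int values, so in an expression tree
        // `c == value` is built as (int)c == (int)value: both operands are
        // wrapped in Convert nodes by the compiler.
        Expression<Func<char, bool>> predicate = c => c == value;

        var body = (BinaryExpression)predicate.Body;
        Console.WriteLine(body.Left.NodeType); // Convert
        Console.WriteLine(body.Left.Type);     // System.Int32
    }
}
```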

Unless I'm mistaken, all CLR char support in SqlServer is based on value converters, so you wouldn't see this on your end. I could also change Npgsql to omit the built-in mapping and go through value converters instead, but that seems like a bit of a shame... Specifically, it seems good for char(1) columns to get created in the database rather than a general text column or whatever.
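For reference, the value-converter fallback mentioned above can be shown in isolation. This is a hedged sketch (not the Npgsql provider's actual code) of a converter that stores a CLR `char` as a single-character string:

```c#
using System;
using Microsoft.EntityFrameworkCore.Storage.ValueConversion;

class CharConverterDemo
{
    static void Main()
    {
        // A converter mapping char <-> string; a provider could register
        // this for char properties instead of a built-in char(1) mapping.
        var converter = new ValueConverter<char, string>(
            c => c.ToString(), // char -> string on the way to the database
            s => s[0]);        // string -> char on the way back

        Console.WriteLine(converter.ConvertToProvider('a'));   // "a"
        Console.WriteLine(converter.ConvertFromProvider("b")); // 'b'
    }
}
```

The trade-off raised above is that with a converter the store type is a string column rather than char(1), unless the provider also fixes up the column type.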

Let me know what you think about all this, and if the breakage is somehow intentional. Once we decide what to do I can release 3.0.0-preview1.

/cc @austindrenski @yohdeadfall

closed-fixed type-bug

All 12 comments

@smitpatel @ajcvickers

@roji, isn't this your case?
https://github.com/dotnet/roslyn/issues/29396

@roji ~~The break is not intentional, but I can't think what has gone into 3.0 that would cause it. It certainly does seem like something we should fix in the core code.~~

See below--SKB (Smit Knows Best)

This is an intentional break. It happens in the parameter extraction phase. We evaluate the maximal client-evaluable subtree, which now also includes the convert to int (which was not happening before). So the parameter you get is of type int. I will think about how to fix it so providers can generate the correct thing.
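The mechanics can be sketched in isolation: when parameter extraction evaluates a subtree that includes the compiler-inserted convert, the resulting value is boxed as an `int`, not a `char`. A minimal standalone illustration (not EF Core's actual extraction code):

```c#
using System;
using System.Linq.Expressions;

class EvaluateConvertDemo
{
    static void Main()
    {
        char value = 'a';

        // Evaluating the convert-wrapped subtree, as parameter extraction
        // does for the largest client-evaluable fragment, yields an int:
        Expression subtree = Expression.Convert(
            Expression.Constant(value), typeof(int));

        object extracted = Expression.Lambda(subtree).Compile().DynamicInvoke();
        Console.WriteLine(extracted.GetType()); // System.Int32
        Console.WriteLine(extracted);           // 97
    }
}
```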

@roji wrote:

> However, I can confirm that in 2.2 the ADO layer got a parameter with a value of type char, whereas in 3.0 the parameter is populated with an int value instead.

Yikes...

@smitpatel wrote:

> This is an intentional break. It happens in the parameter extraction phase. We evaluate the maximal client-evaluable subtree, which now also includes the convert to int (which was not happening before). So the parameter you get is of type int. I will think about how to fix it so providers can generate the correct thing.

Could you link to the relevant source (or commit)?

@austindrenski - https://github.com/aspnet/EntityFrameworkCore/pull/13862
To clarify: the change is intentional, but we need to figure out how to deal with it. We could special-case the char-to-int convert (just like the enum-to-int one), or deal with it when we add parameters to the SQL. Not having char support in SqlServer certainly makes us miss cases like this.

Thanks for the link!

@smitpatel wrote:

> To clarify: the change is intentional, but we need to figure out how to deal with it. We could special-case the char-to-int convert (just like the enum-to-int one), or deal with it when we add parameters to the SQL. Not having char support in SqlServer certainly makes us miss cases like this.

Is this the code you're referring to?

```c#
public override Expression Visit(Expression expression)
{
    if (expression == null)
    {
        return null;
    }

    if (_evaluatableExpressions.TryGetValue(expression, out var generateParameter)
        && !PreserveConvertNode(expression))
    {
        return Evaluate(expression, _parameterize && generateParameter);
    }

    return base.Visit(expression);
}
```

```c#
private static bool PreserveConvertNode(Expression expression)
{
    if (expression is UnaryExpression unaryExpression
        && (unaryExpression.NodeType == ExpressionType.Convert
            || unaryExpression.NodeType == ExpressionType.ConvertChecked))
    {
        if (unaryExpression.Type == typeof(object)
            || unaryExpression.Type == typeof(Enum))
        {
            return true;
        }

        if (unaryExpression.Operand.Type.UnwrapNullableType().IsEnum)
        {
            return true;
        }
    }

    return false;
}
```

In the case of char, could PreserveConvertNode(...) be modified to preserve the outer convert node, while still evaluating the inner expression (rather than bailing out altogether)?

@austindrenski - That is one option to fix this.

Thanks for the explanation @smitpatel, makes perfect sense. I'll skip the offending tests and we'll release 3.0.0-preview1 soon.

@sdanyliv that issue definitely discusses this conversion, but it seems like it's here to stay - so we need to properly handle it...

@sdanyliv Thanks for linking dotnet/roslyn#29396. I've dealt with that while translating expression trees, but hadn't been able to find any documentation of it. Great to finally have a reference explaining it. Much appreciated.

I wasn't aware of this discussion and made the simplest change to make this work for Cosmos in 70c3cb36deade8dcffa30063cda8bdfbc8aeaf89:
```C#
protected virtual bool PreserveConvertNode(Expression expression)
{
    if (expression is UnaryExpression unaryExpression
        && (unaryExpression.NodeType == ExpressionType.Convert
            || unaryExpression.NodeType == ExpressionType.ConvertChecked))
    {
        if (unaryExpression.Type == typeof(object)
            || unaryExpression.Type == typeof(Enum))
        {
            return true;
        }

        if (unaryExpression.Operand.Type.UnwrapNullableType().IsEnum
            || unaryExpression.Operand.Type.UnwrapNullableType() == typeof(char))
        {
            return true;
        }

        return PreserveConvertNode(unaryExpression.Operand);
    }

    return false;
}
```
