Runtime: DataTable-like API in .NET Core

Created on 18 May 2016  ·  63 comments  ·  Source: dotnet/runtime

System.Data.DataTable is present in .NET Core as an empty class to complete the interface implementations.
This issue tracks the work needed to bring a DataTable-like API into .NET Core.

area-System.Data

Most helpful comment

@ahhsoftware, you seem to be misunderstanding the conversation.

The current version of .NET Core does not have DataSet/DataTable, this will not change. The next version of .NET Core _will_ have DataSet/DataTable - this has already been confirmed here. There's no need to say how important these components are for you.

However, it's important to understand that what you're accomplishing with DataSet/DataTable can be accomplished with _other_ means. In other words, if you need to port to .NET Core _today_ then you have options for dropping DataSet/DataTable. Otherwise you can wait for netstandard2.0, which will have DataSet/DataTable, and your current code should work without changes then.

All 63 comments

cc @ajlam

@saurabh500 does this mean you guys are bringing back DataTable to .NET Core? Or are you planning to introduce a new API that performs some (or all) of DataTable's tasks?

FWIW I do think that dropping DataTable/DataSet is a good idea. I'm wondering if you guys are considering bringing it back purely for backwards-compatibility reasons or is there something else?

Hopefully the right place to post this. Will this include the ability to use a DataTable as a parameter in a stored procedure call? I have lots of SP's that take a TableValue parameter. In ADO.NET I fill a DataTable with data and pass it using SqlDbType.Structured as the parameter type. Unless there is currently another way to do this?

I do not think that sending a whole bunch of insert/update/delete statements is a viable alternative.
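For context, the full-framework pattern being described might be sketched as follows; the stored procedure name, parameter name, and table-type name are hypothetical placeholders, not anything confirmed in this thread.

```csharp
// Hedged sketch of filling a DataTable and passing it as a table-valued
// parameter (SqlDbType.Structured); all names below are hypothetical.
using System;
using System.Data;
using System.Data.SqlClient;

class TvpSketch
{
    static void Main()
    {
        var items = new DataTable();
        items.Columns.Add("Id", typeof(int));
        items.Columns.Add("Name", typeof(string));
        items.Rows.Add(1, "Widget");
        items.Rows.Add(2, "Gadget");

        using (var cmd = new SqlCommand("dbo.PutOrderItems"))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            SqlParameter p = cmd.Parameters.AddWithValue("@OrderItems", items);
            p.SqlDbType = SqlDbType.Structured;   // marks the parameter as a TVP
            p.TypeName = "dbo.OrderItemType";     // the user-defined table type on the server
            // Against a live server: cmd.Connection = conn; conn.Open(); cmd.ExecuteNonQuery();
            Console.WriteLine($"{items.Rows.Count} rows staged for {p.TypeName}");
        }
    }
}
```

On the server side, the stored procedure declares a parameter of a matching user-defined table type, so the whole set arrives in one round trip.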

@nmunday maybe there already is an option to do that. SqlBulkCopy now accepts DbDataReader as an input instead of DataTable, so I expect that this could work in a similar way: try passing a DbDataReader instead of a DataTable as the parameter. If not, I second that it should be possible somehow.

You can just use an IEnumerable<SqlDataRecord> in place of a DataTable

@sqmgh where would I find such a class (SqlDataRecord) in .NET Core? I tried the docs and IntelliSense in VS and it does not seem to be there. In the full framework SqlDataReader implements IDataRecord, but not so in .NET Core.

@kenticomartinh that may work for some of the SP's that do straight inserts. Many SP's contain other logic that is executed at the same time, some SP's do not even end up inserting data at all. I will look into SqlBulkCopy for sure and use it when it does the job. Thanks.

@nmunday Add System.Data.SqlClient to your dependencies, and it's in Microsoft.SqlServer.Server. IntelliSense will recommend adding that using if you try and construct one. It's been in Core for quite a while.
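The streaming approach suggested above might be sketched like this; the column metadata and type names are hypothetical placeholders.

```csharp
// Hedged sketch of yielding SqlDataRecords in place of a DataTable for a
// table-valued parameter; column names and the type name are placeholders.
using System;
using System.Collections.Generic;
using System.Data;
using Microsoft.SqlServer.Server;

class StreamingTvpSketch
{
    static IEnumerable<SqlDataRecord> ToRecords(IEnumerable<(int Id, string Name)> items)
    {
        // Metadata is declared once; a record is yielded per row, so nothing
        // is buffered in memory the way a DataTable would be.
        var meta = new[]
        {
            new SqlMetaData("Id", SqlDbType.Int),
            new SqlMetaData("Name", SqlDbType.NVarChar, 64),
        };
        foreach (var (id, name) in items)
        {
            var record = new SqlDataRecord(meta);
            record.SetInt32(0, id);
            record.SetString(1, name);
            yield return record;
        }
    }

    static void Main()
    {
        var rows = new[] { (1, "Widget"), (2, "Gadget") };
        int count = 0;
        foreach (SqlDataRecord r in ToRecords(rows))
            count++;
        Console.WriteLine(count); // 2
        // Against a live server (hypothetical names):
        //   var p = cmd.Parameters.AddWithValue("@OrderItems", ToRecords(rows));
        //   p.SqlDbType = SqlDbType.Structured;
        //   p.TypeName = "dbo.OrderItemType";
    }
}
```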

DataTable/DataSet is rich in features, a bit bloated, but it does the job well.
I'm not sure of the real reason why this object should be dropped.

Some notable features are:

  • Deserialize/Serialize schema and data from/to xml
  • Column expression
  • Change tracking via DataRowState
  • Copyable and cloneable object
  • Column's default value
  • DataView
  • etc

If there is something bad, it is that it stores the data in an object array, which promotes boxing and unboxing; hopefully this is solved by tuples in C# 7, maybe.
AcceptChanges/RejectChanges is a bit tricky to use, but it works.

Not everyone uses an ORM; not everyone uses EF. Those who stay with ADO.NET shouldn't be punished like this. Those who battle with the database query optimizer understand the pain of losing control over your query because it is automatically generated.

You use stored procedures, you change your SELECT statement, and you don't want to change the application just because your schema is changing.

With DataSet and DataTable I feel like I have an in-memory database.

Optimize the object, make it lighter and faster, but don't drop its features.

@sqmgh thanks for the pointers. I did indeed find the class once I added the SqlClient reference. Not sure how I missed it. The code is now working. I was always unsatisfied with loading a DataTable just to then send it to a stored procedure. It seemed like a waste of memory and resources. Now, by streaming with SqlDataRecord, I can avoid the load altogether. Thanks again.

@nmunday No problem, and glad I could help. I was doing the exact same thing in the past, and I felt the exact same way before I discovered that I could just yield SqlDataRecords.

After that point I basically never used DataTable again. Now it just seems like a janky, overly complex ORM for what you get.

I suppose I could potentially see some use for them in a very quick application where you have to load tabular data without knowing what columns will exist. However, that doesn't ever apply to me personally. I only ever deal with situations where I at least know which columns _might_ be there. In that case I think it's more than worth the effort to just generate a record class. Dealing with strongly typed columns tends to make the code way easier to read and write, and that's without even looking at the performance and reliability side.

It seems many people are asking for something lightweight between EF Core and low-level System.Data.Common interfaces. I'm the author of the NI.Data library (.NET 4.x) that offers abstract (db-independent) queries and simple db-independent interface for CRUD-operations. I can port most demanded parts to a new corefx-oriented library (queries, dataview-abstractions, SQL-generation components) with a simple interface for high-performance operations with POCO and/or IDictionary instead of DataTable/DataRow. If someone is interested in this library don't hesitate to contact me.

UPD: just finished the first version of a data access library (supports both .NET Core and .NET 4.x) that implements a SQL command builder and a data adapter for CRUD operations: the NReco.Data library. I hope it will be helpful for everyone who is looking for a DataRow/DataTable/SqlCommandBuilder/DbDataAdapter alternative in .NET Core.

This Microsoft article clearly demonstrates the advantages of using DataSets/DataTables to improve performance when connecting to an Azure SQL Database.

https://azure.microsoft.com/en-gb/documentation/articles/sql-database-use-batching-to-improve-performance/

corefx needs this to work nicely with Microsoft's own SQL Database service.

@j055 I don't think there's any performance optimization provided by DataSets/DataTables that isn't possible with regular commands. You can batch multiple inserts into a single DbCommand just as easily.

As easily as this?

https://msdn.microsoft.com/library/aadf8fk2.aspx

Please do tell.

How about new SqlCommand("INSERT ...; INSERT ...; INSERT ...;", conn)?

@j055 in my comment above I mentioned that I've just published the NReco.Data library, which can generate SQL commands like the old ADO.NET DbCommandGenerator/DbDataAdapter. Command batching (generating several SQL statements into one DbCommand) can easily be added; if you really need this feature, feel free to open an issue with a batching feature request. I believe that with NReco.Data you'll definitely get better performance than with DataTable/DataAdapter in the old .NET Framework.

@roji I don't know how many people want to build up their commands with a big concatenated string. The term 'SQL Injection attack' also springs to mind. Of course it's doable. I just like a more elegant approach.

@VitaliyMF I'm sure your library is good for new projects, but we have 10 years' worth of data access code using DataSets/DataTables.

@j055 SQL injection attacks are about concatenating _external_ input into your SQL, this has nothing to do with the batching question. In other words you can simply do:

```c#
new SqlCommand("INSERT INTO mytable (myfield) VALUES (@p1); INSERT INTO mytable (myfield) VALUES (@p2)", conn)
```

and be completely safe.
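As an illustration of the same idea, building a batch of parameterized INSERTs in a loop might look like this; the table and column names are placeholders.

```csharp
// Hedged sketch of batching several parameterized INSERTs into one command
// round trip; "mytable"/"myfield" are placeholder names.
using System;
using System.Data.SqlClient;
using System.Text;

class BatchSketch
{
    static void Main()
    {
        var values = new[] { "alpha", "beta", "gamma" };
        var sql = new StringBuilder();
        using (var cmd = new SqlCommand())
        {
            for (int i = 0; i < values.Length; i++)
            {
                // Each value is bound through a parameter, so there is no
                // injection risk even though the SQL text is concatenated.
                sql.Append($"INSERT INTO mytable (myfield) VALUES (@p{i});");
                cmd.Parameters.AddWithValue($"@p{i}", values[i]);
            }
            cmd.CommandText = sql.ToString();
            // Against a live server: cmd.Connection = conn; conn.Open(); cmd.ExecuteNonQuery();
            Console.WriteLine(cmd.Parameters.Count); // 3
        }
    }
}
```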

Whether you like this API or find DataTable/DataSet more elegant is a different (and not necessarily relevant) question. I happen to agree that there should be a structured batching ADO.NET API, which is why I've opened https://github.com/dotnet/corefx/issues/3688, but DataTable/DataSet definitely aren't the answer to that problem.

The main point here regarding what you said is that batching is entirely possible without DataTable/DataSet, so no optimization is being taken away by removing them.

DataTable/DataSet is a layer on top of basic ADO.NET (which is about connections and commands). IMHO it's neither here nor there: not quite an ORM, not strongly-typed and generally not so good. There may be a place for something like this out in the world, but it definitely shouldn't be a part of the basic ADO.NET layer in any way. The separation is important.

@roji DataSets are good for other things too and I guess we are not the only people using them in our applications. I think a lot of people will find it difficult to migrate without them. Yes, put it in a different project, that's fine.

The fact that they are not strongly-typed is actually very useful when transporting SQL data from stored procs to the business layer and back again. ORMs, yuck!

@sqmgh even using IEnumerable of SqlDataRecord, I don't see how you can use it with Core SqlBulkCopy, since WriteToServer accepts only a parameter of type DbDataReader.

@max-favilli I don't use it with SqlBulkCopy. In the cases where I might use a SqlBulkCopy, simply using a TVP to do the bulk inserts has ended up performing better.

Obviously not a great answer if you are porting a ton of code that already uses SqlBulkCopy, or if you have a different case than mine where SqlBulkCopy does end up getting that last tiny bit of needed performance.
However, I had already switched away pre-netcore so neither of those things applied to me.

@sqmgh Sorry for answering late. I have used TVPs in the past; you need to define table types in SQL to use a TVP in a stored procedure, as far as I remember, right?

@max-favilli Indeed you do.
I haven't run into this as being an issue personally, but I haven't had to deal with dynamic inserts on really wide tables.
I guess if that requirement really became an issue you could dynamically make a table type for each query and drop it afterwards?

@sqmgh You're right, I hadn't thought about doing that.

Another common (where I work) use for DataTable is to examine the schema of the output of a stored procedure. We have enormous dependence upon da.FillSchema(dt, SchemaType.Mapped); where da is a DataAdapter and dt is a DataTable. If we could get detailed reflection over the output (specifically column names) of a stored procedure another way, it would be welcomed.

@PaluMacil as has been discussed in dotnet/runtime#14302, .NET Core includes an alternative API for examining the schema of a query (stored procedure or otherwise): see dotnet/runtime#16305. The new API has no dependencies on DataTable/DataSet and is superior in that it's strongly-typed.

Thanks. I spent quite a while looking through issues but there is a lot to read and I hadn't gotten there yet. I appreciate the pointer.

@roji Reading through the thread again, it still seems to be missing the point. The schema can be retrieved for a table, but I need it for a stored procedure result, which doesn't appear to be addressed there.

@PaluMacil .NET Core's new resultset schema API (#5915) is DbDataReader.GetColumnSchema(). It replaces DbDataReader.GetSchemaTable() which does the same thing but returns a DataTable instead of a strongly-typed object model. Both APIs can be called on any DbDataReader, whether it represents results for a table query, a stored procedure, or anything else. You should be able to use DbDataReader.GetColumnSchema() to get the schema for a stored procedure result.

Note that this shouldn't be confused with dotnet/runtime#15953, which is about providing a similar strongly-typed replacement for DbConnection.GetSchema(); while dotnet/runtime#16305 is about _result set_ schema, dotnet/runtime#15953 is about _database/table_ schema.

If things are still unclear let me know.
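A minimal sketch of the GetColumnSchema() approach described above; the stored procedure name and connection string are hypothetical placeholders, and running it requires a live SQL Server.

```csharp
// Hedged sketch of inspecting a stored procedure's result-set schema via the
// strongly-typed DbDataReader.GetColumnSchema() API; names are placeholders.
using System;
using System.Data;
using System.Data.Common;
using System.Data.SqlClient;

class SchemaSketch
{
    static void PrintSchema(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("dbo.GetOrders", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            conn.Open();
            // SchemaOnly asks the server for column metadata without
            // materializing any rows.
            using (DbDataReader reader = cmd.ExecuteReader(CommandBehavior.SchemaOnly))
            {
                foreach (DbColumn column in reader.GetColumnSchema())
                    Console.WriteLine($"{column.ColumnName}: {column.DataTypeName} (nullable: {column.AllowDBNull})");
            }
        }
    }
}
```

Unlike the old GetSchemaTable(), this returns a typed collection of DbColumn objects rather than a DataTable.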

@roji They're clear now! Thanks.

Would this include the ability to read Excel (e.g. System.Data.OleDb)?
Just about to start a new project and having to use 4.6.1/2 because this hasn't been implemented (yet?). Thanks.

@alastairanderson for reading/writing OpenXml-based Excel files (xlsx) you may use netstandard build of EPPlus: https://www.nuget.org/packages/Shaman.EPPlus/

@stephentoub as I understand it, the DataSet/DataTable stuff comes back to the System.Data.Common assembly for now; does this mean that if all these legacy classes are not needed, there is no possibility to exclude them? Or are you planning to extract these classes to a separate assembly (say, System.Data.DataSet)?

does this mean that if all these legacy classes are not needed, there is no possibility to exclude them? Or are you planning to extract these classes to a separate assembly (say, System.Data.DataSet)?

I'll let @terrajobst and @weshaggard comment.

@VitaliyMF are there scenarios where you need only parts of what is now in System.Data.Common? Is it the size on disk that is of concern for you or is there another reason you want them separated?

@weshaggard since we already have a small and lightweight System.Data.Common in netstandard1.6, I think it is a good idea to keep the ability to have a minimal footprint for .NET Core microservices / .NET Native apps.

The main concern is not the size on disk, of course, but excessive dependencies: in .NET Core we have the possibility to declare dependencies only on the (small) assemblies that are really needed; a good example is System.Collections and System.Collections.NonGeneric - if my library doesn't need non-generic collections, I'm not obligated to depend on those classes.

Actually, the question should be: what is the reason to have DataSet/DataTable/DataRow in System.Data.Common? Right now (in netstandard1.6) we have a simple and lightweight System.Data.Common, so there should be a very important reason to add a lot of code to this assembly instead of using a new one.

@VitaliyMF I agree with you in principle, and perhaps we can even do that in this case, but as with everything there is a balancing act we have to play in determining how many pieces we want to split things into. Having tons of tiny libraries isn't always better than having a single large library; they each have different trade-offs. For example, with tons of tiny libraries there are acquisition and deployment problems: how do people determine where to find a type, and how do they figure out what they need to deploy?

We are definitely interested in the minimal footprint scenarios and we will be working on making sure we have reasonably sized building blocks while not causing people to pay too much for what they aren't using.

@stephentoub while doing this work were there any types that were already in S.Data.Common that had new members that pulled in other types like DataTable/DataSet, or do you think those would layer nicely on top?

We use DataTables extensively. When you are reading records off a queue at high throughput and that data needs to be inserted in batches, you create stored procedures that take user-defined table types, then create a DataTable that emulates the type. Now you can pass a full set of data to the stored procedure and do inserts, updates, or merges in a set-based way. Trying to do one-off inserts at very high volumes simply doesn't hold up. We gotta have DataTables.

@weshaggard and @VitaliyMF, wanted to add a +1 for keeping DataSet and friends in a separate package. These components don't really have anything to do with the base ADO.NET layer - they are a data access layer on top of ADO.NET, similar to how ORMs are a layer. The initial decision to drop DataSet/DataTable entirely in netstandard shows to what extent this is true - these components are considered deprecated by many and are only being reintroduced for backwards compatibility. They therefore seem like an ideal candidate for splitting off into a separate package.

@ahhsoftware, the decision to reintroduce DataTable has already been made - that's not being discussed. The only question is whether it should be in the System.Data.Common library or in its own library.

Also, as I wrote above, DataSet/DataTable don't provide any performance benefits that can't be achieved via other means. Simply batching inserts should work just as well.

@weshaggard as a .NET component author, I understand your point about fragmentation; in some cases it is better to provide an all-in-one assembly, but in the case of .NET Standard libraries (including System.Data.Common) the situation is a bit different: a lot of benefits come if they are as small as possible (and as stable as possible).

We all remember the "API-changes-hell" of the last few years (I'm talking about incompatibilities in k-runtime/DNX/RC1/RC2 and finally 1.0), and a lot of people don't use even .NET Core 1.0 for real projects yet because they are afraid that everything may change again. From this perspective, it is much better to make only necessary/minor changes to the existing netstandard1.6 assemblies, and bring new functionality in new assemblies.

And lastly, DataSet/DataTable/DataRow is an old technology with a lot of complicated code written in the times of .NET 1.0. Do you really want to bring it back into System.Data.Common as a netstandard 2.0 library, without the possibility to refactor/enhance it in the future with minimal impact on the platform?

I know that you (MS guys) see the whole picture and maybe you have strong arguments for having DataSet/DataTable in System.Data.Common. But if you have a chance to have it separately, maybe it is better to spend more time on this now and avoid backward-compatibility problems in the future? After all, .NET Core is claimed to be a new (better) platform, not just a copy of Mono.

@ahhsoftware you can use 3rd party libraries (netcore-compatible) for generating batching inserts and table-valued types with stored procedures (you don't need to wait for DataTable, actually). If you need a concrete solution feel free to contact me (or ask on SO) - I don't think this issue is a right place for these things.

So for starters, we are in a vertical that must follow OWASP standards. High on the list is that all data access must be done through stored procedures.

Now consider that we have a service that takes an order and all associated items.

PROCEDURE PUT_ORDER
(
    @PERSON_ID INT, --FK to person placing order
    @ORDER_NAME NVARCHAR(32), --name of order
    @ORDER_ITEMS dob.ORDER_ITEM_UDT
)
BEGIN


@weshaggard @stephentoub DataTable/DataSet need System.Data.Common base types, and System.Data.Common has APIs which return DataTable objects.

@saurabh500 if you're intending to bring back the DataTable-based schema APIs in DbConnection and DbDataReader, then indeed it seems that DataTable/DataSet need to live in System.Data.Common...

@weshaggard @stephentoub DataTable/DataSet need System.Data.Common base types, and System.Data.Common has APIs which return DataTable objects.

@stephentoub @saurabh500 can you please point out the particular APIs in question? If we do have these tight API dependencies, it will tie our hands and will require us to keep these together in order to provide compatible APIs.

For example, IDataReader.GetSchemaTable:
https://github.com/dotnet/corefx/blob/release/1.0.0/src/System.Data.Common/ref/System.Data.Common.cs#L109
IDataReader was already in System.Data.Common, with this member, and DataTable was already in System.Data.Common as a result, just with an empty implementation:
https://github.com/dotnet/corefx/blob/release/1.0.0/src/System.Data.Common/src/System/Data/DataTable.cs

@stephentoub currently (in netstandard1.6) IDataReader.GetSchemaTable returns the stub "DataTable" class, and all current implementations just throw NotImplementedException from GetSchemaTable, am I right? In other words, this method is de facto deprecated and not used any more; nobody calls it from netstandard assemblies because it returns a useless DataTable stub and actually throws an exception.

In addition to that, we already have an alternative way to get a reader's data schema with IDbColumnSchemaGenerator.GetColumnSchema. It is also optional to implement, and is actually not implemented even by MS connectors like Microsoft.Data.Sqlite.

I don't know why the IDataReader.GetSchemaTable method wasn't removed in the beginning; since it is not used, is it possible to remove it and implement "GetSchemaTable" the way "GetColumnSchema" works? I mean, a separate interface + extension method for IDataReader or DbDataReader.

Simply put, if I can't pass sets of data to stored procedures, I can't move to .NET Core, and at this point in time a DataTable is what provides the means. You already have Entity Framework for those who can get by with lighter-weight SQL, but for the heavy lifting that ADO.NET provides, making it in a way that is not full-featured is silly.

https://github.com/ahhsoftware/EzAdoVSSolution - look at how the orders are passed


@ahhsoftware, you seem to be misunderstanding the conversation.

The current version of .NET Core does not have DataSet/DataTable, this will not change. The next version of .NET Core _will_ have DataSet/DataTable - this has already been confirmed here. There's no need to say how important these components are for you.

However, it's important to understand that what you're accomplishing with DataSet/DataTable can be accomplished with _other_ means. In other words, if you need to port to .NET Core _today_ than you have options for dropping DataSet/DataTable. Otherwise you can wait for netstandard20 which will have DataSet/DataTable, and your current code should work without changes then.

fair enough


Hey, while you're in there you might want to discover a better approach to processing JSON results. As it stands there is a character limit on ExecuteScalar, so we typically have to loop through the reader with a StringBuilder at index 0. It would be nice to have a method cmd.ExecuteJson() or something along those lines that would do the same thing under the covers.

Maybe not make things sealed either; if I could derive from a parameter I could build out a more customized tier.

Just thoughts


@roji You said DataSet and DataTable will be in the next version of .NET Core - where did you see that? I didn't see this feature in the roadmap.

@AdrienTorris just scroll up this issue to see where it's closed: https://github.com/dotnet/corefx/pull/12426

What version of the standard is this included in?

@michal-ciechan According to apisof.net, it's .NET Standard 2.0.

Unless I have a way of pushing sets of data into stored procedures, we have no use for ADO.NET Core. It's that simple.


@ahhsoftware I already mentioned previously that you can pass table-valued parameters with netstandard1.6; you don't need to wait for netstandard2.0 and DataSet/DataTable. The .NET Core implementation of SqlClient supports this type of parameter, and you can pass a TVP with a custom implementation of DbDataReader. As an alternative, you can use the NReco.Data library, which implements a RecordSet structure and a RecordSetReader that can be used for specifying a TVP in a similar way to DataTable.

No point in fixing things if they're not broken; Core might not be for us moving forward.


@VitaliyMF I tried the NReco.Data library and it is a nice library, BUT can you tell me how to make queries on a table with a custom schema name, or how to make joins?

@wendt88 could you add an issue with your questions to nreco/data? I think it is off-topic to provide answers here. (It is possible to specify a custom schema name; complex selects with joins should be declared as "virtual" dataviews processed by the library on the .NET side.)

So... rewrite and retest the entire DAL, or wait. Going to have to wait - far too much code.
