Hi, I have previously implemented a custom target that uses SQL Server's proprietary bulk insert API. I'm thinking about updating it for inclusion into the main NLog project.
Would this contribution be likely to be accepted? I guess it would live in its own project/package, NLog.Extensions.SqlServer or something.
Cheers, Aled.
Unless you are writing 50,000 records/sec. Then I think an NLog target that could concatenate multiple SQL statements into a single transaction would be more useful (alternatively using a DataTable). Anyway, I'm not using the DatabaseTarget, so maybe I'm not the best person to answer this question :)
Thinking about this.
With an async/buffer target, it could make sense.
Was thinking of a NLog.Target.SqlServerBulk package, but that would require creating a new repo.
@aled do you think we could properly unit test this?
Unit tests should be no problem. Am I right in thinking that the builds/tests are run on appveyor? If so, SQL Server is available in their build environments, so it should be possible to run tests against a real database, rather than mocking it out.
@aled
Need any help on this?
A late answer to your question:

> Am I right in thinking that the builds/tests are run on AppVeyor? If so, SQL Server is available in their build environments, so it should be possible to run tests against a real database, rather than mocking it out.
Yes, those run on AppVeyor. You could do it without mocking, but it would make the tests slower. Also it would be an integration test (which we already have for the DB target). So I prefer mocking :)
I wasn't sure whether I should start it or not, as the issue was assigned to you.
The only area I might need help with is testing with all the various runtimes that NLog supports (Xamarin iOS, Windows Phone, Silverlight, etc.). I'll make it work with .NET Core and .NET Framework initially and then look at the others.
> I wasn't sure whether I should start it or not, as the issue was assigned to you.
That means I have to react ;) (or someone else). You can't set a notification to unread on GitHub :(
> The only area I might need help with is testing with all the various runtimes that NLog supports (Xamarin iOS, Windows Phone, Silverlight, etc). I'll make it work with .NET Core and .NET Framework initially and then look at the others.
IMO you could disable it for Xamarin iOS, Windows Phone and Silverlight. See the various #if regions in the code.
Initial implementation here: https://github.com/aled/NLog.Targets.SqlServerBulk
Only implemented as a .NET Standard library for now. Probably needs more error handling (this is one area of NLog I'm not very familiar with, tbh).
Any criticism or comments welcome.
briefly looked at, but looks good!
What is the performance difference between your MsSqlServer-specific approach, and this more generic approach that works for MsSql, MySql and SQLite?
It would "just" require loading the DbProvider's SqlDataAdapter, and then one could send a bulk DataTable to many different databases.
SqlBulkCopy is SQL Server only and AFAIK the recommended approach when using SQL Server.
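For context, a minimal sketch of what SqlBulkCopy usage looks like. The table and column names here are invented for illustration, and `connectionString` is a placeholder:

```C#
using System;
using System.Data;
using System.Data.SqlClient;

string connectionString = "...";  // your SQL Server connection string

// Hypothetical log table with (Timestamp, Level, Message) columns.
var table = new DataTable();
table.Columns.Add("Timestamp", typeof(DateTime));
table.Columns.Add("Level", typeof(string));
table.Columns.Add("Message", typeof(string));
table.Rows.Add(DateTime.UtcNow, "Info", "Hello");

using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (var bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "dbo.LogEntries"; // invented name
        bulk.BatchSize = 1000;         // rows per round-trip
        bulk.WriteToServer(table);     // single bulk insert, no per-row commands
    }
}
```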
But if SqlDataAdapter is just as fast and works on many database types, why not focus on that?
> But if SqlDataAdapter is just as fast and works on many database types, why not focus on that?
Good point, but I don't know for sure if that's true. E.g. the bulk insert could ignore triggers and checks, and that's MS specific.
There is nothing about SqlBulkCopy there? (tl;dr on the post - just scanned it)
FYI: SqlBulkCopy is still a recommended approach from MS: https://docs.microsoft.com/en-us/previous-versions/sql/sql-server-2008/dd425070(v=sql.100)
> SqlBulkCopy is still a recommended approach from MS
It would be silly if Microsoft did not recommend the MS-approach for a MS-database, else there would not be any lock-in of technology.
The goal is to avoid calling the database 100 times to save 100 LogEvents. Whether using SqlBulkCopy or SqlDataAdapter, one gets the same result by saving a DataTable in one call.
I think the difference between SqlDataAdapter and SqlBulkCopy is the ability to save 50,000 records/sec vs 100,000 records/sec. I don't think anyone is planning to reach those levels.
Anyway, don't mind me. Please continue your quest. I was just curious about the choice of direction.
> It would be silly if Microsoft did not recommend the MS-approach for a MS-database, else there would not be any lock-in of technology.
That's kind of biased ;) I have my faith in MS since .NET Core/Nadella :)
I think the bulk-insert-sql-server-only-target is a nice addition, even when it's SqlServer only.
> I think the bulk-insert-sql-server-only-target is a nice addition, even when it's SqlServer only.
Sure is. Especially when changing it to inherit from the builtin DatabaseTarget, with support for IRawValue.
> inheritance to the builtin DatabaseTarget
That's a very good idea.
@aled could you inherit from the current DatabaseTarget ?
> @aled could you inherit from the current DatabaseTarget?
I guess so, hadn't thought about it. What would be the advantage of doing this?
> What is the performance difference between your MsSqlServer specific approach, and this more generic approach that works for MsSql, MySql and SQLite?
Looking at the code in the linked StackOverflow question, it doesn't seem like a generic approach; the C# code is different depending on which database it is writing to, and it requires a stored procedure on the database (again, different depending on which database it is).
That SQL Server technique (passing a table-valued parameter into a stored procedure) is in my experience likely to be similar in performance to a bulk insert; though with the disadvantage that it requires a stored procedure. The advantage of that approach is the ability to do clever things in the stored procedure, for instance inserting new rows or updating rows if they already exist; but for logging it is not particularly useful.
The MySql technique (passing a big JSON payload and parsing it in the stored procedure) is perfectly legitimate, but it isn't the fastest technique on SQL Server. So I would prefer to keep the SqlServerBulk target as it is, i.e. specific to, and optimised for, SQL Server.
I would create a MySqlBulkInsertTarget that works with MySql, if that was my database.
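For reference, the table-valued-parameter technique discussed above looks roughly like this. The type and procedure names are invented for illustration, and the user-defined table type and stored procedure must already exist on the server:

```C#
using System.Data;
using System.Data.SqlClient;

// Assumes: CREATE TYPE dbo.LogEntryType AS TABLE (Level nvarchar(10), Message nvarchar(max))
// and a stored procedure dbo.InsertLogEntries(@entries dbo.LogEntryType READONLY).
string connectionString = "...";  // placeholder

var dataTable = new DataTable();
dataTable.Columns.Add("Level", typeof(string));
dataTable.Columns.Add("Message", typeof(string));
dataTable.Rows.Add("Info", "Hello");

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.InsertLogEntries", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    var param = cmd.Parameters.AddWithValue("@entries", dataTable);
    param.SqlDbType = SqlDbType.Structured;   // marks it as a TVP
    param.TypeName = "dbo.LogEntryType";
    conn.Open();
    cmd.ExecuteNonQuery();                    // whole table in one call
}
```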
@aled Sorry about that example. Just saw the use of MySqlDataAdapter and DataTable. Didn't see the JSON-payload
Maybe this is a more generic example:
```C#
var mySqlDataAdapter = new MySqlDataAdapter();
MySqlCommand ins = new MySqlCommand("INSERT INTO test (id, name) VALUES (?p1, ?p2)", conn); // Replace with SQL where parameters matches columns
mySqlDataAdapter.InsertCommand = ins;
ins.UpdatedRowSource = UpdateRowSource.None;
AddParametersToCommand(ins); // One parameter for each column
var dataTable = CreateDataTable(); // Initialize columns in DataTable
foreach (var l in batch)
AddRow(dataTable, l.LogEvent);
mySqlDataAdapter.UpdateBatchSize = 100;
mySqlDataAdapter.Update(dataTable);
```
Most efficient way to insert Rows into MySQL Database
DataAdapter.Update slowdown
This looks like a good solution for MySql. But I can't see how it would work on SQL Server (correct me if I am wrong).
Having said that, most of the code in the SqlServerBulk target is not actually specific to SQL Server. It should be possible to factor out the generic code into an abstract class, and have a subclass for each database, implementing the bulk insert in the most appropriate way for that database.
> This looks like a good solution for MySql. But I can't see how it would work on SQL Server (correct me if I am wrong).
I was guessing the logic would be the same for SQL Server. "Just" replace MySqlDataAdapter with SqlDataAdapter and MySqlCommand with SqlCommand (e.g. just like the NLog DatabaseTarget's DbProvider is used for deciding on the specific type of DbConnection).
But yes, one could start with an abstract target that initializes DataTable.Columns and fills it with DataRows. Then the specialized class could decide how to handle that DataTable.
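A rough sketch of that split (all names here are invented; this is not existing NLog code):

```C#
using System.Collections.Generic;
using System.Data;
using NLog.Common;
using NLog.Targets;

// Generic part: build the DataTable from a batch of log events.
public abstract class BulkDatabaseTarget : TargetWithLayout
{
    protected DataTable BuildDataTable(IList<AsyncLogEventInfo> batch)
    {
        var table = new DataTable();
        // ... add one DataColumn per configured target parameter,
        // then one DataRow per rendered log event ...
        return table;
    }

    // Database-specific part: each subclass saves the table its own way,
    // e.g. SqlBulkCopy for SQL Server, MySqlDataAdapter for MySQL.
    protected abstract void WriteDataTable(DataTable table);
}
```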
> That SQL Server technique (passing a table-valued parameter into a stored procedure) is in my experience likely to be similar in performance to a bulk insert; though with the disadvantage that it requires a stored procedure.
Also, in that case you need to create a Table-Valued Parameter Type, which isn't always what you want.
> Also in that case you need to create a Table-Valued Parameter Type. Which isn't always what you want.
Not sure I understand. The example shown here does not depend on a custom user-defined table type. It is not sending the DataTable directly over the wire. E.g. the Microsoft SqlDataAdapter will build an internal SqlCommandSet (a collection of DbCommands) and send them over in a single batch.
For passing a table-valued parameter into a stored procedure, you need to create a Table-Valued Parameter Type? See https://docs.microsoft.com/en-us/dotnet/framework/data/adonet/sql/table-valued-parameters
> for passing a table-valued parameter into a stored procedure, you need to create a Table-Valued Parameter Type?
But the Microsoft SqlDataAdapter takes the rows in the DataTable and converts them into DbCommands that are then bulk-saved into the database using SqlCommandSet. I repeat that my example doesn't send the DataTable as a Table-Valued parameter.
> But the Microsoft SqlDataAdapter takes the rows in the DataTable and converts them into DbCommands that are then bulk-saved into the database using SqlCommandSet. I repeat that my example doesn't send the DataTable as a Table-Valued parameter.
I don't think I've ever used SqlDataAdapter.Update(). Looks like a good way to do bulk updates without needing a TVP.
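For comparison, the SQL Server flavour of the MySQL example above would presumably look like this (untested sketch; note SQL Server's `@p1`/`@p2` parameter syntax instead of MySQL's `?p1`/`?p2`, and the table/column names are invented):

```C#
using System.Data;
using System.Data.SqlClient;

string connectionString = "...";  // placeholder

var dataTable = new DataTable();
dataTable.Columns.Add("id", typeof(int));
dataTable.Columns.Add("name", typeof(string));
dataTable.Rows.Add(1, "Hello");

using (var conn = new SqlConnection(connectionString))
{
    var adapter = new SqlDataAdapter();
    var ins = new SqlCommand("INSERT INTO test (id, name) VALUES (@p1, @p2)", conn);
    ins.Parameters.Add("@p1", SqlDbType.Int, 0, "id");          // map to DataTable column
    ins.Parameters.Add("@p2", SqlDbType.NVarChar, 100, "name");
    ins.UpdatedRowSource = UpdateRowSource.None;                // skip per-row refresh
    adapter.InsertCommand = ins;
    adapter.UpdateBatchSize = 100;                              // rows per batched command
    conn.Open();
    adapter.Update(dataTable);                                  // sends inserts in batches
}
```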
> Initial implementation here: https://github.com/aled/NLog.Targets.SqlServerBulk
> Only implemented as a .net standard library for now. Probably needs more error handling (this is one area of NLog I'm not very familiar with tbh)
> Any criticism or comments welcome.
Hey, this looks great. I'm planning on using something like this with the buffering wrapper. I'm not a fan of copy/pasting code, so I was wondering if this is something that was being considered for incorporation into a future release. As it stands right now, sending 5000 individual insert statements as separate commands every few seconds in my app is abysmal. The old log4net database adapter had a built-in buffer so I was surprised to see NLog's OOTB support for bulk inserts was so limited.
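For anyone landing here: wrapping the Database target in a BufferingWrapper is the usual way to batch writes today. A minimal config sketch (table, column and parameter names are invented; the connection string is a placeholder):

```xml
<nlog>
  <targets>
    <target xsi:type="BufferingWrapper" name="buffered"
            bufferSize="100" flushTimeout="2000">
      <target xsi:type="Database" name="db"
              connectionString="..."
              commandText="INSERT INTO LogEntries (Level, Message) VALUES (@level, @message)">
        <parameter name="@level" layout="${level}" />
        <parameter name="@message" layout="${message}" />
      </target>
    </target>
  </targets>
  <rules>
    <logger name="*" minlevel="Info" writeTo="buffered" />
  </rules>
</nlog>
```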
@D9001 Can you show me the log4net appender that performs bulk insert? If you mean the default AdoNetAppender, then I think it is merely reusing the same connection and DbCommand when doing batching. I think the NLog DatabaseTarget can easily be modified to do the same.
If I create a pre-release nuget-package of NLog, would you be willing to test performance?
Hey, I would be very glad to test it. I may be mistaken about how the AdoNetAppender worked. I hadn't used it in a microservice environment where I would've tested its performance to the same extent.
@D9001 Created https://github.com/NLog/NLog/pull/3131#issuecomment-465779908 (With custom nuget-package-build attached).
Created #3150 as a prototype for using SqlDataAdapter / MySqlDataAdapter
Created #3764 that enables batching with transactions, that improves performance.
Still not as fast as SqlBulkCopy, but that will never work for the generic DatabaseTarget.
@304NotModified Guess this should be closed, since we will not be blocking anyone from creating their own NuGet package for SqlBulkCopy.
I guess a really cool guy/girl could implement a custom DbProvider that creates IDbConnection objects that call SqlBulkCopy underneath (maybe use BeginTransaction to signal batching). Then it would work seamlessly with the DatabaseTarget.
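A very rough sketch of that idea (all names are invented, most IDbConnection members are omitted, and the transaction wrapper is only hinted at):

```C#
using System.Data;
using System.Data.SqlClient;

// Hypothetical connection wrapper: collects rows while a "transaction" is
// open, then flushes them with SqlBulkCopy on Commit.
public class BulkCopyConnection /* : IDbConnection */
{
    private readonly SqlConnection _inner;
    private readonly DataTable _pending = new DataTable();

    public BulkCopyConnection(string connectionString)
        => _inner = new SqlConnection(connectionString);

    // BeginTransaction would return an IDbTransaction whose Commit()
    // calls Flush() below, signalling the end of a batch.

    internal void Flush()
    {
        using (var bulk = new SqlBulkCopy(_inner))
        {
            bulk.DestinationTableName = "dbo.LogEntries"; // invented name
            bulk.WriteToServer(_pending);
        }
        _pending.Clear();
    }

    // ... remaining IDbConnection members would delegate to _inner ...
}
```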
Closing, as NLog will not get a Microsoft SQL Server specific target. Instead it should be created as a separate NuGet package.