tedious: TVP is becoming a blocking operation for large data

Created on 4 Nov 2016 · 11 comments · Source: tediousjs/tedious

I am trying to call a stored procedure with a TVP as a parameter. The TVP is built from a CSV with ~30,000 rows. When the request begins, the thread gets blocked and no other request into the application is handled.

I have tried calling an SP which internally sends the response back after a 5-minute delay, and in that case the operation is async. This leads me to believe that the SP call itself is not a blocking call; rather, the build-up to the SP call, while tedious processes the data, is what blocks.

    let tvp = {
        columns: [
            { name: 'Domain', type: TYPES.VarChar, length: 100, nullable: true },
            { name: 'LanguageISO', type: TYPES.VarChar, length: 10, nullable: true },
            { name: 'Channel', type: TYPES.VarChar, length: 10, nullable: true },
            { name: 'DateStart', type: TYPES.VarChar },
            { name: 'DateEnd', type: TYPES.VarChar },
        ],
        rows: []
    };

    // some for loop building rows from the CSV
    tvp.rows.push(newRow);

     const connection = new Connection(connectionStrings[environment]);

    // on establishing the connection, execute the request
    connection.on('connect', function (err) {
        if (err) {
            console.error("Exception==>", err);
            callback(err);
        } else {
            logger.info("Connection established with sql server", stopwatch.ms);

            const request = new Request("someStoredProcedure", function (error, rowCount) {
                connection.close();
                if (!error) {
                    logger.info("Procedure call complete", stopwatch.ms);
                    if (fileIsValid) { // validation condition elided in the original snippet
                        callback(null, output);
                    } else {
                        /** excel is not valid */
                        callback('File validation Failed', output);
                    }
                } else {
                    logger.error("DB Error => ", error, stopwatch.ms);
                    callback(error.message, output);
                }
            });
            request.addParameter('Parameter1', TYPES.TVP, tvp);
            request.on('row', function (columns) {
                var row = [];
                columns.forEach(function (column) {
                    row.push(column.value);
                });
                output.push(row);
            });
            logger.info("Begin procedure call", stopwatch.ms);
            connection.callProcedure(request);
        }
    });


All 11 comments

Yes, you're right, writing large TVP is currently a blocking operation.

The main issue here is that before tedious sends the request off to the database, it builds an internal buffer containing the full request data. If you have a TVP with many, many rows as you described, building this request is going to take quite some time and will also use up a considerable amount of memory. And later, when the request buffer gets written to the socket, writing such a large buffer object might be blocking again.
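To make the effect concrete, here is a minimal standalone sketch (not tedious internals; the row count and per-row serialization are made up) of why building the full request buffer synchronously blocks the event loop:

```javascript
// Simulate a driver serializing a large TVP into one contiguous
// buffer before sending, entirely on the main thread.
const ROWS = 300000; // hypothetical row count, for illustration only

function buildRequestBufferSync(rowCount) {
  // Each "row" becomes a small chunk; all chunks are concatenated
  // into a single request buffer in one synchronous pass, so no
  // other callbacks can run until it finishes.
  const chunks = [];
  for (let i = 0; i < rowCount; i++) {
    chunks.push(Buffer.from(`row-${i};`));
  }
  return Buffer.concat(chunks);
}

// This interval cannot fire while the synchronous build is running.
const timer = setInterval(() => console.log('event loop tick'), 10);
const start = Date.now();
const buf = buildRequestBufferSync(ROWS);
console.log(`built ${buf.length} bytes in ${Date.now() - start} ms`);
clearInterval(timer);
```

The same pattern explains both symptoms from the report: CPU time spent building the buffer, and memory proportional to the full request size.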

Fixing this would require a pretty big overhaul of the internals of tedious. It's definitely something we'd like to have in the future, but it's nothing that can be fixed quickly.

:) Does a bulk upload work in a similar fashion? I basically need to send a large chunk of data to my SP. I am open to modifying the workings if a bulk upload would solve the problem.

Yeah, looks like bulk inserts work in a similar fashion. :(

Making different aspects of tedious less blocking and more stream-oriented is one of my long term goals for tedious. Currently, I'm working on the receiving and parsing side of things, as that is where most users of tedious will feel the biggest impact. Once that is tackled, we'll also look at the parts that write to the database as well.

Appreciate the effort, Arthur. Thanks

Any update on this? If I wanted to investigate this myself where would I start?

Ditto, would love to see a fix for this issue

Putting some more thought into this. Would it be possible to add a setTimeout/nextTick in between packet writes to allow other processing to also take place and not completely block the event loop?

If so where would be a likely spot for this to go?
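The yield-between-writes idea above can be sketched in isolation (writePacket, rows, and done are hypothetical stand-ins, not tedious APIs):

```javascript
// Write rows one packet at a time, yielding to the event loop
// between writes via setImmediate so timers and I/O callbacks
// can run instead of being starved by one long synchronous pass.
function writeRowsNonBlocking(rows, writePacket, done) {
  let i = 0;
  function writeNext() {
    if (i >= rows.length) return done();
    writePacket(rows[i++]);
    // Defer the next write to the next event loop iteration.
    setImmediate(writeNext);
  }
  writeNext();
}
```

In practice one would likely batch (say, a few hundred rows per tick) rather than yield after every single packet, to keep scheduling overhead down; where such a hook could live inside tedious's packet-writing code is the open question in the comment above.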

@arthurschreiber Looks like not, as you said above, the required changes would be pretty extensive. I am still interested in looking into this if you'd accept a PR but would likely need a little guidance for first steps.

Hi @RajatBanerjee, do you still get the same slow performance when using BULK INSERT operations instead? As stated in the MSDN docs, "table-valued parameters perform well for inserting less than 1000 rows."
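For reference, a bulk insert with tedious's BulkLoad API might look roughly like this. This is a hedged sketch: the table name and staging setup are hypothetical, the column layout mirrors the TVP from the issue, and the exact BulkLoad surface varies by tedious version (newer releases pass rows to `execBulkLoad` instead of calling `addRow`), so check the docs for your version:

```javascript
// connection is an open tedious Connection; TYPES is
// require('tedious').TYPES. 'dbo.ImportStaging' is a made-up
// staging table standing in for wherever the CSV rows should land.
function bulkInsertRows(connection, TYPES, rows, done) {
  const bulkLoad = connection.newBulkLoad('dbo.ImportStaging', function (err, rowCount) {
    done(err, rowCount);
  });
  // Columns mirror the TVP definition from the original report.
  bulkLoad.addColumn('Domain', TYPES.VarChar, { length: 100, nullable: true });
  bulkLoad.addColumn('LanguageISO', TYPES.VarChar, { length: 10, nullable: true });
  rows.forEach(function (row) {
    bulkLoad.addRow(row); // older-style API; newer tedious takes rows in execBulkLoad
  });
  connection.execBulkLoad(bulkLoad);
}
```

As noted earlier in the thread, though, bulk loads in tedious of this era were serialized in a similar blocking fashion, so this changes the API shape more than the event-loop behavior.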

:tada: This issue has been resolved in version 8.1.0 :tada:

The release is available on:

Your semantic-release bot :package::rocket:

