Optimal batch size for SQL Server #121
Hello @premsai,
Is it possible to get your SQL table definition, including indexes and triggers? You can send it in private here: [email protected]
It will be easier for us to give you a better answer with that information, as 1h30 for only 5 million rows looks very slow at first sight.
Best Regards,
Jon

Hello @premsai,
Since our last conversation, we haven't heard from you. Don't hesitate to contact us if you need further information.
Best regards,
Jon

Hi, it is a straightforward table. The definition is:
Create table pos(
posid bigint identity(1,1) not null,
aliasid int null,
aliasid2 int null,
assetid varchar(50),
assetid2 varchar(50),
assetid3 varchar(50),
assetname varchar(255),
assetissuername varchar(255),
invtype varchar(60),
etc....
The table has about 100 columns of varchar, int, decimal(38,5), and datetime data types.
We have one clustered index on the ID field and 3 non-clustered indexes: one on a code field, one on a date field, and another on a second date field. There are no foreign keys on the table; any data we load is inserted straight into this table.
Thanks,
Premsai

Hello @premsai,
Thank you for the additional information.
This is what I expected: your table has hundreds of columns, which explains why this is long.
In your case, one thing to make sure of is that the SqlBulkCopy is executed directly in the destination table and not in a temporary table, as it will make the operations way longer otherwise.
By making sure the SqlBulkCopy is executed in the destination table, you will gain a performance increase. However, you will not be able to output values such as the identity value.
So all options such as specifying the identity (https://dapper-plus.net/getting-started-identity-propagation#auto-identity-propagation) must be removed (you can specify it as a key instead).
See the following example: https://dotnetfiddle.net/bdiM7D, where you will see the following lines in the log returned:
SqlBulkCopy: [Invoice]
-- BulkCopyTimeout:120
So in this case, we know that the SqlBulkCopy has been done in the destination table [Invoice].
Let me know if you see any performance gains or if the SqlBulkCopy was already made in the destination table.
Best Regards,
Jon
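A minimal sketch of the mapping described above, using the Dapper Plus mapping API from the linked docs. The `Pos` entity and its property names are illustrative assumptions based on the table definition earlier in the thread, not the asker's actual model:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using Z.Dapper.Plus;

// Illustrative entity; only a few of the ~100 columns are shown.
public class Pos
{
    public long PosId { get; set; }
    public int? AliasId { get; set; }
    public string AssetId { get; set; }
    public string AssetName { get; set; }
}

public static class PosBulkLoader
{
    public static void Load(string connectionString, List<Pos> rows)
    {
        // Map posid as a plain key rather than an identity, so the library
        // does not need to output generated identity values back to the
        // entities. This is what lets SqlBulkCopy write directly into the
        // destination [pos] table instead of going through a temporary table.
        DapperPlusManager.Entity<Pos>()
            .Table("pos")
            .Key(x => x.PosId);

        using (var connection = new SqlConnection(connectionString))
        {
            connection.BulkInsert(rows);
        }
    }
}
```

With this mapping, the returned log should show the destination table directly (e.g. `SqlBulkCopy: [pos]`), as in the dotnetfiddle example above.
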
Thanks for your reply, let me try that and I will get back to you soon. Take care,
Prem

Hello @premsai,
Since our last conversation, we haven't heard from you. How is your evaluation going?
Let me know if you need further assistance.
Best regards,
Jon

Hi,
I need to get proper support and also the sales information so that we can talk further on procuring the license.
Thanks,
Prem

Hello @premsai,
Sure! You can contact us directly at [email protected]. I will be able to assist you with any questions or concerns.
Best regards,
Jon

Send me the details of your sales team and contact info to
***@***.***

Hi,
We are in the process of evaluating your library for our firm. Our tech stack is .NET Core, C#, and SQL Server.
I have a massive data load problem I am trying to solve: about 5 million rows to be inserted. What is the best approach (list size, connection handling, etc.)? We want the load to finish in a reasonable time, like 30 minutes or so; using the library, I was able to get it done in an hour and 30 minutes. Can you help?
Thanks,
P
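For context, a minimal sketch of one common way to structure such a load: splitting the rows into chunks and reusing a single open connection across all inserts. The `Pos` entity is reused from the sketch earlier in the thread, and the 100,000 chunk size is an illustrative assumption to tune against your own timings, not a maintainer recommendation:

```csharp
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using Z.Dapper.Plus;

public static class ChunkedLoader
{
    public static void Insert(string connectionString, List<Pos> rows)
    {
        const int chunkSize = 100_000; // illustrative; measure to find what works best

        using (var connection = new SqlConnection(connectionString))
        {
            // Open once and reuse the same connection for every chunk,
            // avoiding repeated open/close overhead across the 5M rows.
            connection.Open();

            for (int i = 0; i < rows.Count; i += chunkSize)
            {
                var chunk = rows.GetRange(i, Math.Min(chunkSize, rows.Count - i));
                connection.BulkInsert(chunk);
            }
        }
    }
}
```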