Increase max bulk_batch_size
The previous limit was a max of 1000 query parameters. This is changed
to a max of 1000 rows (the maximum number of rows allowed in an INSERT) or
2050 query parameters (MS SQL reports a maximum of 2100 parameters, but a
few parameters are reserved for executing the query).
danifus committed Nov 8, 2020
1 parent 7982506 commit eb0a0c8
Showing 1 changed file with 11 additions and 3 deletions.
14 changes: 11 additions & 3 deletions sql_server/pyodbc/operations.py
@@ -43,11 +43,19 @@ def bulk_batch_size(self, fields, objs):
         are the fields going to be inserted in the batch, the objs contains
         all the objects to be inserted.
         """
-        objs_len, fields_len, max_row_values = len(objs), len(fields), 1000
-        if (objs_len * fields_len) <= max_row_values:
+        # Inserts are capped at 1000 rows. Other operations do not have this
+        # limit.
+        objs_len = min(len(objs), 1000)
+        fields_len = len(fields)
+        # MSSQL allows a query to have 2100 parameters but some parameters are
+        # taken up defining `NVARCHAR` parameters to store the query text and
+        # query parameters for the `sp_executesql` call. This should only take
+        # up 2 parameters but I've had this error when sending 2098 parameters.
+        max_query_params = 2050
+        if (objs_len * fields_len) <= max_query_params:
             size = objs_len
         else:
-            size = max_row_values // fields_len
+            size = max_query_params // fields_len
         return size

     def bulk_insert_sql(self, fields, placeholder_rows):
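To illustrate the patched logic, here is a minimal standalone sketch of the new `bulk_batch_size` computation. The function signature and constant names (`MAX_INSERT_ROWS`, `MAX_QUERY_PARAMS`) are hypothetical; the real method lives on the backend's `DatabaseOperations` class and takes `fields`/`objs` sequences rather than counts.

```python
# Standalone sketch of the batch-size logic introduced by this commit.
# Constant names are illustrative, not from the actual source.
MAX_INSERT_ROWS = 1000    # T-SQL caps a multi-row INSERT at 1000 row values
MAX_QUERY_PARAMS = 2050   # 2100 server limit, minus headroom for sp_executesql

def bulk_batch_size(num_fields, num_objs):
    """Return how many rows fit in one parameterised INSERT batch."""
    # Never batch more rows than a single INSERT statement allows.
    objs_len = min(num_objs, MAX_INSERT_ROWS)
    if objs_len * num_fields <= MAX_QUERY_PARAMS:
        # Everything fits under the parameter limit in one batch.
        return objs_len
    # Otherwise, take as many whole rows as the parameter budget permits.
    return MAX_QUERY_PARAMS // num_fields

# Example: 5000 rows of 7 fields -> 2050 // 7 = 292 rows per batch.
print(bulk_batch_size(7, 5000))  # -> 292
```

Under the old code, the same 7-field insert would have been capped at `1000 // 7 = 142` rows per batch, so the change roughly doubles the rows sent per round trip for wide models.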
