Increase max bulk_batch_size
The previous limit was a maximum of 1000 query parameters. This is changed
to a maximum of 1000 rows (the most rows allowed in a single insert) or
2050 query parameters (MS SQL reports a maximum of 2100 parameters, but a
few of those are reserved for executing the query).
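
As a worked example of the new sizing (illustrative field counts, not from the commit): a model with 7 fields was previously batched at 1000 // 7 = 142 rows per statement; under the new limit it gets min(1000, 2050 // 7) = 292 rows, while a single-field model now reaches the full 1000-row insert cap instead of being held to 1000 parameters.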
danifus committed Nov 8, 2020
1 parent 7982506 commit 61e50ea
Showing 1 changed file with 9 additions and 6 deletions.

sql_server/pyodbc/operations.py
@@ -43,12 +43,15 @@ def bulk_batch_size(self, fields, objs):
         are the fields going to be inserted in the batch, the objs contains
         all the objects to be inserted.
         """
-        objs_len, fields_len, max_row_values = len(objs), len(fields), 1000
-        if (objs_len * fields_len) <= max_row_values:
-            size = objs_len
-        else:
-            size = max_row_values // fields_len
-        return size
+        fields_len = len(fields)
+        # MSSQL allows a query to have 2100 parameters but some parameters are
+        # taken up defining `NVARCHAR` parameters to store the query text and
+        # query parameters for the `sp_executesql` call. This should only take
+        # up 2 parameters but I've had this error when sending 2098 parameters.
+        max_query_params = 2050
+        # inserts are capped at 1000 rows. Other operations do not have this
+        # limit.
+        return min(1000, max_query_params // fields_len)
 
     def bulk_insert_sql(self, fields, placeholder_rows):
         placeholder_rows_sql = (", ".join(row) for row in placeholder_rows)
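A minimal standalone sketch of what this change does in practice (the old/new logic is transcribed from the diff above; the helper names and field counts are made up for illustration):

# Sketch of old vs. new batch sizing, assuming many objects queued for insert.
def old_batch_size(objs_len, fields_len, max_row_values=1000):
    # Old behaviour: cap the total number of query parameters at 1000.
    if (objs_len * fields_len) <= max_row_values:
        return objs_len
    return max_row_values // fields_len

def new_batch_size(fields_len, max_query_params=2050):
    # New behaviour: cap parameters at 2050 and rows at the 1000-row insert limit.
    return min(1000, max_query_params // fields_len)

for fields_len in (1, 7, 50):
    print(fields_len, old_batch_size(5000, fields_len), new_batch_size(fields_len))
# 1 field:   batch goes 1000 -> 1000 (the 1000-row cap was already the binding limit)
# 7 fields:  batch goes  142 ->  292
# 50 fields: batch goes   20 ->   41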
