Just a small possible enhancement, but would it be possible to have the download function automatically split queries into chunks when the length of list_of_accession_ids is >4000?
For now I do this myself, e.g. to fetch the most recently uploaded records, using
Even better would be to also parallelize this (if GISAID allows that), as the above is still relatively slow - it currently takes about 1.5 hours to download these 103K records from the last 5 days. When I tried a chunk size of 5000 I received a server error, so I reduced it to 4000 and that seemed to work...
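For reference, the chunking described above could be sketched roughly like this. This is only an illustration, not the actual snippet from the report: the `download` callable stands in for the package's real download function (an assumption here), and the 4000 limit is the empirical chunk size mentioned above.

```python
def chunked(ids, size=4000):
    """Yield successive slices of at most `size` accession IDs.

    4000 is the largest chunk size that reportedly did not trigger
    a server error (5000 did).
    """
    for start in range(0, len(ids), size):
        yield ids[start:start + size]


def download_in_chunks(download, list_of_accession_ids, size=4000):
    """Call the (hypothetical) `download` function once per chunk and
    concatenate the results, instead of sending one oversized query."""
    results = []
    for chunk in chunked(list_of_accession_ids, size):
        results.extend(download(chunk))
    return results
```

If GISAID tolerated concurrent requests, the per-chunk calls could then be dispatched via `concurrent.futures.ThreadPoolExecutor` instead of the sequential loop, but that depends on what the server allows.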