
DataAccessException triggered when handling a large number of CBZ files #1669

Open

Kuan-Lun opened this issue Aug 28, 2024 · 7 comments

@Kuan-Lun

Steps to reproduce

  1. Place 2,788 CBZ files, totaling 59 GB, in a single directory (a sketch for generating dummy CBZ files follows this list).
  2. Add the directory as a library, with all settings unchecked.
  3. Click "Scan library files" (the scan does not start automatically after the library is added).
  4. Some of the CBZ files are scanned, and the scanned files appear as "To be analyzed" in Komga.
  5. The scan is interrupted, and an error is written to the logs (see Logs).

  Note: repeating steps 3 to 5 several times eventually allows the entire library to be scanned.
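
For anyone reproducing this without 59 GB of real comics: a CBZ is just a ZIP archive, so minimal test files can be mass-produced. A rough Kotlin sketch (the directory, file count, and file contents are illustrative assumptions, not part of the original report):

```kotlin
import java.io.File
import java.util.zip.ZipEntry
import java.util.zip.ZipOutputStream

fun main() {
    // Hypothetical target directory; point Komga's library at it afterwards.
    val dir = File("/tmp/cbz-test").apply { mkdirs() }
    repeat(2_788) { i ->
        val file = File(dir, "book-%04d.cbz".format(i))
        ZipOutputStream(file.outputStream()).use { zip ->
            zip.putNextEntry(ZipEntry("page-001.jpg"))
            zip.write(ByteArray(1024)) // placeholder bytes, not a real JPEG
            zip.closeEntry()
        }
    }
}
```

The scanner picks files up by extension, so these are enough to exercise the scan itself; the later analysis step will flag the fake pages, which does not matter for this test.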

Expected behavior

The scan should complete after a single click of the scan button.

Actual behavior

Please refer to the Steps to reproduce.

Logs

mylogs.csv

Komga version

v1.11.3-master

Operating system

Linux

Installation method

Docker

Other details

None.

Acknowledgements

  • I have searched the existing issues (open AND closed) and this is a new ticket, NOT a duplicate or related to another open issue.
  • I have written a short but informative title.
  • I have checked the FAQ.
  • I have updated the app to the latest version.
  • I will fill out all of the requested information in this form.
@gotson
Owner

gotson commented Aug 29, 2024

your database is not reachable or too slow:

SqliteMainPool - Connection is not available, request timed out after 30000ms
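
For context, that message comes from HikariCP, and 30000 ms is HikariCP's default connectionTimeout: a thread asked the pool for a connection, none became free within 30 s, and the request failed. A minimal Kotlin sketch (not Komga's code; the pool name, database path, and sizes are assumptions for illustration) showing how the same error surfaces when the pool is starved, e.g. by a connection stuck behind slow storage:

```kotlin
// Dependencies: com.zaxxer:HikariCP and org.xerial:sqlite-jdbc
import com.zaxxer.hikari.HikariConfig
import com.zaxxer.hikari.HikariDataSource

fun main() {
    val config = HikariConfig().apply {
        poolName = "SqliteMainPool"              // pool name seen in the log above
        jdbcUrl = "jdbc:sqlite:/tmp/demo.sqlite" // hypothetical database path
        maximumPoolSize = 1                      // make pool exhaustion easy to trigger
        connectionTimeout = 30_000               // HikariCP default: 30000 ms
    }
    HikariDataSource(config).use { ds ->
        val held = ds.connection // hold the pool's only connection
        try {
            // This request waits 30 s for a free connection, then throws
            // SQLTransientConnectionException: "SqliteMainPool - Connection
            // is not available, request timed out after 30000ms"
            ds.connection.use { it.createStatement().execute("SELECT 1") }
        } finally {
            held.close()
        }
    }
}
```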

@Kuan-Lun
Author

I understand. However, could this possibly be caused by inserting too much data into the local database at once?

@gotson
Owner

gotson commented Aug 29, 2024

No, Komga processes data sequentially; people have successfully processed TBs of data in Komga without any problems. Most likely the issue lies in how you host Komga.

Can you share your Dockerfile, provide details on how the drives are mapped, and describe the underlying hardware and storage infrastructure?

@Kuan-Lun
Author

I understand that Komga can handle TBs of data; I successfully processed 2 TB with an older version (v1.10.4-master) without any problems. I believe this issue might be related to the newer version of Komga.

Dockerfile: docker-compose.yml
Drives: Synology DS1520+

@gotson
Owner

gotson commented Aug 29, 2024

Nothing has changed on that front for quite some time. On Synology you may need to check that the config folder is not indexed / managed by Synology in any way. Syno media indexing can be very taxing on some folders.
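
One way to test whether the config volume itself is the bottleneck is to time raw SQLite writes on that volume: each autocommit INSERT is its own transaction and forces a sync to disk, so slow or indexer-contended storage shows up immediately. A rough Kotlin sketch (the path and iteration count are arbitrary assumptions), using the xerial sqlite-jdbc driver:

```kotlin
// Dependency: org.xerial:sqlite-jdbc
import java.sql.DriverManager
import kotlin.system.measureTimeMillis

fun main() {
    // Hypothetical path: put the test file on the same volume as Komga's config folder.
    DriverManager.getConnection("jdbc:sqlite:/config/latency-test.sqlite").use { conn ->
        conn.createStatement().use { it.execute("CREATE TABLE IF NOT EXISTS t(x INTEGER)") }
        val ms = measureTimeMillis {
            repeat(1_000) { i ->
                // Autocommit: every INSERT syncs to disk on its own.
                conn.createStatement().use { it.execute("INSERT INTO t VALUES ($i)") }
            }
        }
        println("1000 single-row autocommit inserts took ${ms}ms")
    }
}
```

If this takes seconds on a healthy local disk but minutes on the mapped volume, the pool timeout above is the expected symptom.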

@cooperspencer

I have the same issue, but on Ubuntu with Komga version 1.14.1, via Docker.
Did you find a solution for this?

@Kuan-Lun
Author

> I have the same issue, but on Ubuntu with Komga version 1.14.1, via Docker.
>
> Did you find a solution for this?

No.
