[Bug]: Can't copy large files to container #1180
Comments
Thanks for creating the issue. It looks like it fails copying the file stream content to the underlying stream:

testcontainers-dotnet/src/Testcontainers/Containers/TarOutputMemoryStream.cs Lines 145 to 146 in ac58b9f
How large are a few gigabytes? It should not be difficult to reproduce, I guess 😬. I can try to reproduce it later today.
I did not remember the actual implementation, but after spending a few minutes looking at it, the exception and error you are seeing make sense. The maximum size of a byte array, and therefore of a `MemoryStream` backed by one, is 2,147,483,591 bytes (slightly less than 2 GiB):

```csharp
public sealed class GitHub : IResourceMapping
{
    public MountType Type => MountType.Tmpfs;

    public AccessMode AccessMode => AccessMode.ReadOnly;

    public string Source => "foo";

    public string Target => "foo";

    UnixFileModes IResourceMapping.FileMode => Unix.FileMode755;

    [Fact]
    public async Task Issue1180()
    {
        using var memoryStream = new MemoryStream();
        // using var fileStream = new FileStream(Target, FileMode.CreateNew, FileAccess.Write, FileShare.Read);
        using var tarOutputMemoryStream = new TarOutputMemoryStream(memoryStream, NullLogger.Instance);
        await tarOutputMemoryStream.AddAsync(this);
    }

    Task IFutureResource.CreateAsync(CancellationToken ct)
    {
        return Task.CompletedTask;
    }

    Task IFutureResource.DeleteAsync(CancellationToken ct)
    {
        return Task.CompletedTask;
    }

    Task<byte[]> IResourceMapping.GetAllBytesAsync(CancellationToken ct)
    {
        // https://learn.microsoft.com/en-us/dotnet/api/system.array.
        const int maxArrayDimension = 2147483591;
        return Task.FromResult(new byte[maxArrayDimension]);
    }
}
```

This example demonstrates it very well. If you change the `MemoryStream` to the commented-out `FileStream`, the backing byte array's size limit no longer applies.
I've only glanced at the implementation while debugging the issue, but would it be possible to determine the size of the files upfront and then choose either a file-backed or memory-backed stream based on that?
I do not think that is an appropriate fix. Writing the content to a file and then reading it again won't be very performant. I think it is better to properly support and forward a stream.
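To illustrate, "forwarding a stream" could look roughly like the sketch below: instead of materializing the whole resource as a `byte[]` (capped at ~2 GiB), the mapping would expose a readable `Stream` that is copied into the tar archive in fixed-size chunks. The interface and member names here are hypothetical, not actual Testcontainers APIs.

```csharp
// Hypothetical sketch only — these types are NOT part of Testcontainers.
// The idea: avoid GetAllBytesAsync (byte[] is limited to ~2 GiB) and copy
// the resource content through in chunks instead.
public interface IStreamableResourceMapping
{
    // Returns a readable stream over the resource content; the caller disposes it.
    Task<Stream> OpenReadAsync(CancellationToken ct);
}

public static class TarStreamForwarder
{
    public static async Task ForwardAsync(
        IStreamableResourceMapping mapping, Stream tarStream, CancellationToken ct)
    {
        const int bufferSize = 81920; // the default Stream.CopyToAsync buffer size
        await using var source = await mapping.OpenReadAsync(ct);
        await source.CopyToAsync(tarStream, bufferSize, ct);
    }
}
```

Because the content never lives in a single array, files larger than 2 GiB would no longer hit the `MemoryStream` limit; memory usage stays bounded by the copy buffer.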
Testcontainers version
3.8.0
Using the latest Testcontainers version?
Yes
Host OS
Windows
Host arch
x86
.NET version
8.0.3
Docker version
Docker info
What happened?
Trying to use a large file with .WithResourceMapping results in an IOException stating that the stream is too long. The file I'm copying is a few gigabytes in size.

Relevant log output
Additional information
No response
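For reference, the reported scenario looks roughly like the following. The image name and file paths are placeholders I've assumed, not details from the report; `WithResourceMapping(string, string)` is the overload that copies a host file into the container.

```csharp
// Sketch of the reported scenario: mapping a multi-gigabyte host file into a
// container. Image name and paths are placeholders, not from the report.
var container = new ContainerBuilder()
    .WithImage("alpine:3.19")
    // In 3.8.0 the file content is buffered in a MemoryStream (max ~2 GiB),
    // so a file of several gigabytes fails with "Stream was too long".
    .WithResourceMapping("/path/to/large-file.bin", "/data/")
    .Build();
```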