I have a web application for downloading files. Everything works fine except when I want to download a file larger than 1GB.
This is my Java code:
InputStream in = new FileInputStream(new File(folderFile.getAbsolutePath()));
org.apache.commons.io.IOUtils.copy(in, response.getOutputStream());
response.flushBuffer();
in.close();
HTTP request:
$http({
    method: 'get',
    url: this.apiDownloadFileUrl,
    responseType: 'arraybuffer',
    cache: false
});
And here is the client side. I receive the data successfully on the client, but when I turn it into a Blob, nothing happens and nothing is downloaded if the data is larger than 500MB. I can download 300MB, though ...
How can I check whether it is a memory problem or a server problem? ... When I download from Gmail, I can download more than 1GB.
.success(function(databack) {
    var file = new Blob([databack], {
        type: 'application/csv'
    });
    var fileURL = window.URL.createObjectURL(file);
    var a = document.createElement('a');
    a.href = fileURL;
    a.target = '_blank';
    a.download = data;
    document.body.appendChild(a);
    a.click();
});
Have you tried using the copyLarge() methods from IOUtils? The JavaDoc for the copy() methods points you to them for streams over 2GB, because copy() returns the number of bytes copied as an int.
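In case a concrete example helps, here is a minimal sketch of the copy step from the question switched to copyLarge(). It assumes the same folderFile and response variables from the question and adds a try-with-resources block so the stream is closed even if the copy fails:

import java.io.FileInputStream;
import java.io.InputStream;
import org.apache.commons.io.IOUtils;

// folderFile and response are assumed to be the same File and HttpServletResponse as in the question
try (InputStream in = new FileInputStream(folderFile)) {
    // copyLarge() returns a long, so the byte count is not truncated for streams over 2GB
    long bytesCopied = IOUtils.copyLarge(in, response.getOutputStream());
    response.flushBuffer();
}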