Chrome crashes on uploading large files

Hi,

I have an Upload component that accepts CSV/text files. If I upload a large file (I tested with a 170 MB CSV), Chrome crashes. I tried setting the max_size property of the Upload component to -1. It works fine in Firefox.

I see that Chrome limits the size of uploaded files. Is there a way around this? I would like to load only a few lines of the CSV at a time, but the file has to be uploaded completely before it can be read.

Another approach I am considering is to use something like a FileSelector to pick the file path; then I could use pandas to read the CSV in chunks. But I see that the browser strips the path of the selected file during upload.
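For reference, this is what chunked reading with pandas looks like once a file path is available (the path and chunk size here are placeholders):

```python
import io

import pandas as pd

def row_count_in_chunks(path_or_buffer, chunksize=5000):
    """Read a CSV a few thousand rows at a time instead of all at once.

    Each chunk yielded by read_csv is a regular DataFrame, so any
    per-chunk processing (filtering, aggregating) can happen here
    without holding the whole file in memory.
    """
    total = 0
    for chunk in pd.read_csv(path_or_buffer, chunksize=chunksize):
        total += len(chunk)
    return total

# Usage (hypothetical path): row_count_in_chunks("large_file.csv")
```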

Any ideas?

Hm, I wasn't aware of this. How did you figure this out?

Yeah, that's a good idea. We'll probably need a separate component for this. Maybe something like this: React Resumable JS. These components can be ported from React to Dash easily with the Dash plugin framework: Build Your Own Components | Dash for Python Documentation | Plotly. The customer engineering team at Plotly can also be contracted to build these components for your company or organization: Consulting, Training & Open-Source Development

Is the app always running on the same machine as the location of the files to upload? If so, then you could create a dcc.Dropdown that lists all of the available file paths and the callback could read the file from the disk.
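A minimal sketch of the server-side pieces of that idea, without the Dash boilerplate (the directory name is a placeholder; in a real app the options list would feed `dcc.Dropdown(options=...)` and the reader would live inside the dropdown's callback):

```python
import os

import pandas as pd

DATA_DIR = "."  # hypothetical directory containing the CSV files

def list_csv_options(data_dir=DATA_DIR):
    """Build a Dash-style options list (label/value dicts) from local CSVs."""
    return [
        {"label": name, "value": os.path.join(data_dir, name)}
        for name in sorted(os.listdir(data_dir))
        if name.endswith(".csv")
    ]

def load_selected(path, nrows=100):
    """What the dropdown callback would do: read the file straight from disk."""
    return pd.read_csv(path, nrows=nrows)
```

Because the file never travels through the browser, its size is limited only by the server's disk and memory, not by Chrome.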

However, if you are deploying the app and your users are uploading files from their machines, then youā€™ll need to use some type of upload component.

Hello Chris,

Thanks for the response.

I used the ā€œUploadā€ component to upload a CSV file and display it in a dash table. When I use the ā€œUploadā€ component and upload a large CSV, it crashes in Chrome (ā€˜Aww snap pageā€™) but runs without an issue in Firefox. I added breakpoints and see that it crashes as soon as it enters the Upload component callback. However, if I upload a smaller file, both browsers run it without a problem. I was thinking that the whole file gets passed into the ā€˜Uploadā€™ component callback function (as binary stream?) first? ; before I can do any operations/optimizations.

Thank you! I tried to use the plugin framework before to port GitHub - casesandberg/react-color: 🎨 Color Pickers from Sketch, Photoshop, Chrome, Github, Twitter & more to Dash, but haven't been able to do so. I shall try again and see if I can get it to work.

One more option would be to make a local copy of the file and then import it. But again, I do not know how to do this without the file path being available.

Thanks,
Vivek

Ah, I see. Yeah, it's most likely a memory issue. There might be some ways that we can improve the memory management in Dash to handle 1xx-2xxMB files, but it'd probably require a lot of deep architectural work, possible only through a corporate sponsorship.

Feel free to open a thread about this, I'd be happy to help you out here 🙂 The documentation for creating plugins could certainly be improved and elaborated, and a community thread might be a good place to work on this.

This one looks interesting because it has a ā€œserviceā€ property, which could be hooked into a custom flask route (@app.server.route). The callback could be fired with a property that sets something like n_files_uploaded, similar to how we map click events to stateful properties (n_clicks).
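A hypothetical sketch of such a route (the `/upload` endpoint and `filename` query parameter are assumptions for illustration, not part of any existing component; in a Dash app the Flask instance would be `app.server` rather than a bare Flask app):

```python
import os
import tempfile

from flask import Flask, request

# Bare Flask app so the sketch is self-contained; in Dash this would be
# `server = app.server` and the route registered with @server.route(...).
server = Flask(__name__)
UPLOAD_DIR = tempfile.mkdtemp()

@server.route("/upload", methods=["POST"])
def upload_chunk():
    # Append each POSTed chunk to the target file. A resumable-upload
    # component would send the file a slice at a time instead of all at
    # once, keeping the browser's memory use small and constant.
    name = os.path.basename(request.args.get("filename", "upload.bin"))
    with open(os.path.join(UPLOAD_DIR, name), "ab") as f:
        f.write(request.get_data())
    return "ok"
```

The callback on the Dash side would then only receive the lightweight counter property (e.g. `n_files_uploaded`), never the file bytes themselves.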


See Show And Tell -- Dash Resumable Upload for a new approach to uploading very large files 🎉
