Dash - Caching Data / Handling Multiple Users

Hi all, I recently built an app using Dash at work. While the app works for the most part, it still crashes occasionally, and I believe I need a better overall design, but I am not sure what the best one would be.

The app:

It is a “real-time” dashboard that updates every 15 seconds (via a dcc.Interval, “interval-component”). Each update fires one “data retrieval” callback, which runs 3 separate queries against a kdb database (using qpython) and gets back 3 pandas dataframes of reasonably small size (at most ~10,000 x 10 each) containing numbers and short strings. That same callback performs a few operations on the frames (aggregation, computing a couple of new columns; i.e. nothing too expensive). This data is then JSON-encoded and stored in a Store component (memory).

The Store component is used as an input to several other callbacks that update various charts. Some of those callbacks also take dropdowns as inputs in addition to the Store, which basically act as filters for the data.

Issues/Questions

  1. How exactly does the store component manage the data in the background (in memory)? In my case, would it overwrite the data on every callback?

  2. Occasionally, the queries return nothing, or qpython can’t properly parse the result into a dataframe. These tend to be temporary problems; rerunning the query usually fixes them. In that case, I would not want the Store component to update on the callback, but rather keep the data from the previous update and try again 15 seconds later on the next one. I have added the following code to achieve this, although I am not entirely sure it does exactly what I want:

    ```python
    from dash.exceptions import PreventUpdate

    try:
        df = query_database(args)
    except Exception:
        raise PreventUpdate
    ```

  3. If the app is deployed on a server, I think it would be best to have one data “repository” that is refreshed every 15 seconds (on the server) and is used in each user’s individual session, since the input data will be the same. Users might just apply different “filters” through the dropdowns, but the source data doesn’t change between users.

  4. In terms of creating several instances of the app, does Dash do this automatically? What is the best way to have multiple users, say up to 15, use such an app that is deployed centrally on a server?

Having done some research, Example 3 on this page (Part 5. Sharing Data Between Callbacks | Dash for Python Documentation | Plotly) seems like a good fit for managing the data, but I would very much like to hear some thoughts on this and on the other points!

Thank you

A beginner here! I appreciate the work the Dash team has done so far, but I am mildly disappointed that most, if not all, questions about multiple users and scaling the app find no responses here. I have some fundamental questions around that as well. Is there perhaps a page in the documentation that might help beginners with this?

Hi @Sandimusic, did you find a workaround?

For this kind of workflow, I would do a setup along these lines:

  • Create a process on the server that pulls the data at regular intervals and inserts it into a database. It could be a scheduled Python script or something else.
  • Load the data from the database in the Dash app.

With this setup, the data is pulled from the original source only once per interval, regardless of the number of users. And you should be able to scale the app to 15 users (or many more) without any problems.
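A minimal sketch of the first bullet, assuming sqlite3 as the intermediate database (the dummy frame stands in for the kdb/qpython queries; in production the script would run under cron, a systemd timer, or similar):

```python
import sqlite3
import time

import pandas as pd

def fetch_source_data():
    # Placeholder for the real kdb queries via qpython.
    return pd.DataFrame({"value": [1, 2, 3]})

def run_once(db_path="dashboard.db"):
    """Pull fresh data and overwrite the 'latest' table with the newest snapshot."""
    df = fetch_source_data()
    with sqlite3.connect(db_path) as con:
        df.to_sql("latest", con, if_exists="replace", index=False)

def main(interval_seconds=15):
    # Simple scheduler loop: refresh, then sleep until the next tick.
    while True:
        run_once()
        time.sleep(interval_seconds)
```

The Dash app then just reads `pd.read_sql("SELECT * FROM latest", ...)` in its interval callback, so every user session sees the same pre-fetched data.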

Hi @Emil, what do you mean, ~15 users? Don’t gunicorn and others claim to handle thousands of requests?

I mentioned 15 as this was the number that the thread author wanted. But yes, thousands should be possible.
