Issues deploying Dash to Google App Engine python3 runtime


#1

I am trying to deploy a very basic Dash app to GAE using the Python 3 flexible runtime, but the deployment always fails with: “ERROR: (gcloud.app.deploy) Error Response: [13] An internal error occurred during deployment”. My app runs fine locally, so I suspect the problem has to do with GAE’s Python 3 runtime and Dash’s dependencies.

My app.yaml file is as follows:

runtime: python
env: flex
entrypoint: gunicorn -b :$PORT main:app

runtime_config:
  python_version: 3

manual_scaling:
  instances: 1
resources:
  cpu: 1
  memory_gb: 0.5
  disk_size_gb: 10

My main.py file is as follows:

import dash
import dash_html_components as html
import dash_core_components as dcc
import plotly.graph_objs as go
from dash.dependencies import Input, Output, State, Event
from flask import Flask

# Initialize dash app
server = Flask(__name__)
app = dash.Dash(__name__, server = server)
app.config['suppress_callback_exceptions'] = True
app.css.config.serve_locally = True
app.scripts.config.serve_locally = True
app.layout = html.Div([
    dcc.Input(id='my-id', value='initial value', type='text'),
    html.Div(id='my-div'),
    dcc.Graph(id = 'go')
])

@app.callback(
    Output(component_id='my-div', component_property='children'),
    [Input(component_id='my-id', component_property='value')]
)
def update_output_div(input_value):
    return 'You\'ve entered "{}"'.format(input_value)


if __name__ == '__main__':

    app.run_server(debug = True)

My requirements.txt file is as follows:

dash==0.30.0
dash-core-components==0.38.1
dash-html-components==0.13.2
dash-renderer==0.15.1
Flask==1.0.2
Flask-Compress==1.4.0
gunicorn==19.9.0
plotly==3.4.2

This is about as basic a deployment as possible (I’m trying to isolate what is causing the deployment to fail). Are there known issues with Dash’s dependencies and the GAE Python 3 runtime? Should I just go with Python 2?

Thanks!


#2

I got a similar error. You can also set --verbosity=debug to get more details. What worked for me was increasing the memory from memory_gb: 0.18 to memory_gb: 1.
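For reference, the relevant pieces look roughly like this (the exact values depend on your app):

# deploy with extra logging
gcloud app deploy --verbosity=debug

# app.yaml: bump the instance memory
resources:
  cpu: 1
  memory_gb: 1
  disk_size_gb: 10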

I would recommend you go with a Docker container, push the image to gcr.io, and then deploy.


#3

Forgot to mention: your port might need to be 8080:

app.run_server(debug=True, port=8080)

I have mine like this:

external_stylesheets = ['https://codepen.io/chriddyp/pen/bWLwgP.css']

app = dash.Dash(__name__, external_stylesheets=external_stylesheets)
server = app.server
# app.scripts.config.serve_locally = False

app.layout = html.Div([

instead of

server = Flask(__name__)
app = dash.Dash(__name__, server = server)
app.config['suppress_callback_exceptions'] = True
app.css.config.serve_locally = True
app.scripts.config.serve_locally = True
app.layout = html.Div([

That might not make any difference.
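Putting it together, a minimal sketch of what I mean (the gunicorn entrypoint main:server below is how I run mine; the layout is just a placeholder):

import dash
import dash_core_components as dcc
import dash_html_components as html

external_stylesheets = ['https://codepen.io/chriddyp/pen/bWLwgP.css']

app = dash.Dash(__name__, external_stylesheets=external_stylesheets)
server = app.server  # the underlying Flask instance that gunicorn serves

app.layout = html.Div([
    dcc.Input(id='my-id', value='initial value', type='text'),
    html.Div(id='my-div'),
])

if __name__ == '__main__':
    # 8080 matches the port used when testing locally
    app.run_server(debug=True, port=8080)

With that, gunicorn can point straight at the Flask object, e.g. gunicorn -b :$PORT main:server.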


#4

Thanks @thoo, unfortunately increasing the memory size and specifying port 8080 didn’t help; I keep getting the same error.

Any advice for going with a Docker container and pushing the image to gcr.io? I don’t have much experience with GAE, so this is all quite new to me. If you know of any resources or quick-start tutorials, it would be a huge help.


#5

You can look at my repo for the Dockerfile and Docker commands.

You also want to test your Docker image locally first by switching between these commands in the Dockerfile:

#For local app
#CMD ["gunicorn", "-b","0.0.0.0:8080", "main:server", "-t", "3600"]

#For Google_cloud
CMD exec gunicorn -b :$PORT main:server --timeout 1800

And then you can run:
docker run -d -p 8080:8080 your_app

All the Docker and gcloud commands can be found here. You will also need to set up gcloud credentials along the way if you haven’t.
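If you haven’t done the credential setup before, it is roughly (the project ID is a placeholder):

gcloud auth login
gcloud config set project your-project-id
gcloud auth configure-docker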

As a side note, GAE is expensive for me, so I have a cron job that runs gcloud app versions start --quiet your_instance_digit and the corresponding stop command at specific times of the day (roughly as sketched below). Hope this helps. Let me know if you have issues.
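The cron entries look something like this (the times and your_instance_digit are placeholders for my own setup):

# start the version in the morning, stop it at night
0 8 * * * gcloud app versions start --quiet your_instance_digit
0 20 * * * gcloud app versions stop --quiet your_instance_digit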


#6

Thanks a lot for the help! Will try the docker container and post my progress to this thread :slight_smile:


#7

Your suggestion worked! A Docker image seems like the way to go. I just had to make a couple of tweaks to get numpy/pandas to load. I went with Alpine because it is supposed to be lighter weight (although installing numpy adds ~800 MB to the image).

Here is my Dockerfile:

FROM python:alpine3.7
COPY . /app
WORKDIR /app
RUN apk add make automake gcc g++ subversion python3-dev
RUN pip install numpy
RUN pip install -r requirements.txt

# Test Locally
CMD ["gunicorn", "-b","0.0.0.0:8080", "main:server", "-t", "3600"]

# Deploy to GAE
# CMD exec gunicorn -b :$PORT main:server --timeout 1800

Note that I had to pip install numpy before installing my other requirements because of a dependency issue. Also, the “RUN apk add make automake gcc g++ subversion python3-dev” line installs the build tools needed to compile C/C++ extensions for Python on Alpine.

My app code was as before. To build and push the image, I ran the following in the cloud console:

docker build -t gcr.io/cr2c-monitoring/app:v1 .
gcloud auth configure-docker
docker push gcr.io/cr2c-monitoring/app:v1

To deploy, just run:

gcloud app deploy --image-url gcr.io/cr2c-monitoring/app:v1
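One note: to deploy a prebuilt image like this, my understanding is that app.yaml has to point at the custom runtime, i.e. something like:

runtime: custom
env: flex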

Otherwise, to run and test locally:

docker run --rm -p 8080:8080 gcr.io/cr2c-monitoring/app:v1

Also, to delete an image (and prevent clutter), I just used:

docker rmi <id_or_name>