Stream API use when there are >60s gaps between write operations [Solved]

Hi,

I’d like to use the stream API to update a graph with the loss and accuracy metrics from a neural network. I’m writing the network with the Keras library, which provides a callback mechanism; its on_train_begin, on_train_end, on_batch_begin, on_batch_end, on_epoch_begin and on_epoch_end methods are where I can make the various stream.open() etc. calls.
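For anyone unfamiliar, the setup looks roughly like this (a sketch assuming the legacy plotly.plotly streaming API and a stream token from your plotly account settings; the class name, filename and token are just placeholders):

    import plotly.plotly as py
    import plotly.graph_objs as go
    from keras.callbacks import Callback

    STREAM_TOKEN = 'your-stream-token'  # from your plotly account settings

    class PlotlyLogger(Callback):
        def on_train_begin(self, logs=None):
            # Create (or overwrite) the hosted plot, then open a stream to it.
            trace = go.Scatter(x=[], y=[], stream=dict(token=STREAM_TOKEN))
            py.plot([trace], filename='training-loss', auto_open=False)
            self.stream = py.Stream(STREAM_TOKEN)
            self.stream.open()

        def on_epoch_end(self, epoch, logs=None):
            # One point per epoch: epoch number vs. training loss.
            logs = logs or {}
            self.stream.write(dict(x=epoch, y=logs.get('loss')))

        def on_train_end(self, logs=None):
            self.stream.close()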

The fundamental problem I’m experiencing right now, however, is that an individual epoch often takes longer than 60s, so plotly closes the stream. When I reopen it, the earlier data is lost and I often end up with an empty graph at the end of training.

I’m not sure where to make a stream.heartbeat() call, because none of the callback methods available to me fire at a predictable enough frequency (much less one that would be <30s).

As a workaround, I’m currently appending new data with the fileopt='extend' argument to py.plot(). While this gets the data onto the graph, it isn’t updated in realtime - that is, I have to refresh the page to see the new points.
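For reference, that workaround looks roughly like this (again, the filename and function name are just placeholders):

    import plotly.plotly as py
    import plotly.graph_objs as go

    def append_point(epoch, loss):
        trace = go.Scatter(x=[epoch], y=[loss])
        # fileopt='extend' appends the new point to the existing plot instead
        # of overwriting it, but the hosted page only shows it after a refresh.
        py.plot([trace], filename='training-loss', fileopt='extend',
                auto_open=False)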

After all that preamble: is there a way to achieve 1) data continuously sent to plotly, and 2) a graph that updates in realtime?

Thanks for your help in advance!

I think I solved my own problem.

I reverted to the stream API and used threading to run a heartbeat loop in the background. In on_train_begin(), I first open the streams (stream.open()) and then start a simple heartbeater() function, as follows:

import threading
import time

training = threading.Event()

def heartbeater(training):
    # Keep the plotly connection alive while training runs.
    while not training.is_set():
        stream.heartbeat()
        time.sleep(5)

t = threading.Thread(target=heartbeater, args=(training,))
t.daemon = True
t.start()

Now, the on_train_begin() method invokes the plot and spins off a separate thread to keep the connection alive. At the end of training, the on_train_end() method calls training.set(), which ends the loop and lets the daemon thread exit.
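Putting the pieces together inside a callback, the whole thing looks roughly like this (a sketch; the stream setup is as in my first post, and 5s is just a comfortable margin under the 30s heartbeat window):

    import threading
    import time
    from keras.callbacks import Callback

    class HeartbeatLogger(Callback):
        def __init__(self, stream):
            super().__init__()
            self.stream = stream  # an already-configured py.Stream
            self.training = threading.Event()

        def on_train_begin(self, logs=None):
            self.stream.open()
            # Daemon thread: pings plotly so it never sees a 60s write gap.
            t = threading.Thread(target=self._heartbeater)
            t.daemon = True
            t.start()

        def _heartbeater(self):
            while not self.training.is_set():
                self.stream.heartbeat()
                time.sleep(5)

        def on_epoch_end(self, epoch, logs=None):
            logs = logs or {}
            self.stream.write(dict(x=epoch, y=logs.get('loss')))

        def on_train_end(self, logs=None):
            self.training.set()  # ends the heartbeat loop
            self.stream.close()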

Hope this helps someone else eventually.

[Edits: fixed a few things as I learned more…]