After upgrading to Jupyter version 5.0.0b1 on Mac OS X, I'm getting errors like:
IOPub data rate exceeded.
The notebook server will temporarily stop sending output
to the client in order to avoid crashing it.
To change this limit, set the config variable
`--NotebookApp.iopub_data_rate_limit`.
for just about every notebook I run, all of which have various visualizations. From that message I had no idea what units the rate limits might be expressed in, but by adding enough zeros I eventually figured out that the problems go away if I launch jupyter with:
jupyter notebook --NotebookApp.iopub_data_rate_limit=10000000000
As mentioned in #1821, it is not obvious what limits are reasonable to set for this value by default, but I would like to put in my vote that the current value is at least an order of magnitude too low. A sample matplotlib script that usually causes the problem is below, but I mostly use HoloViews, where I see the problem for just about every visualization when using the Bokeh backend, which requires a lot of JavaScript to be sent to the browser.
As someone who in part maintains HoloViews, I now have to recommend that all of our users start Jupyter in this way, which makes it very awkward for them, for what I think is a very reasonable way to be using a Jupyter notebook for visualizations. So I would strongly plead with you to increase this value by default so that it is only reached for actual error cases, not for ordinary visualization workflows.
import numpy as np
import matplotlib.pyplot as plt

freq = 100

def sine(x, phase=0, freq=freq):
    return np.sin(freq * x + phase)

dist = np.linspace(-0.5, 0.5, 256)
x, y = np.meshgrid(dist, dist)
grid = x**2 + y**2
testpattern = sine(grid, freq=freq)

methods = [None, 'none', 'nearest', 'bilinear', 'bicubic', 'spline16',
           'spline36', 'hanning', 'hamming', 'hermite', 'kaiser', 'quadric',
           'catrom', 'gaussian', 'bessel', 'mitchell', 'sinc', 'lanczos']

fig, axes = plt.subplots(3, 6, figsize=(96, 48))
fig.subplots_adjust(hspace=0.3, wspace=0.05)
for ax, interp_method in zip(axes.flat, methods):
    ax.imshow(testpattern, interpolation=interp_method, cmap='gray')
    ax.set_title(interp_method)
plt.show()
BTW, I increased some of the numbers in the script above so that it would fail reliably; the original values from https://github.com/VolkerH/interpolation_and_aliasing_matplotlib/blob/master/Interpolation%2Band%2BAliasing.ipynb
were the same except with 128 instead of 256, 48 instead of 96, and 24 instead of 48. With those smaller original values it would fail some of the time but not every time, which makes it a less reliable test case.
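For a sense of scale, here is a rough back-of-the-envelope estimate of why a 96x48-inch figure dwarfs a limit expressed in bytes per second. The 100 dpi figure and the 4-bytes-per-pixel factor are assumptions for illustration; the actual PNG payload sent over IOPub is smaller after compression, but still enormous:

```python
# Hypothetical estimate; dpi and bytes-per-pixel are assumptions, not measured values.
dpi = 100                              # matplotlib's default figure dpi
width_px = 96 * dpi                    # from figsize=(96, 48) in the script above
height_px = 48 * dpi
raw_bytes = width_px * height_px * 4   # 4 bytes per RGBA pixel, pre-compression
print(raw_bytes)                       # 184320000 bytes of raw pixel data
```

Even with heavy PNG compression, output of this magnitude makes it easy to see how a single figure could trip a limit on the order of a megabyte per second.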
@jbednar thanks! Looks like we need to tune the defaults a bit.
I don't think the data rate limit is working very well. Large outputs that include things like images don't generally cause problems. They may be slow on poor networks, but throttling probably isn't going to be helpful in those cases.
The main output type that the data limit actually helps with is stream output: a few MB of stdout can cause major issues, whereas a few-MB image is generally no problem. One option is to apply the data-rate limit exclusively to stream outputs.
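The idea of throttling only stream output could be sketched as a sliding-window byte counter that ignores every other message type. This is a toy illustration, not the notebook server's actual implementation; the class and method names are hypothetical:

```python
import time

class StreamRateLimiter:
    """Toy sketch: throttle only 'stream' messages by bytes per sliding window."""

    def __init__(self, limit_bytes=1_000_000, window_secs=3.0, clock=time.monotonic):
        self.limit_bytes = limit_bytes
        self.window_secs = window_secs
        self.clock = clock
        self.events = []  # (timestamp, nbytes) pairs inside the current window

    def allow(self, msg_type, nbytes):
        # Non-stream output (images, JavaScript, audio) is never throttled.
        if msg_type != "stream":
            return True
        now = self.clock()
        # Forget events that have aged out of the window.
        self.events = [(t, n) for t, n in self.events if now - t < self.window_secs]
        used = sum(n for _, n in self.events)
        if used + nbytes > self.limit_bytes:
            return False  # over budget: drop/defer this stream message
        self.events.append((now, nbytes))
        return True
```

Under this sketch, a 50 MB display_data message (an image or a Bokeh plot) passes untouched, while a burst of stdout beyond the byte budget is throttled.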
That PR looks perfect, thanks!
This was reverted in https://github.com/jupyter/notebook/pull/2326, so reopening for consistency.
Closed again by #2368
I know this is closed, but I thought I'd add a comment regarding the stream-output discussion above: I can confirm that audio output is also affected, at least when playing back high-bitrate audio files, e.g.:
from IPython.display import Audio
Audio(filename="somefile.mp3", autoplay=True)
Overriding NotebookApp.iopub_data_rate_limit = 10000000 in jupyter_notebook_config.py as mentioned everywhere does the trick. Should possibly be better documented in the display module docs?
When we release notebook 5.1, the rate limit should only apply to stdout/stderr output, so audio output should work without needing any special config again. I'm -0.5 on documenting the temporary workaround - our docs are big and complex enough that I don't think people have a much better chance of finding it there than they do on this issue.
Before you can even see a file named jupyter_notebook_config.py and proceed with this fix, you must first run jupyter notebook --generate-config (Linux users).
Overriding this in the config file doesn't work for me. I get the same error regardless of the value I set for NotebookApp.iopub_data_rate_limit in the config file. Do I have to put that config file somewhere in particular?
If you ran jupyter notebook --generate-config to create it, it should be in the correct place already. If not, try putting it at ~/.jupyter/jupyter_notebook_config.py
For anyone else still putting this in the config file: don't forget that NotebookApp must be referenced through the c object (the configuration object Jupyter provides when it executes the file):
c.NotebookApp.iopub_data_rate_limit = 123456789
The c. at the beginning is necessary.
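Put together, a minimal config file might look like the fragment below. The 10000000 value is just the override quoted earlier in this thread, not a recommended default, and the units (bytes per second) are an assumption based on the discussion above:

```python
# ~/.jupyter/jupyter_notebook_config.py
# `c` is the configuration object Jupyter injects when it executes this file.
c.NotebookApp.iopub_data_rate_limit = 10000000  # bytes per second
```

After editing the file, restart the notebook server for the change to take effect.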
Is there a way to just silence the message, e.g. so that I don't have to ask others to change their config? I acknowledge that I am outputting too fast and simply don't care to see this particular message.
Also, how does one set this in the notebook.json file?
I changed the data rate limit to 123456789 in the notebook config and I still have the same problem. Do I have to change the rate limit window from 3 seconds as well?
Try emptying your data frames and lists before rerunning your scripts; that normally resolves the issue!