We don't have an example of doing 3D, animated visualization on Binder.
Based on the meshcat-python demo:
https://github.com/rdeits/meshcat-python/blob/549171bcf11ee422904fcf4858e231a354191eae/demo.ipynb
I briefly tried out showing a Jupyter cell in Binder:
import meshcat
vis = meshcat.Visualizer()
vis.url()
vis.jupyter_cell()
The Meshcat server could run, but I could not connect to the client.
I (very naively) tried to use http://{ip}.0.0.1:7000/static/, where {ip} is the publicly-visible IP address from the running instance on Binder:
import json
from urllib.request import urlopen
ip = urlopen('https://api.ipify.org').read().decode("utf-8")
print(ip)
But this did not work. I'm assuming we may have to get creative (if it's at all possible) to show the visualization on the client side.
Per this comment, meshcat may be viable, either via static HTML or by forwarding ports.
Other alternatives being investigated are other direct / indirect three.js solutions, such as:
- itkwidgets (vtk.js)
- three.js directly
- meshcat static HTML
- meshcat via forwarding ports via Binder's mechanisms
- TBD
UPDATE (2020/01/28): Per Jamie's comment below, the best solution for this may be not to use MeshCat, but instead another visualizer that can better handle the Binder/JupyterHub-style workflow.
For displaying the widget / cell pointing to the appropriate host, we may need to somehow specify the host option in meshcat, but it's unclear how to do so. Some relevant lines of code:
https://github.com/rdeits/meshcat-python/blob/549171bcf11ee422904fcf4858e231a354191eae/src/meshcat/visualizer.py#L42-L44
https://github.com/rdeits/meshcat-python/blob/549171bcf11ee422904fcf4858e231a354191eae/src/meshcat/servers/zmqserver.py#L167-L173
If the notebook is run locally, everything is fine; see #12646 for an example.
\cc @RussTedrake @TobiaMarcucci
For reference, it seems like pyntcloud has notebooks that run on Binder and can show (WebGL?) visualization widgets:
https://github.com/daavoo/pyntcloud/blob/master/README.rst
Example Binder link:
https://mybinder.org/v2/gh/daavoo/pyntcloud/master?filepath=examples/[visualization]%20Polylines.ipynb
EDIT: Looks like the default backend uses pythreejs:
https://github.com/jupyter-widgets/pythreejs
Just to expand, this difference in functionality can be reproduced in a local Docker container, per this bit:
https://github.com/RobotLocomotion/drake/tree/1ffb58bd1fa5e0c91e38c4bc7ec0de4cbcbfa56e/.binder#docker-image-for-binder
I cannot see the meshcat visualization in Jupyter within docker, but can see it on my local machine.
I can also install open3d==0.9.0 (with a newer version of Jupyter notebook), and can visualize point clouds with code like:
# http://www.open3d.org/docs/release/tutorial/Basic/working_with_numpy.html
import numpy as np
import open3d as o3d
# generate some neat n x 3 matrix using a variant of the sinc function
x = np.linspace(-3, 3, 401)
mesh_x, mesh_y = np.meshgrid(x, x)
z = np.sinc((np.power(mesh_x, 2) + np.power(mesh_y, 2)))
z_norm = (z - z.min()) / (z.max() - z.min())
xyz = np.zeros((np.size(mesh_x), 3))
xyz[:, 0] = np.reshape(mesh_x, -1)
xyz[:, 1] = np.reshape(mesh_y, -1)
xyz[:, 2] = np.reshape(z_norm, -1)
# Pass xyz to Open3D.o3d.geometry.PointCloud and visualize
pcd = o3d.geometry.PointCloud()
pcd.points = o3d.utility.Vector3dVector(xyz)
# http://www.open3d.org/docs/release/tutorial/Basic/jupyter.html
from open3d import JVisualizer
visualizer = JVisualizer()
visualizer.add_geometry(pcd)
visualizer.show()
The first part of this is going to be whether we can actually expose a port when running on Binder. If it happens to run the equivalent of docker run -P (though it uses Kubernetes, I think), then that part is easy. Locally, you would want to add something like -p 7000:7000 to the docker run command.
And (after just trying) as you would imagine, it does not allow you to expose 7000, so no websocket connection, so no meshcat visualization.
(so the short answer would be that using a different visualizer would be a million times easier)
Aye, makes sense...
Are we therefore going to close this as not possible, or re-scope it as "use a different visualizer" or "rewrite much of meshcat"?
Re-scoping to achieve some level of 3D visualization on Binder.
Also filed: https://github.com/rdeits/meshcat/issues/67
Next step is to try another three.js-based solution, but one that doesn't require WebSockets (may be slower... but works?)
We also have something like itkwidgets based on vtk.js:
https://pypi.org/project/itkwidgets/
with Binder examples here:
https://mybinder.org/v2/gh/InsightSoftwareConsortium/itkwidgets/master?urlpath=lab/tree/examples
I did some experimentation with itkwidgets.
I tried some of the itk examples linked above (#13182). These examples work in Binder assuming itk and itkwidgets are available. Note that version 0.27.0 of itkwidgets does not render in Binder (itkwidgets issue); I used itkwidgets==0.26.1 to test.
As a sidenote, I tried pip installing itk and itkwidgets directly from Binder, e.g.
import sys
!{sys.executable} -m pip install itk
!{sys.executable} -m pip install itkwidgets
But I couldn't import itkwidgets?
The vtk examples currently don't work in Binder (brief explanation).
I haven't explored this at all, but this discussion seems to indicate that it might be possible?
@BetsyMcPhail For your WIP branch, can you instead try to hack the .binder/Dockerfile (not mac prereqs) to install a custom-built version of VTK for the version you need and then post the exact command-line you used to run the Docker container image locally?
If it takes too long for vtk.js-based solutions, then we may want to just do direct pythreejs / three.js solutions as shown in above examples.
.binder/Dockerfile has been updated to install and set up any necessary modules.
Use the docker image for Binder to try locally.
@EricCousineau-TRI do you want to evaluate this as is and decide whether/what other visualizer options Betsy should explore?
Thanks! I've just run through the examples and provided some feedback in the PR.
@jamiesnape When you have a chance, per your prior comment:
And (after just trying) as you would imagine, it does not allow you to expose 7000, so no websocket connection, so no meshcat visualization.
Would you or Betsy have time to add a permalink to the JupyterHub code indicating that this is not possible?
There isn't public code. It would be a server firewall configuration.
(Also it is Binder, not JupyterHub.)
Sorry; what I meant was if there's anything in JupyterHub's nominal deployment ~configuration~ that might preclude additional ports. I understand that won't eliminate the possibility that Binder's hosts may place additional restrictions.
And also, it would be nice to confirm beyond assumption.
Can you see if a similar question has been asked / answered in the Binder forum, or in the Binder docs?
For example, it looks like you can run a Bokeh server through a proxy?
https://mybinder.readthedocs.io/en/latest/sample_repos.html?highlight=port#running-a-bokeh-server-with-binder
The proxy looks interesting, though it would still need some moderate patching on the meshcat side if you can indeed proxy the server through tcp/8888 and forward that to tcp/443.
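For reference, the Bokeh sample linked above relies on jupyter-server-proxy. A hypothetical (untested) server entry for meshcat in jupyter_notebook_config.py might look like the sketch below — the "meshcat" entry, command, and ports are assumptions, not a known-working configuration:

```python
# jupyter_notebook_config.py sketch using jupyter-server-proxy.
# This would surface the meshcat HTTP server under the notebook's own
# host/port rather than a separately exposed port; whether the websocket
# upgrade survives this hop is exactly the open question in this thread.
c.ServerProxy.servers = {
    "meshcat": {
        # meshcat-python installs a `meshcat-server` console script.
        "command": ["meshcat-server", "--zmq-url", "tcp://127.0.0.1:6000"],
        "port": 7000,
    }
}
```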
Sweet! Moderate patching sounds wayyy better than re-architecting, esp. if the proxying gets us what we want!
Is it possible to queue that up to attempt with a Meshcat branch, say over the next month?
FTR, @rdeits was able to take some time to comment on this issue:
https://github.com/rdeits/meshcat/issues/67
@BetsyMcPhail On a separate thread, would it be possible to try Robin's suggestion of transmitting static HTML, first for a static scene, then for an animation?
Similar to this PR comment, could you make it look like this workflow?
https://nbviewer.jupyter.org/github/RobotLocomotion/drake/blob/nightly-release/tutorials/pyplot_animation_multibody_plant.ipynb#Run-with-Playback
(My estimate is this would take ~3 Perfect Engineering Hours - is this viable within the next 2 weeks?)
At present, no need to patch the current drake.systems.meshcat_visualizer code; just writing a targeted Meshcat-only example would work to boot!
Sounds like a good plan - within the next 2 weeks should be doable.
Per convo with @ToffeeAlbina-TRI, I've updated the overview, enumerating some of the potential solutions discussed here.
At present, if we can do port-forwarding on Binder, then meshcat may be the lowest-energy solution.
Alternatively, we can do static HTML rendering, recording the animation (like PyPlotVisualizer).
Failing that, making tooling around itkwidgets / vtk.js / three.js directly may be the main viable solution (but higher overhead).
Just added some checkboxes at top-level. Can sequence it out if need be!
Re-assigning to Jamie to investigate the port proxy on Binder.
I have not looked deeply into why yet, but it is not looking promising with the proxy. Certainly the HTTP is being correctly proxied from 7000 to 8888, but the upgrade to a websocket is running into problems. I will keep working on it, but I am not hopeful.
I've been thinking about this again. A few observations:
- I believe we consider the actual Binder service to be flakey (probably too flakey for use with student problem sets, for instance). I think the work we are doing here is really paving the way to us having 3d visualization in a JupyterHub / BinderHub if we have our own instances. Correct?
We would have more flexibility with our own JupyterHub, so the solution could be different/easier.
Colab was a reliable resource for class. I was less optimistic about getting visualization working in Colab, but have been experimenting a bit more this morning.
- I confirmed that Altair works, even interactively in colab.
- I've also found some examples of three.js (which meshcat is built on) displaying in colab. So I think that the issues with meshcat are in principle surmountable; it's the network connection between the client and server that is causing problems, not the actual visualization?
Yes, it is the network connection. In principle, most of the JavaScript visualizers should work fine with any of the managed platforms so long as we can get the necessary data to them.
Poking around a little more, it looks like meshcat-server is not actually needed. I'm new to this, but based on some code browsing, I think we could probably hand the meshcat viewer js directly to the jupyter notebook (instead of the iframe through the meshcat-server), and connect to it via a websocket from our pydrake visualizer. And a little googling suggests that websockets do (must?) work on colab?
@rdeits -- does that sound viable to you?
Here is a link to my very preliminary experiments with meshcat on colab. It doesn't work yet!
ok, I've got it basically working in the notebook above. ngrok for the win.
(specifically, I can launch a meshcat-server on the colab notebook, connect to it on colab with meshcat-python, and open up a new window on my local machine and see the results). the last step is to get the jupyter_cell rendering working right in the colab notebook, which is close.
@jamiesnape -- i've got a concrete ask. I've cleaned up my colab example, which describes the last hurdle: I need wss to work through ngrok in order to connect to meshcat in a colab cell. Any help is appreciated.
I will look into it.
A small amount of progress. This gets me a connection, but not necessarily a working connection:
from IPython.display import HTML
HTML("""
<div id="meshcat-pane" style="height: 400px; width: 100%; overflow-x: auto; overflow-y: hidden; resize: both">
</div>
<script type="text/javascript" src="{js_url}"></script>
<script>
var viewer = new MeshCat.Viewer(document.getElementById("meshcat-pane"));
viewer.connect({wss_url});
</script>
<script id="embedded-json"></script>
""".format(js_url='https://localhost:7000/static/main.min.js', wss_url='`wss://${location.host}:7000`'))
Among other things, it seems wss://localhost:7000 does not work the same way as https://localhost:7000 in relation to a server running on Colab. Also, location.host is missing the port number (so it is hard-coded for the moment).
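For what it's worth, the .format call in that snippet splices a JavaScript template literal (the backticked string) into the page verbatim; Python never sees location.host, the browser expands it at load time, which is why the port can't be filled in server-side and stays hard-coded. A minimal illustration:

```python
# str.format only substitutes the {wss_url} placeholder; the ${...}
# inside the argument is passed through untouched, to be interpreted
# later by the browser as a JS template literal.
template = "viewer.connect({wss_url});"
snippet = template.format(wss_url="`wss://${location.host}:7000`")
print(snippet)  # → viewer.connect(`wss://${location.host}:7000`);
```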
Spent another hour on this today with @sjahl (thanks!). Here is my current understanding. tl;dr: I think I need to teach the meshcat server to accept a secure websocket.
I agree with @jamiesnape -- and think we shouldn't need ngrok. Even if we wanted to use ngrok, opening a TLS tunnel on ngrok requires a paid version. I'm not against paying, but do not have an option that would allow any user of the notebook to have a paid ngrok version. So the ngrok branch of thinking is pruned.
I believe that the problem now is that the meshcat server is not expecting a wss connection. This is in my power to fix.
Update: I got a lot farther, but got stuck again. notebook is here.
curl works, but it needs --insecure to accept the self-signed cert. Will have to shelve this again for now.
Minor update: I've got meshcat websockets working over SSL on my local machine (not colab), and only with the visualizer in a separate window, not a jupyter cell. As expected, it results in an error message in chrome, until you manually accept the self-signed certificate. https://github.com/rdeits/meshcat/pull/71
Next step (after landing the meshcat PRs): connect via ssl in a jupyter cell on my local machine.
Huzzah. I've got meshcat running in my local notebook via wss. Once my meshcat PRs land, the following works:
cell 1:
import meshcat
zmq_url="tcp://127.0.0.1:6000"
vis = meshcat.Visualizer(zmq_url=zmq_url)
box = meshcat.geometry.Box([0.5, 0.5, 0.5])
vis.set_object(box)
vis.jupyter_cell()
cell 2:
import math
import time

for i in range(500):
    theta = (i + 1) / 100 * 2 * math.pi
    now = time.time()
    vis.set_transform(meshcat.transformations.rotation_matrix(theta, [0, 0, 1]))
    time.sleep(0.01)
when I launch meshcat server via:
python3 -m meshcat.servers.zmqserver --certfile localhost.crt --keyfile localhost.key
(and have accepted the certificates at least once). Everything flows through ssl.
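For reference, a self-signed certificate/key pair matching the filenames in that command can be generated with openssl (the -subj value here is an assumption; any CN you are willing to accept in the browser works for local testing):

```shell
# Create a self-signed cert and key named as in the zmqserver command above.
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout localhost.key -out localhost.crt \
  -days 365 -subj "/CN=localhost"
```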
Just for now, will re-assign this to you Russ until there're more action items for someone else.
Just to update my thoughts on this...
After talking to @rdeits a week or so ago, he mentioned that the entire meshcat network setup was actually inspired by the way that jupyter kernels work (websockets and zmq). Assuming colab is the same, it might be that the most viable option here is actually to figure out if we can piggyback on the existing websocket connection between the browser and the server instead of trying to instantiate our own. @rdeits said he had managed to do that with Jupyter once before.
Hi guys! I had a similar issue. I managed to get it to work after "fixing" the websocket by enabling cross-traffic. Here is a working snippet that renders a custom javascript meshcat viewer in jupyter:
from IPython.display import HTML
HTML("""
<div style="height: 400px; width: 100%; overflow-x: auto; overflow-y: hidden; resize: both">
<iframe id="testiframe" src='data:text/html,
<div id="meshcat-pane" style="height: 100%; width: 100%; overflow-x: auto; overflow-y: hidden; resize: both">
</div>
<script type="text/javascript" src="http://{url}/static/main.min.js"></script>
<script>
var viewer = new MeshCat.Viewer(document.getElementById("meshcat-pane"));
viewer.connect("ws://{url}");
</script>
<script id="embedded-json"></script>'; style="width: 100%; height: 100%; border: none">
</iframe>
</div>
""".format(url='127.0.0.1:32784'))
Ok, here's a direction which doesn't require any port forwarding, server configuration, certificates, or proxy. Rather than using a websocket, we can send commands from python to the frontend using the built-in Jupyter comms protocol. I've set up a dumb example that renders an iframe and then sends messages to it from Python here: https://github.com/rdeits/meshcat-python/blob/jupyter-comms-demo/comms.ipynb
This would require some changes to meshcat to support sending messages this way instead of over ZMQ, but it might not be too bad. Essentially, we'd have to replace the calls to zeromq.send with command_channel.send(). Figuring out a nice way to do that without breaking the existing use-cases would require some thought, but I don't think it's too bad.
@RussTedrake this is what I was referring to when I talked about using the Jupyter comms protocol itself. It should also work on Colab (although I haven't tested) since it's the same underlying mechanism that things like altair and ipywidgets use.
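To make the suggested swap concrete, here is a minimal sketch (the class and names are hypothetical, not meshcat's actual API) of routing an already-serialized command through anything with a comm-like send(data=...) interface instead of a ZMQ socket:

```python
class CommBridge:
    """Hypothetical stand-in for meshcat's ZMQ transport.

    In a live kernel, `comm` would be something like
    ipykernel.comm.Comm(target_name="meshcat"); here, any object
    exposing send(data=...) works, which also makes this testable
    outside a notebook.
    """

    def __init__(self, comm):
        self.comm = comm

    def send(self, payload):
        # Forward the serialized command dict to the frontend,
        # mirroring the zeromq.send -> command_channel.send swap.
        self.comm.send(data=payload)


class RecordingComm:
    """Fake comm that records payloads, for use outside a kernel."""

    def __init__(self):
        self.sent = []

    def send(self, data):
        self.sent.append(data)


bridge = CommBridge(RecordingComm())
bridge.send({"type": "set_transform", "path": "/box"})
```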
@duburcqa -- thanks. That one we knew how to do, and Robin already has it basically available in the jupyter_cell() method. The trick is that colab won't allow the connection via ws. It needs wss.
@rdeits -- awesome! it worked exactly as expected on my local machine, but my first copy/paste to colab did not:
https://colab.research.google.com/drive/1wsgE6EzRORqc2uXotRaL5lpbCZYvBYvH?usp=sharing
It looks like window.parent.Jupyter is undefined on colab. I found window.colab (also window.parent.colab), which has kernel, but not comm_manager. Will dig a little more.
Ah, just found this: https://github.com/googlecolab/colabtools/blob/master/packages/outputframe/lib/index.d.ts
(from https://colab.research.google.com/notebooks/snippets/advanced_outputs.ipynb#scrollTo=Ytn7tY-C9U0T)
which does have APIs that refer back to https://jupyter-notebook.readthedocs.io/en/stable/comms.html
It looks like this example is what we want:
https://colab.research.google.com/notebooks/snippets/advanced_outputs.ipynb#scrollTo=1-FIHCdGKCyO
Made some progress, but hit (I think) a missing feature/export in colab. I've asked on stackoverflow.
OK, I have a proof of life, albeit with the probably inefficient version of opening a new comm channel for every message.
https://colab.research.google.com/drive/1O-WgLtqKGuHKEeD4CFT69870dPEJRsQJ
As @rdeits says, I think the rest is just plumbing, and shouldn't be that bad!
Woot! I think I've got everything working now (though it will get more efficient if the stackoverflow question generates a response).
https://colab.research.google.com/drive/1O-WgLtqKGuHKEeD4CFT69870dPEJRsQJ
meshcat PR is linked above.