Running cargo doc --open currently opens the file in the browser with the file:// protocol.
In this protocol, web extensions like Vimium don't have access to the page and can't provide keyboard shortcuts.
It would be nice if running cargo doc --open started a local web server (like http) and then navigated to the generated index file. I've tested this setup and Vimium works normally. Thank you!
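Something along these lines, roughly (only a sketch of the idea; the port 8000, the crate name mycrate, and xdg-open are illustrative, and --directory needs Python 3.7+):

cargo doc                                                               # generate the docs as today
python3 -m http.server 8000 --bind 127.0.0.1 --directory target/doc &  # serve them on localhost
xdg-open http://127.0.0.1:8000/mycrate/index.html                       # open the served index instead of the file:// URL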
This has historically been controversial. I'm pro, but there are a lot of people against.
What are the con arguments?
I've just installed Vimium on Firefox and it works great. So maybe install Firefox :).
As far as I understand, it's a browser or Vimium problem.
But if we do this, it makes sense to put it behind a separate flag, for example: cargo doc --open_with_embedded_webserver
Yeah, it is a Chrome bug. In general, Chrome cares very little about the file:// protocol: https://bugs.chromium.org/p/chromium/issues/detail?id=47416
But why should Firefox users be made to suffer? Firefox works great with file://, and http://localhost is far less convenient than file-protocol-based access.
Having a built-in server would be really useful for those who usually work inside an SSH session/VM and use a browser outside the shell session.
for those who usually work inside an SSH session/VM
Why try to squash the universe into one tool? Let cargo doc generate the documentation, and let another tool host it.
This gist has some one-liners for spinning up a web server for static content:
https://gist.github.com/willurd/5720255
e.g.:
cargo doc && cd target/doc && python3 -m http.server 8000
will spin up an http server on 0.0.0.0:8000 (potentially world accessible; take appropriate care!).
It's not as convenient as cargo doc --open but it's not super terrible.
While I'm on the fence about building this support directly into cargo, it would be nice if there was an easy way to do this, or that the docs/help pushed folks towards a couple of nice recipes to help with this use case.
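For instance, a variant of the one-liner above that only listens on the loopback interface (the --bind option needs Python 3.4+), so the docs aren't exposed to the whole network:

cargo doc && cd target/doc && python3 -m http.server 8000 --bind 127.0.0.1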
I'm hacking together a cargo docserver for this, hopefully it gets somewhere 🙏
just pushed https://crates.io/crates/cargo-docserver, hopefully that's useful for someone besides me
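Rough usage (exact options may differ; see the crate's README):

cargo install cargo-docserver   # install the cargo subcommand
cargo docserver                 # serve the generated docs on a local port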
@qmx it works, thanks!
If it's an edge-case scenario, you can probably use https://www.npmjs.com/package/serve to serve your docs on localhost and then open that in Chrome/Chromium.
This is a much simpler solution than building a server into cargo itself.
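For example (assuming Node.js is available; serve takes the directory to publish as an argument):

cargo doc && npx serve target/doc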
This has historically been controversial. I'm pro, but there are a lot of people against.
Then why do you use a web browser?
If it's an edge-case scenario, you can probably use https://www.npmjs.com/package/serve to serve your docs on localhost and then open that in Chrome/Chromium.
This is a much simpler solution than building a server into cargo itself.
How can this be simple? You need Node.js to run it. Have you lost your mind?
How can this be simple? You need Node.js to run it. Have you lost your mind?
No, I haven't. What I meant by the suggestion is that cargo is not meant to be a server. Including an HTTP server just to view docs on localhost doesn't make much sense when there are tools such as cargo-docserver.
I was just suggesting a simple solution in terms of the time it would take someone to use it. Granted, it's simple only for those who already have Node installed, but it's a simple solution nonetheless.