Browserify
Moving to this architecture has advantages and disadvantages. Please add your thoughts.
Note: this does not require three.js consumers to use browserify.
One advantage is that this would enforce a modular architecture for the ongoing development of three.js.
The common style in node/browserify has each file declare its dependencies at the top, and considers global variables an anti-pattern.
Here is an example snippet:
// src/geometry/BoxGeometry.js
var Geometry = require('./Geometry.js');
var Vector3 = require('../core/Vector3.js');
module.exports = BoxGeometry;
function BoxGeometry() {
// ...
}
BoxGeometry.prototype = Object.create( Geometry.prototype );
BoxGeometry.prototype.constructor = BoxGeometry;
Another advantage is that consumers of three.js using browserify would be able to pick and choose the parts they want. They could just import `Scene`, `BoxGeometry`, `PerspectiveCamera`, and `WebGLRenderer`, get the dependencies for all those automatically (`Object3D` etc.), and have a small bundle of JavaScript that supports just the feature set they want.
This could be done in a way that imposes no breaking changes. At the top level, we would export all classes we consider to be part of the standard package:
// src/three.js
var THREE = { rev: 101 }
module.exports = THREE
THREE.Geometry = require('./geometry/Geometry.js')
THREE.BoxGeometry = require('./geometry/BoxGeometry.js')
// ...
note: I'm not exactly requiring the dependencies at the top in this example, because this file would almost exclusively be require statements.
Finally we would wrap that in a Universal Module Definition that detects whether a module system (node/browserify, AMD) is in use and, if so, exports to it, or otherwise appends `THREE` to the global object (`window`).
Let's review: this allows three.js consumers using browserify to pick and choose functionality. This would require replacing the build system, but the new one would be pretty straightforward.
Some other advantages:
@shi-314 I guess I'm a little confused, I feel like "You can structure your code" and "You can build for production" are things that you can do without the architectural shift? Are you talking about three.js source or things built using three.js?
One practice that three.js uses that makes it tricky to use in commonjs environments is the use of `instanceof`: https://github.com/mrdoob/three.js/blob/master/src/core/Geometry.js#L82
This is because in an application you often end up with different versions of the same library in your source tree, so checking instanceof doesn't work between different versions of the same library. It would be good in preparation for a move to a commonjs module system to replace those instanceof checks with feature-checking behind a Geometry.isGeometry(geom)
style interface.
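A sketch of what such a feature check might look like (the flag-based implementation is an assumption on my part; only the `Geometry.isGeometry(geom)` name comes from the comment above):

```javascript
// Feature check that works across duplicated copies of a library,
// unlike `instanceof`, which fails when the object was created by a
// different copy of the same constructor.
function Geometry() {
  this.isGeometry = true; // flag travels with the instance itself
  this.vertices = [];
}

// Static check: inspects the object, not its prototype chain.
Geometry.isGeometry = function (geom) {
  return !!(geom && geom.isGeometry === true);
};

// A Geometry created by a *different* copy of the library still passes:
var foreignGeom = { isGeometry: true, vertices: [] };
```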
@kumavis I am talking about things built in three.js. Let's say you want to create your own material with your shaders etc. At the moment you need to extend the global THREE object to stay consistent with the rest of the three.js code:
THREE.MeshMyCoolMaterial = function (...) { ... }
But if we had Browserify then you could do:
var MeshLambertMaterial = require('./../MeshLambertMaterial');
var MeshMyCoolMaterial = function (...) {...}
So your namespace stays consistent and you don't need to mix `THREE.MeshLambertMaterial` and `MeshMyCoolMaterial` in your code.
And with "You can build for production" I basically meant the same thing you mentioned: allows three.js consumers using browserify to pick and choose functionality.
@shi-314 thank you, that is more clear. That does impact my proposed general solution to deserializing consumer-defined classes:
// given that `data` is a hash of a serialized object
var ObjectClass = THREE[ data.type ];
var object = ObjectClass.fromJSON( data );
This is from my proposed serialization / deserialization refactor
https://github.com/mrdoob/three.js/pull/4621
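For illustration, one way around the consumer-defined-class problem in a CommonJS world is an explicit registry that classes opt into instead of a `THREE[...]` lookup; everything below (`registerType`, `MyCoolMesh`) is hypothetical and not part of the actual PR:

```javascript
// Registry-based deserialization sketch: consumers register their own
// classes, so lookup no longer depends on a global THREE namespace.
var registry = {};

function registerType(name, ctor) {
  registry[name] = ctor;
}

function fromJSON(data) {
  var Ctor = registry[data.type];
  if (!Ctor) throw new Error('Unknown type: ' + data.type);
  return Ctor.fromJSON(data);
}

// A consumer-defined class opts in by registering itself:
function MyCoolMesh() {}
MyCoolMesh.fromJSON = function (data) {
  var mesh = new MyCoolMesh();
  mesh.name = data.name;
  return mesh;
};
registerType('MyCoolMesh', MyCoolMesh);
```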
Performance shouldn't be affected by a change like this.
This is a pretty huge change but I'm also in favour of it.
Some other major advantages:

- browserify's `standalone` option will generate a UMD build for you. No need to manually tinker with UMD wrappers.
- modules such as `threejs-vecmath` could be published and versioned independently, without worrying about everyone's code breaking. And on the flip side, if we make a patch or minor release in a particular module, people consuming those modules will be able to get the changes automatically.
- third-party extras become installable (e.g. `npm install threejs-shader-bloom`).
- the final bundle only includes the modules that our app is actually `require()`ing.

To @mrdoob and the other authors; if you don't have much experience with NPM/Browserify I would suggest making a couple little projects with it and getting a feel for its "philosophy." It's very different from ThreeJS architecture; rather than big frameworks it encourages lots of small things.
Another advantage of this approach is that there can be an ecosystem of open source, third party Three.JS modules, especially shaders, geometries, model loaders etc. Published through NPM or Github/Component which people can then easily just reference and use. At the moment stuff is shared by hosting a demo which people then 'view source' on. Three.JS deserves better!
I think one of the problems I have with Three.JS is how quickly code becomes incompatible with the current version of Three.JS. Another advantage of switching to something like this is being able to specify specific versions of _bits_ of Three.JS would be very powerful and handy.
+1
+1 for a CommonJS/browserify architecture, it would make the core more lightweight and extensions would fit even if they come from third-parties
Fragmenting three.js into little modules has a lot of costs as well. The current system allows pretty simple third party addons (witness eg. jetienne's THREEx modules). There's a lot to be said about the simplicity of the current setup, as long as the JS module systems are just wrappers around build systems.
Another way of minimizing build size is what ClojureScript does. They follow some conventions to allow Google's Closure compiler to do whole-program analysis and dead code elimination.
+1 for the unappreciated, and often overlooked, elegance of simplicity
+1
Fragmenting three.js into little modules has a lot of costs as well. The current system allows pretty simple third party addons (witness eg. jetienne's THREEx modules).
The idea here is that a UMD build would still be provided for non-Node environments. Plugins like THREEx would work the same way for those depending on ThreeJS with simple <script>
tags.
The tricky thing will be: how do we `require()` a particular plugin if we are in a CommonJS environment? Maybe browserify-shim could help.
There's a lot to be said about the simplicity of the current setup, as long as the JS module systems are just wrappers around build systems.
ThreeJS's current plugin/extension system is pretty awful to work with, and far from "simple" or easy. Most ThreeJS projects tend to use some form of plugin or extension, like EffectComposer, or FirstPersonControls, or a model loader, or one of the other many JS files floating around in the `examples` folder. Right now the only way to depend on these plugins is to copy them into a `vendor` folder and concatenate them into your build by hand.

Now, imagine, with browserify you could do something like this:
var FirstPersonControls = require('threejs-controls').FirstPersonControls;
//more granular, only requiring necessary files
var FirstPersonControls = require('threejs-controls/lib/FirstPersonControls');
Those plugins will `require('threejs')` and anything else that they may need (like GLSL snippets or text triangulation). The dependency/version management is all hidden to the user, and there is no need for manually maintained grunt/gulp concat tasks.
The tricky thing will be: how do we require() a particular plugin if we are in a CommonJS environment?
I've been using CommonJS for THREE.js projects for a bit now. It's a bit of a manual process, converting chunks of other people's code into modules and I don't think there'll be an easy way to avoid that for legacy code that isn't converted by the authors or contributors.
The important bit is that there's a module exporting the entire 'standard' THREE object, which can then be required by anything that wishes to extend it.
var THREE = require('three');
THREE.EffectComposer = // ... etc, remembering to include copyright notices :)
This has worked pretty well for me, especially as the project grows and I start adding my own shaders and geometries into their own modules etc.
As long as there's a 'threejs-full' or 'threejs-classic' npm package then this becomes a pretty viable way of working with old Three.js stuff in a CommonJS environment but I suspect this is pretty niche!
+1
I believe once fragmented threejs modules are available in npm, plugin
developers will love to migrate to CommonJS env.
It could also make the shaders modular as well, e.g. using glslify. Even things like making an Express middleware that generates shaders on demand become easier then.
Some months ago I moved frame.js to require.js and I finally understood how this AMD stuff works.
I still need to learn, however, how to "compile" this. What's the tool/workflow for generating a `three.min.js` out of a list of modules?
I prefer gulp.js as a build system with the gulp-browserify plugin. It's really easy to understand and the code looks cleaner than grunt in my opinion. Check this out: http://travismaynard.com/writing/no-need-to-grunt-take-a-gulp-of-fresh-air :wink:
some thoughts: (based on my limited experience with node, npm, browserify of course)
that said, following the discussion on this thread, i'm not sure if everyone had the same understanding of browserify (browserify, commonjs, requirejs, amd, umd are somewhat related although they may not necessarily be the same thing).
now if you may follow my chain of thoughts a little.
That's where Browserify comes into the picture. Well, technically one can use RequireJS in the browser, but you want to bundle js files together without making too many network calls (unlike file-system require()s, which are fast). That's where Browserify does some cool stuff, like static analysis to see which modules need to be imported, and it creates builds which are more optimized for your application. (There are limitations, of course; it probably can't parse require('bla' + variable).) It can even swap out parts which require an emulation layer for node.js-dependent stuff. Yeah, it generates a js build which I can now include in my browser.
Here are some of the stuff browserify can do https://github.com/substack/node-browserify#usage
Sounds like everything's great so far... but there are a few points I thought worth considering before we move to a "browserify architecture".
So if we see this diversity and convenient module loading (mainly riding on the npm ecosystem), along with customized builds, as a great thing, then it might be worth a shot having a change in paradigm, refactoring code and changing our current build system.
@mrdoob some tools around browserify are listed here: https://github.com/substack/node-browserify/wiki/browserify-tools.
regarding the `three.min.js`: you would not use the minified code in your project. all you do is `var three = require('three')` in your `project.js` and then run `browserify project.js > bundle.js && uglifyjs bundle.js > bundle.min.js`. note: you still can ship minified code for `<script src="min.js">`.
i am currently wrapping three.js with
if ('undefined' === typeof(window))
var window = global && global.window ? global.window : this
var self = window
and
module.exports = THREE
then i wrap extensions with
module.exports = function(THREE) { /* extension-code here */ }
so i can require it like that:
var three = require('./wrapped-three.js')
require('./three-extension')(three)
so this is not optimal, but i personally can live with it and think it's not so bad - though @kumavis's proposal would be a _huge_ advantage.
but maybe it would make sense to fork three and put all the things in seperate modules just to see how it would work out.
also checkout http://modules.gl/ which is heavily based on browserify (though you can use every module on its own without browserify).
@mrdoob @shi-314 gulp-browserify has been blacklisted in favour of just using browserify directly (i.e. via vinyl-source-stream).
Tools like grunt/gulp/etc are constantly in flux, and you'll find lots of differing opinions. In the end it doesn't matter which you choose, or whether you just do it with a custom script. The more important questions are: how will users consume ThreeJS, and how much backward-compatibility do you want to maintain?
After some more thought, I think it will be _really_ hard to modularize everything without completely refactoring the framework and its architecture. Here are some problems:
- every file ends up with deep relative requires like `../../../math/Vector2` etc.
- ideally, `three-scene` would be decoupled from `three-lights` etc. Then you can version each package separately. This kind of fragmentation seems unrealistic for a framework as large as ThreeJS, and would be a pain in the ass to maintain.
- consuming the API would be pretty ugly: `require('three/src/math/Vector2')`
My suggestion? We consider two things moving forward:
I'd love to see everything modularized, but I'm not sure of an approach that's realistic for ThreeJS. Maybe somebody should run some experiments in a fork to see how feasible things are.
Thanks for the explanations guys!
What I fear is complicating things to people that are just starting. Forcing them to learn this browserify/modules stuff may not be a good idea...
Would have to agree with @mrdoob here. I, and a lot of colleagues, are not web programmers (rather VFX/animation TDs). Picking up WebGL and Three has certainly been enough work as is on top of our current workload (and in some cases some of us had to learn js on the spot). Much of what I have read in this thread, at times, makes me shudder thinking about how much more work would be added to my plate if Three moved to this structure. I could be wrong but that is certainly how it reads to me.
With a precompiled UMD build (browserify's `--standalone` flag) in the repo, there's no change to the workflow for existing developers.
@mrdoob The idea of a dependency management system is simplicity. Reading dozens of posts on options and build systems may be overwhelming, but ultimately the current system is not sustainable. Anytime one file depends on another, that's a hunt-and-search any new developer has to perform to find a reference. With browserify, the dependency is explicit and there's a path to the file.
@repsac A dependency system should make Three more accessible to users of other languages as it avoids global scope and load-order nightmares, and follows a paradigm similar to other popular languages. `var foo = require('./foo');` is (loosely) akin to C#'s `using foo;` or Java's `import foo;`.
I'd love to see everything modularized, but I'm not sure of an approach that's realistic for ThreeJS. Maybe somebody should some experiments in a fork to see how feasible things are
I think this is the way to go, really. Get the work done, show how it works.
And consuming the API would be pretty ugly: `require('three/src/math/Vector2')`
As an experiment I just converted the 'getting started' from the Three docs to this new modular approach. I can imagine there being a lot of references unless people are pretty strict about breaking up their code into tiny modules.
The main advantage of doing this would be that the resulting build size would be a tiny fraction of the size of the full Three.js because you'd only be including the things specifically referenced here and then the things that those things depend on.
I guess referencing all the dependencies you need (and installing them all individually) could prove a bit too awful in practice.
If you're explicitly targeting mobile devices then this highly granular approach would be perfect but in reality I suspect we'll need packages that export the entire THREE api which will work as normal, then smaller packages that encapsulate all the bonus geometry, all the renderers, all the math, all the materials etc, then down to the individual module level so that developers can decide for themselves.
And yes, coding for the web is a pain.
Anyway, on with the experiment...
Install our dependencies..
npm install three-scene three-perspective-camera three-webgl-renderer three-cube-geometry three-mesh-basic-material three-mesh three-raf
Write our code...
// import our dependencies..
var Scene = require('three-scene'),
Camera = require('three-perspective-camera'),
Renderer = require('three-webgl-renderer'),
CubeGeometry = require('three-cube-geometry'),
MeshBasicMaterial = require('three-mesh-basic-material'),
Mesh = require('three-mesh'),
requestAnimationFrame = require('three-raf');
// set up our scene...
var scene = new Scene();
var camera = new Camera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
var renderer = new Renderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);
// create the cube...
var geometry = new CubeGeometry(1, 1, 1);
var material = new MeshBasicMaterial({color: 0x00ff00});
var cube = new Mesh(geometry, material);
scene.add(cube);
// position the camera...
camera.position.z = 5;
// animate the cube..
var render = function () {
requestAnimationFrame(render);
cube.rotation.x += 0.1;
cube.rotation.y += 0.1;
renderer.render(scene, camera);
};
// begin!
render();
then build our file
browserify entry.js -o scripts/hello-world.js
then include it in our page
<script src="/scripts/hello-world.js" type="text/javascript"></script>
I guess referencing all the dependencies you need (and installing them all individually) could prove a bit too awful in practice.
The end user doesn't necessarily need to use browserify in their project in order for Three to use browserify to manage its codebase. Three can be exposed as the global `THREE` as it is now...include the build file and run with it.
@repsac @mrdoob the changes would be backward-compatible, so that current users do not need to change anything if they don't want to. These suggestions are to improve the long-term maintainability and longevity of ThreeJS's sprawling and monolithic codebase. Things like dependency and version management might sound like a headache to the uninitiated, but they are awesome for those developing tools, frameworks, plugins, and large scale websites on top of ThreeJS.
i.e. End-user code can still look the same, and the `examples` don't need to change at all:
<script src="three.min.js"></script>
<script>
var renderer = new THREE.WebGLRenderer();
</script>
For more ambitious developers who are looking for a modular build, _or_ for those looking to develop long-term solutions on top of ThreeJS (i.e. and take advantage of version/dependency management), it might look more like this:
npm install three-vecmath --save
Then, in code:
var Vector2 = require('three-vecmath').Vector2;
//.. do something with Vector2
And, furthermore, this allows people to use things like ThreeJS's vector math, color conversions, triangulation, etc. outside of the scope of ThreeJS.
Even though I think the require() mess is a bad idea and a bad tradeoff, it would be an even worse idea to expose users to two different kinds of three.js code, telling users one is the fancy module-system flavour and the other is the simpler (but second-class) flavour.
@erno I think you've missed the point; `three.js` would be organized by a module structure internally, but that structure is used to produce a build file no different from the current setup. The primary gain is improving the experience of developing and maintaining `three.js`.
@kumavis - no, @erno did not actually miss that, but I understood(*) that he makes the point that if `three.js` is sometimes used via require and sometimes not, it can be confusing. For example someone looks both at the three source and then some 3rd party examples and encounters differences in how it all works.
(*)we talked about this on irc earlier today.
I think that's a kind of valid point but am not sure whether / how it works out in the end -- whether it is really an issue with the module & build usage. But it certainly seems worth a thought, and overall it has seemed good to me that the matter has been considered here carefully; thanks for the info and views so far, from my part.
@antont I see. People have suggested a variety of different approaches here, I was assuming that we would mainly provide documentation for top-level usage (pulling everything off of THREE
), but others may create examples that would not follow this and it may lead to some confusion. And thats a valid concern.
I think I was a little confused by the language.
and other is the simpler (but second-class) module system flavour.
This is just referring to the build file, yes?
In my understanding, yes. Can't imagine what else but can be missing something though.
antont, kumavis: The proposals here have talked about exposing the require() style code to end users too, see eg. mattdesl's most recent comment.
"For more ambitious developers who are looking for a modular build, or for those looking to develop long-term solutions on top of ThreeJS (i.e. and take advantage of version/dependency management) [...]"
one way to have more optimized builds is actually to have a script which figures out your dependencies automatically and churns out the required modules.
right now bower & browserify do this with require, but they aren't the only solutions. i don't know if there are other off-the-shelf open source projects which do that (maybe like ng-dependencies), but i've written such tools before, so i think there would be other approaches to solving these pains.
google's closure compiler might be such a tool?
On the user's side, might this be of some help?
http://marcinwieprzkowicz.github.io/three.js-builder/
that's pretty interesting @erichlof :) i wonder if @marcinwieprzkowicz generated this by hand... https://github.com/marcinwieprzkowicz/three.js-builder/blob/gh-pages/threejs-src/r66/modules.json
One practice that three.js uses that makes it tricky to use in commonjs environments is the use of instanceof: https://github.com/mrdoob/three.js/blob/master/src/core/Geometry.js#L82
This is because in an application you often end up with different versions of the same library in your source tree, so checking instanceof doesn't work between different versions of the same library. It would be good in preparation for a move to a commonjs module system to replace those instanceof checks with feature-checking behind a Geometry.isGeometry(geom) style interface.
in git/three.js/src:
grep -r instanceof . | wc -l
164
in git/three.js/examples:
grep -r instanceof . | wc -l
216
so there are in total 380 uses of `instanceof` in three.js. What would be the best implementation as a replacement?
I recently added a `type` property which can be used to replace most of those.
I recently added a type property which can be used to replace most of those.
nice! Will prepare a PR.
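As a sketch, an `instanceof` check rewritten against the new `type` property might look like this (the constructors here are minimal stand-ins, not the real three.js classes):

```javascript
// The `type` string survives across duplicated library copies, so it
// can replace cross-module-unsafe `instanceof` checks.
function Texture() { this.type = 'Texture'; }
function DataTexture() { this.type = 'DataTexture'; }
function CompressedTexture() { this.type = 'CompressedTexture'; }

function describeTexture(texture) {
  switch (texture.type) {
    case 'DataTexture':
      return 'data';
    case 'CompressedTexture':
      return 'compressed';
    default:
      return 'regular'; // image, video, canvas
  }
}
```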
For an example of how this is handled in another popular and large JS library, take a look at https://github.com/facebook/react. The codebase is structured using the node-style module system (which browserify implements) but is built for release using grunt. This solution is flexible for 3 use cases, including letting consumers `require` specific dependencies. The benefits of proper dependency management have been well documented.

I did some research...
Yesterday I hacked together a (rather stupid) script that transforms the Three.js sourcecode to use CommonJS require()
statements to declare dependencies between files. Just to see what happens... This:
var THREE = require('../Three.js');
require('../math/Color.js');
require('../math/Frustum.js');
require('../math/Matrix4.js');
require('../math/Vector3.js');
require('./webgl/WebGLExtensions.js');
require('./webgl/plugins/ShadowMapPlugin.js');
require('./webgl/plugins/SpritePlugin.js');
require('./webgl/plugins/LensFlarePlugin.js');
require('../core/BufferGeometry.js');
require('./WebGLRenderTargetCube.js');
require('../materials/MeshFaceMaterial.js');
require('../objects/Mesh.js');
require('../objects/PointCloud.js');
require('../objects/Line.js');
require('../cameras/Camera.js');
require('../objects/SkinnedMesh.js');
require('../scenes/Scene.js');
require('../objects/Group.js');
require('../lights/Light.js');
require('../objects/Sprite.js');
require('../objects/LensFlare.js');
require('../math/Matrix3.js');
require('../core/Geometry.js');
require('../extras/objects/ImmediateRenderObject.js');
require('../materials/MeshDepthMaterial.js');
require('../materials/MeshNormalMaterial.js');
require('../materials/MeshBasicMaterial.js');
require('../materials/MeshLambertMaterial.js');
require('../materials/MeshPhongMaterial.js');
require('../materials/LineBasicMaterial.js');
require('../materials/LineDashedMaterial.js');
require('../materials/PointCloudMaterial.js');
require('./shaders/ShaderLib.js');
require('./shaders/UniformsUtils.js');
require('../scenes/FogExp2.js');
require('./webgl/WebGLProgram.js');
require('../materials/ShaderMaterial.js');
require('../scenes/Fog.js');
require('../lights/SpotLight.js');
require('../lights/DirectionalLight.js');
require('../textures/CubeTexture.js');
require('../lights/AmbientLight.js');
require('../lights/PointLight.js');
require('../lights/HemisphereLight.js');
require('../math/Math.js');
require('../textures/DataTexture.js');
require('../textures/CompressedTexture.js');
We would need some major refactoring, maybe splitting the WebGLRenderer (and such) into multiple modules (atm the file is over 6000 lines long).
Shader sources are currently merged into `THREE.ShaderChunk` at compile time and then into `THREE.ShaderLib` at runtime (joining together arrays of `THREE.ShaderChunk`s), which is rather tricky to do with only browserify. I presume it would require a browserify transform that does the same. React.js uses commoner to look up their modules without having to refer to them by file path. Maybe we could do the same and also define custom rules that allow us to `require` GLSL files, transforming them to JS syntax.
@rasteiner you may be very happy to learn about https://github.com/stackgl/glslify, it comes from the growing http://stack.gl family
I've had a fair bit of experience with modules and the "unixy" approach in the past couple months and my thought now is that it's too little too late, and refactoring threejs for modularity or npm modules would be an unrealistic goal.
Here's what I'm currently doing to tackle the problem of modularity/reusability:
My new projects tend to use "three" on npm just to get up and running. It would be pretty awesome if ThreeJS officially published the build to npm using version tags that align with the release numbers.
PS: to those interested in bringing reusable/modular shaders to their workflow:
https://gist.github.com/mattdesl/b04c90306ee0d2a412ab
In case it helps others who might be looking how to use Three.js with browserify, and stumble on this thread, the way I've just set it up myself is to use browserify-shim.
Following the README section on _"You will sometimes a) Expose global variables via global"_, I included a separate script tag for Three.js and configured it to expose the global THREE variable.
And then just the bit that I had to work out myself was how to include extras like ColladaLoader, OrbitControls etc. I did this like so:
From package.json:
"browser": {
"three": "bower_components/threejs/build/three.js"
},
"browserify-shim": "browserify-shim-config.js",
"browserify": {
"transform": [ "browserify-shim" ]
}
browserify-shim-config.js:
module.exports = {
"three": { exports: "global:THREE" },
"./vendor/threejs-extras/ColladaLoader.js": { depends: {"three": null}, exports: "global:THREE.ColladaLoader" },
"./vendor/threejs-extras/OrbitControls.js": { depends: {"three": null}, exports: "global:THREE.OrbitControls" }
};
Then in my own script, main.js:
require('../vendor/threejs-extras/ColladaLoader.js');
require('../vendor/threejs-extras/OrbitControls.js');
var loader = new THREE.ColladaLoader(),
controls = new THREE.OrbitControls(camera);
...
Browserify requires rebuilding the entire script when you modify even one byte. I once used browserify to pack a project that requires THREE.js, and it took more than two seconds to build a bundle, blocking the livereload every time I made a change. That's too frustrating.
You normally use watchify during development with livereload. That one builds the bundle incrementally.
watchify doesn't work for me. When I change a file and save it, watchify and beefy's livereload serve up the older/cached version. I have no idea why this happens. Thankfully, browserify already works pretty well.
@ChiChou Pass in --noparse=three
to browserify. This takes the browserify bundle step from 1000ms down to 500ms on my machine, which is decent enough for an instant feedback feel.
@rasteiner I want to thank you again for your informal research into three.js inter-dependencies. While that massive list of deps is some ugly looking code, really that ugliness is present as is, just invisible. Browserify's strength is it requiring us to air our dirty laundry and pursue less tangled systems.
There are a lot of places in Three.js where we take in some object, perceive its type, and perform different tasks based on that type. In most of those cases, that type-dependent code could be moved to the type itself, and we would not have to have an understanding of all the possible types that we're operating on.
The following is an abridged example from WebGLRenderer:
if ( texture instanceof THREE.DataTexture ) {
// ...
} else if ( texture instanceof THREE.CompressedTexture ) {
// ...
} else { // regular Texture (image, video, canvas)
// ...
}
should be more of the form
texture.processTexImage( _gl, mipmaps, otherData )
Let the type determine how to handle itself. This also allows the library consumer to use novel Texture types that we had not thought of. This structure should reduce interdependency.
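A minimal sketch of that inversion (the `processTexImage` name follows the comment above; the method bodies are placeholders, not real WebGL code):

```javascript
// Each texture type carries its own upload logic, so the renderer no
// longer branches on concrete types.
function Texture() {}
Texture.prototype.processTexImage = function (gl, mipmaps) {
  return 'regular'; // image, video, canvas path
};

function DataTexture() {}
DataTexture.prototype = Object.create(Texture.prototype);
DataTexture.prototype.constructor = DataTexture;
DataTexture.prototype.processTexImage = function (gl, mipmaps) {
  return 'data';
};

function CompressedTexture() {}
CompressedTexture.prototype = Object.create(Texture.prototype);
CompressedTexture.prototype.constructor = CompressedTexture;
CompressedTexture.prototype.processTexImage = function (gl, mipmaps) {
  return 'compressed';
};

// The renderer just dispatches; novel consumer-defined Texture types
// work without any changes here.
function uploadTexture(gl, texture) {
  return texture.processTexImage(gl, null);
}
```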
I think moving to a browserify architecture is definitely the way to go. The UMD build will make consuming THREE.js easier. It will also allow us to split the WebGLRenderer into multiple files, because right now it looks rather monolithic and scary.
I have started a branch where I am currently working on moving it over here: https://github.com/coballast/three.js/tree/browserify-build-system
Please let me know what you think.
Here are @coballast's changes.
It looks like you're taking the automated conversion approach with your `browserifyify.js` file, which I think is the right way to go.
One thing that we all haven't discussed much yet is how best to transition this large, ever-changing library over to browserify. You can make the changes and then open a PR, but it will immediately be out of date. That's what's compelling about the automated approach.
If we can keep the whole conversion automated and self-contained (driven by browserifyify.js), then we can turn this into a push-button conversion that will still work into the foreseeable future. That simplicity makes the dreamy notion of a fundamental architecture shift in such a large project achievable once the ideological arguments win out.
@coballast to that end, I would remove the changes to src/Three.js if it works just the same.
Note: not just revert, but remove those changes from the branch's history via a new branch or a force push
@coballast I wonder if it would make more sense for the conversion utility to be not a fork of three.js, but an external utility that you point at the three.js development dir, and it converts the source files, adds a build script, and runs the tests.
@kumavis I agree with leaving the src directory alone. I think the thing to do is have the utility write a duplicate directory with the commonjs code, and we can test and run the browserify build from there to make sure the examples all work before we try doing anything major.
There is also an interesting opportunity here to write some static analysis stuff that will automatically check and enforce consistent style across the entire codebase.
@coballast sounds great.
There's a wealth of tools out there for automated code rewriting, e.g. escodegen. Need to make sure we're maintaining comments, etc.
Want to start a threejs-conversion-utility repo?
@coballast that said, maintaining a sharp focus for this utility is important
@kumavis Consider it done. I really want this to happen. This is only one of two projects I am working on at the moment.
@kumavis @mrdoob Some of the discussion here seems to be around the idea of splitting THREE into multiple separate modules that could presumably be installed via npm and then compiled with browserify. I am not outright against the idea, but I think that should be a separate issue. The only thing I am advocating at this point is using browserify to manage THREE's codebase internally. Get it moved over, and get it working the same way it always has for users, and then evaluate what makes sense.
I'll be curious to see what's the output of that utility ^^
@coballast link a repo for us to track, even if its just empty at this point. We can build from there.
https://github.com/coballast/threejs-browserify-conversion-utility
The code is a mess, will clean up soon.
here we go! :rocket:
I now have the utility in a state where it generates browserify src and will build it without problems. I will update the repo with instructions on how to do this yourself. At this point, the examples don't work. There are several issues that need to be dealt with to fix that. I will add them to the repo if anyone wants to roll up their sleeves and help out.
@coballast yeah please file the issues as TODOs, and I or others will jump in as we can.
Serious problems have come up. See #6241
Here is my analysis of what would need to happen in order for this to work: https://github.com/coballast/threejs-browserify-conversion-utility/issues/9#issuecomment-83147463
browserify is at the very least transport-redundant (congestive) due to its design. This makes its use costly (data plan, anyone?) and slow.
A simple remedy to this is to separate document code from library code, which would entail two client files and not one. This is common practice for client-side js.
If at the outset browserify is faulty and itself needs to be fixed, I can hardly see why it should even be considered to improve anything at all, let alone something like threejs.
@spaesani Because the data for threejs has to be downloaded anyway. If we split threejs into smaller modules and let an automated build system cherry pick what it needs for a single app, actually most threejs apps out there would be lighter.
If for some reason you still want to separate "document from library code", you could still do this and use a pre-built version like we do now. You could even split your browserify-built app into separate modules by using the --standalone and --exclude flags.
Browserify is just a way to use a battle proven module definition API (CommonJS) on the browser. It would greatly simplify the development of threejs plugins and enhance code clarity and therefore productivity, it would allow us to integrate into a bigger ecosystem (npm) where the code is inherently maintained by more people while still maintaining integrity through the versioning system (think about the stackgl family), and it wouldn't even force people into CommonJS if they don't want it.
Of course there are downsides, but they're not the ones you've mentioned.
three.js and three.min.js can be cached to save on transport (data), be it by a proxy (the common mobile solution) or a caching browser.
The moment you cherry-pick and aggregate the threejs code with document-specific code, that caching is no longer feasible.
If browserify allows one to
@spaesani It (browserify) improves things for humans. My own psychological happiness and well being are more important than how easily a machine can load and execute stuff.
A lot of mobile network load issues will be ameliorated somewhat by things like http/2. Problems like that are best solved by modifying layers that are lower on the abstraction stack. Performance issues should not keep us from following software engineering best practices like modularity/separation of concerns etc.
I found this Issue because my team has recently started using jspm. We are able to import threejs (I believe this is because the main file has been browserified). I was looking to see if anybody had made threejs into es6 modules, primarily because of the build feature of jspm (bundling all dependencies into a single file, but only grabbing dependencies that are used).
While it is great that mrdoob keeps the size of threejs under 100kb, I find that most of my projects don't use much of the codebase (it feels like most of it goes unused, but I have not tried to measure): CubeCamera, OrthographicCamera, CanvasRenderer, various Lights, Loaders, Curves, Geometries, Helpers, etc. Additionally, I find that most of my projects include a few of the example scripts.
My hope was that it would be possible to have a single location of all these modules (the ones normally bundled with threejs as well as the ones in examples), and simply import the ones I need, then when I bundle the project, it results in a file much smaller than the original threejs, even though it contains many parts that weren't originally included.
I also wanted to add that if threejs were built using browserify modules, it would add a small file-size overhead (tiny compared to the current 403kb at r70), but it would also remove the usage of the global THREE variable from the code, thereby allowing variables like THREE.Geometry to be minified by closure.
I did a quick test doing a find-replace to get rid of the THREE object (so all of its children polluted the namespace) and wrapping the whole file in an IIFE, then ran the whole thing through google closure. The resulting file (non-gzipped) was 238kb, down from 777kb.
While the results will vary, I think it is definitely worth making sure this can happen.
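As a rough sketch of that experiment (names illustrative, not the actual find-replace output): properties hung off a namespace object cannot be safely renamed by a minifier, while local variables inside an IIFE can.

```javascript
// Namespace style: THREE.Geometry is a property access everywhere, so a
// minifier must preserve the name "Geometry" on the object.
var THREE = {};
THREE.Geometry = function () {};

// IIFE style: Geometry and BoxGeometry are plain locals, free to become
// "a" and "b" after minification; only the deliberately exported surface
// keeps its names.
var bundle = (function () {
  function Geometry() {}
  function BoxGeometry() {}
  BoxGeometry.prototype = Object.create(Geometry.prototype);
  return { Geometry: Geometry, BoxGeometry: BoxGeometry };
})();
```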
thanks for sharing - many good points there, and familiar to us too. we also never use many renderers in one project, but do use lib-type things from the examples.
didn't realize that about minifying - that's a pretty big difference.
and es6 modules have sure seemed promising -- has been also good to hear that there's a path there from the AMD/CommonJS/such module pattern & lib usage too.
@colin Not sure I follow you on how browserify helps your psychological happiness.
Is it in the documentation?
browserify is a carrier cash cow...
Now, if browserify would automatically determine dependencies without a require statement... hey wait...
@spaesani I actually prefer the explicit dependencies -- helps you understand how the code all fits together.
browserify is a carrier cash cow...
@spaesani browserify overhead is infinitesimal. It will actually help people create more minimal builds.
Status update:
I have a branch with some browserified code up:
https://github.com/coballast/three.js/tree/browserify
Please keep in mind this is very much a work in progress. This code was autogenerated and is going to look pretty awful for a while as a result. I am still trying to fix some build problems. See coballast/threejs-browserify-conversion-utility#10 if you think you can fix it. It was building for a while, but isn't now.
@kumavis and I have been working on addressing some runtime issues (and improving the software architecture as well). I believe I mentioned it above somewhere.
Sorry for the long reply. TLDR: jspm/es6 runs, but has some weirdness: 1) Exporting objects before defining them; 2) Exporting objects containing a single class, rather than just exporting the single class; 3) IIFEs using circular dependencies; 4) File structure.
I played with your browserified branch in jspm (@spinchristopher above is me) and have some notes, though firstly: Wouldn't it be good to open up issues on that fork, so this thread is not full of them and they aren't jumbled together?
Though it runs, it does not actually output the correct result (using the simple demo from the getting started guide). It creates the canvas and fills it black (unless I set the clear color to transparent), but it does not actually render the cube. I am, however, not expecting it to work at this point, as this is still very early in the process.
I ran into 3 main issues:
One. This was the most annoying one, and I am honestly not sure how you even get anything to compile with this particular error, as it should not work at all. Many of the files begin like this (this is fine because function definitions are hoisted to the top, and are effectively defined before the module.exports line runs, even though it appears first):
module.exports.Foo = Foo;
function Foo() {}
The problem comes in the many files which are like this (the first one I saw was math/Math.js). The variable declaration is hoisted (which is why there is no reference error) but the assignment stays in place (so the export is undefined).
module.exports.Foo = Foo;
var Foo = {};
The only fix I found here is to move the exports line to the end, or to rewrite it like this (preferred):
var Foo = module.exports.Foo = {};
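The difference is easy to demonstrate; this sketch simulates all three export patterns with plain objects standing in for module.exports:

```javascript
// Function declarations are hoisted with their bodies, so exporting before
// defining works:
var exportsA = {};
exportsA.Foo = Foo;
function Foo() {}

// `var` declarations are hoisted WITHOUT their initializers, so the same
// pattern silently exports undefined:
var exportsB = {};
exportsB.Bar = Bar; // Bar is undefined at this point
var Bar = {};

// The combined declare-and-export form avoids the trap:
var exportsC = {};
var Baz = exportsC.Baz = {};
```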
Two. The exported data. When dealing with modularized files, the standard is that each file exports a single object. While most of the files are this way (though some export more), they do not export the single constructor; they export an object containing that constructor (i.e. module.exports.Foo = Foo; rather than module.exports = Foo;, the latter being the way all the browserify examples work). Thus, when using the requires, you must step a level deeper: var Vector3 = require('../math/Vector3').Vector3;. Besides being unnecessary, there is no good way to deal with this when importing into es6 (import Vector3 from '../math/Vector3'; var vector = new Vector3.Vector3();). While there is a way to grab a particular named export in es6, it does not apply when importing browserified modules, and it would still have the same redundancy (import { Vector3 } from '../math/Vector3';). There are a few files which simply collect other objects (the obvious one being Three.js), but these should be kept to a minimum, and really should only be used for the build process, not as a way to grab lots of things in production.
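The two export shapes can be sketched with plain objects standing in for module.exports (illustrative only, not the real module files):

```javascript
function Vector3() {}

// Shape the converted code currently produces: a namespace-style export,
// forcing consumers to step one level deeper after require().
var namespaceExports = {};
namespaceExports.Vector3 = Vector3;
var FromNamespace = namespaceExports.Vector3; // require('../math/Vector3').Vector3

// Preferred shape: the module's value *is* the constructor, so
// require('../math/Vector3') would hand back the class directly.
var directExport = Vector3;                   // module.exports = Vector3
var FromDirect = directExport;
```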
Three. This has to do with the circular dependencies. System.js (the module loader used by jspm) can handle circular dependencies just fine, but there is one issue. In many places, the code reads something like the following. The problem is that, though Vector3 has been passed in as a dependency, at this point in time it has not been fully loaded (as Vector3 also includes this file, each cannot resolve until the other has resolved) and cannot be created. I added a very bad fix (shown below), though I'm not sure how this would be fixed better. It seems this is an architectural problem that may not have a simple solution. It happens many many times. It appears to be an optimization to prevent creating a new Vector3 each time the function is called. If there is indeed a significant performance hit that cannot be fixed by optimizing Vector3, then perhaps add a function to Vector3 to return an unused vector3 which is released later?
Foo.prototype.bar = function() {
var vector = new Vector3();
return function() {
// some data which reuses vector repeatedly.
};
}();
The fix:
Foo.prototype.bar = function() {
var vector;
return function() {
if(!vector) vector = new Vector3();
// some data which reuses vector repeatedly.
};
}();
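Here is a runnable version of that fix, with the circular require simulated by a binding that is still undefined when the prototype method is installed (names are illustrative):

```javascript
var Vector3; // stands in for: var Vector3 = require('./Vector3') inside a cycle

function Foo() {}

// Eagerly caching `new Vector3()` here would throw, because Vector3 is still
// undefined at module-evaluation time. The lazy variant defers construction
// to the first call, after the cycle has resolved.
Foo.prototype.bar = (function () {
  var vector;
  return function () {
    if (!vector) vector = new Vector3();
    return vector; // same cached instance on every call
  };
})();

// Later, the other side of the cycle finishes loading:
Vector3 = function Vector3() { this.x = 0; };
```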
Finally, I wanted to add a bit about file organization. This is obviously something to be tackled after the current set builds properly, but I wanted to bring it up now. While the current file structure works, parts of it are rather odd or even awkward. The major groupings (cameras, materials, geometries, etc) seem to have it well, though I would make some changes, as seen below. I would also move the globals in ThreeGlobals each to the thing for which they are a global, i.e. FrontSide, BackSide, DoubleSide all belong with Material (along with NoShading, FlatShading, and SmoothShading. Actually, it seems that most of them do...).
My primary confusions came with core and extras. core/Geometry.js should be in the geometries folder, just like Material is in the materials folder. But there isn't a geometries folder, it is in extras. Incidentally, extras also has a core and a geometries, but the base geometry is not in this folder. There is this large collection of helpers, but should not each helper be with the thing it is helping? Build processes can easily be configured to only take the files you want, so there is no excuse to put the non-important files elsewhere.
I currently have a line in my code reading import BoxGeometry from 'threejs/extras/geometries/BoxGeometry'; and var geometry = new BoxGeometry.BoxGeometry( 1, 1, 1 );. Being accustomed to using new THREE.BoxGeometry(), it took quite a while to find the file. When using the library modularly, file location is equally as important as something like a function signature.
General changes I would make to the file structure. These apply in many places, but I am only showing an example. (As a note, I have always preferred a model of naming the file after the single class it exports, and placing the file in a folder of the same name. Any direct descendants would generally go in that folder, again each in a folder with its own name. If one does not like this pattern, the same structure below still applies, just without that extra level. Also, any file loader can easily be modified (the hooks are usually provided directly, in fact) to automate the layering and simplify the require statements.) (I also prefer all lower case file names, with underscores when necessary.)
Three.js - This is _only_ used in the build process, so it should actually be with the build files, but run as if it is in this location.
geometry/geometry.js - Currently at core/Geometry.js
geometry/face3/face3.js - from core/Face3.js
geometry/box_geometry/box_geometry.js - Currently at extras/geometries/BoxGeometry.js
geometry/circle_geometry/circle_geometry.js - Similar to above.
geometry/utils/utils.js - from extras/GeometryUtils.js
camera/camera.js
camera/cube_camera/cube_camera.js
camera/perspective_camera/perspective_camera.js
camera/helper/helper.js - or camera/camera_helper/camera_helper.js
scene/scene.js
scene/fog/fog.js
scene/fog_exp2/fog_exp2.js
I would also likely rename math to utils (each category may also have a utils, like geometry above) so that it can hold more than just math (many of the things from core).
@HMUDesign @spinchristopher thanks for the great analysis! Best to put these kinds of issues into the coballast/threejs-browserify-conversion-utility repo in the future.
ok let me properly read your comment now.
I somehow missed the link to that repo above. I will happily move my issues to that repo tomorrow, splitting them as necessary (I noticed that at least some of it is already there).
Yes, and I do that for util-type libraries of my own (*Util files are the only time I do not have a default export, though I will sometimes add an export in addition to the default); however, this syntax only works when you export multiple named variables. The loader for commonjs modules treats module.exports as a default export, which can not be destructured in the import =(
On Apr 9, 2015 12:12 AM, kumavis notifications@github.com wrote:
in es6 you can import properties off of the export objects via destructuring: import { Vector3 } from '../math/Vector3';
that said, i agree that one export per module is preferred.
@HMUDesign thanks again for your energy and analysis -- here we are constructing a todo list, feel free to jump in and stuff. https://github.com/coballast/threejs-browserify-conversion-utility/issues/17
+1 for browserify.
Also +1 for moving shaders into separate files using glslify.
Also +1 for adopting some ES6 features - like classes and modules. The new build stack will allow us to compile back down to ES5 if necessary. See example:
import Object3D from '../core/Object3D';
import Geometry from '../core/Geometry';
import MeshBasicMaterial from '../materials/MeshBasicMaterial';
class Mesh extends Object3D {
constructor(
geometry = new Geometry(),
material = new MeshBasicMaterial({color: Math.random() * 0xffffff})
) {
super();
this.geometry = geometry;
this.material = material;
this.updateMorphTargets();
}
}
export default Mesh;
@lmcd While we're at it, we can use es6 modules and use babeljs to compile everything.
@coballast I'd be interested in branching off your browserify branch and making some of this stuff happen
@lmcd I wouldn't bother. I'm going to develop automated tooling to move es5 stuff to es6 automatically. It makes sense, since there is a huge amount of es5 code out there, and the labor requirement of moving it all over by hand is astronomical.
@coballast I was thinking more about running a 5to6 pass: https://github.com/thomasloh/5to6
@lmcd oo nice find
but tbh, seems like it would be easier to rewrite three.js from scratch : P
@lmcd I am so happy you found that. It was one of those things I was going to do from bare necessity, but sounded distinctly not fun.
@mrdoob what are your current thoughts on this issue?
@anvaka At the moment I'm focused on refactoring WebGLRenderer. I don't have more mental bandwidth 😅
I recently encountered some project that was using es6 modules happily with the babel polyfill that was mentioned here already too. Couldn't remember nor find yet now what it was, looked good to me anyhow.
Also es6 seems to be finished now on the standards front: "Finally, ECMA-262 Edition 6 got officially approved and published as a standard on June 17, 2015" says https://developer.mozilla.org/en-US/docs/Web/JavaScript/New_in_JavaScript/ECMAScript_6_support_in_Mozilla
Just a note that regarding the tooling and stability of the module spec the situation seems good on that front.
I recently encountered some project that was using es6 modules happily with the babel polyfill that was mentioned here already too. Couldn't remember nor find yet now what it was, looked good to me anyhow.
It is also 47kb of extra js and reads and transpiles your included javascript to es5 in the browser, so not great for start up time.
There's nothing stopping anyone who uses three.js from using es6 in their own code; however using it in the library would introduce browser compatibility and performance issues across the board.
Ah - standing corrected here, thanks for the info. I guess browserify and such which work in the build process are better now still then.
Ah - standing corrected here, thanks for the info. I guess browserify and such which work in the build process are better now still then.
One of the main issues with es6 is that it has breaking syntax changes with es5; for example the fat arrow => is invalid es5 and will cause the javascript parser to fail and bail trying to compile the code. Hopefully someone will come up with a way round this, but they haven't yet.
I was actually thinking just of the module system, the import statement etc.
It is also 47kb of extra js and reads and transpiles your included javascript to es5 in the browser, so not great for start up time.
for example the fat arrow => is invalid es5 and will cause the javascript parser to fail and bail trying to compile the code
The es6 code can be transpiled to es5 during _build_ with no runtime penalty. It's just a matter of adding a babel step to the build pipeline.
For instance, the arrow function can be transpiled to es5 during build with no polyfill or runtime penalty. Babel will transpile this es6 snippet
function MyObj() {
this.step = 1;
this.increment = function ( arr ) {
return arr.map( v => v + this.step );
}
}
into this portable version:
function MyObj() {
this.step = 1;
this.increment = function (arr) {
var _this = this;
return arr.map(function (v) {
return v + _this.step;
});
};
}
Other features, like es6 classes, will generate a small polyfill (you can see it in the babel repl http://babeljs.io/repl/).
@mrdoob understood. Would you be supportive of the idea to break up three.js into smaller modules hosted on npm?
The main three.js repository will remain unchanged: the consumers will not have to build anything. More experienced users would be able to cherry-pick required bits of three.js.
That sounds good. I don't know the details though.
The es6 code can be transpiled to es5 during build with no runtime penalty. It's just a matter of adding a babel step to the build pipeline.
For instance, the arrow function can be transpiled to es5 during build with no polyfill or runtime penalty. Babel will transpile this es6 snippet
ES6 transpiling still comes with a significant runtime penalty (http://www.incaseofstairs.com/2015/06/es6-feature-performance/), which matters especially for a high-performance library.
modules/npm/browserify etc, however, is probably a good idea
+1 on proper modularization using ES6 modules
+1 on proper modularization using any sane module system ( commonjs, amd, es6 )
I think commonjs and amd are the preferred choices b/c they don't necessitate transpiling
They do necessitate a build step, however, which is equivalent to the transpile step.
Using ES6, besides being the next version of the language, would allow the use of the new features as desired, but not break the existing native code.
Is it really wise to refactor the code base into a system that is already not the latest?
They do necessitate a build step, however, which is equivalent to the transpile step.
building and transpiling are not equivalent in terms of the complexity they introduce.
Is it really wise to refactor the code base into a system that is already not the latest?
commonjs is simple and works great. im not sold on the notion that 'latest' === 'better'.
Using ES6 [...] would allow the use of the next features as desired
so this is worth considering. do we want es6 features? if we start transpiling es6, people will start PR'ing es6. as @benaadams suggested there are non-intuitive performance impacts on using es6 features.
furthermore, we don't need to conflate the issues of 'module system' and 'es6 features'. you can transpile es6 and use commonjs. and you can introduce those separately.
+1 for browserify / commonjs - it's straightforward to compile with browserify in such a way that people can still use the library in the traditional way if they want to - this is what UMD is for, allow AMD requires (like require.js), CommonJS requires (like node + browserify) and a window global (for script tags) depending on the environment we're running in.
PIXI.js has just moved to a modular architecture using browserify and the library was set up in a very similar way - everything attached to a global PIXI object. Their setup is very much like @kumavis describes in the second post.
Neither browserify nor commonjs address the specific needs of a 3D engine, which doesn't mean they cannot be used, but they should be seen as one piece of a greater puzzle:
Components need to export meta information about their instances' properties, so there can be a loader for arbitrary objects and shared memory data connections. With such an architecture, it would be just a matter of common sense to also load out-of-core component code upon first use. Been brainstorming on these topics in #6464 and #6557.
+1
+1
As a hybrid solution, it's also possible to add the requirements as comments. I know browserify is already in many ecosystems, but I just wanted to leave this here as a fast-to-implement variant :) because you wouldn't have to change anything. You just need to add a comment on top of every file.
As a developer you could then easily create your own min.js files from the @requires information with a gulp plugin.
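A sketch of how a build tool (e.g. a hypothetical gulp plugin; the @requires convention itself is the suggestion above, not an existing three.js feature) could recover the dependency graph from such header comments:

```javascript
// Each source file would start with comment-only declarations, e.g.
//   // @requires src/core/Object3D.js
//   // @requires src/core/Geometry.js
// and a build step could extract them without touching the code itself.
var source =
  '// @requires src/core/Object3D.js\n' +
  '// @requires src/core/Geometry.js\n' +
  'function Mesh() {}\n';

function parseRequires(src) {
  var deps = [];
  var re = /^\/\/ @requires (.+)$/gm;
  var match;
  while ((match = re.exec(src)) !== null) deps.push(match[1]);
  return deps;
}
```

The resulting list would drive file ordering for a simple concatenation build.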
Hi all, I recently had a similar need for loading arbitrary resources with their own dependencies in a clean and structured way, and I came up with the solution of writing require.js plugins for every type of resource. This way I let require.js dependency resolution take care of properly downloading and caching the resources... At the same time, this approach creates reusable resource 'bundles' that I can use in various of my projects.
If you are interested, you can find the project here: https://github.com/wavesoft/three-bundles
(An example using this library will be available soon)
In the future I am planning to include an optimisation phase in these plug-ins in order to allow the require.js optimiser to compile the resources into an even more compact format.
Looking at this conversation https://twitter.com/defunctzombie/status/682279526454329344 it doesn't seem like es6 modules are going to be implemented in near future. Something to keep in mind.
I did some prototypes with commonjs modules and browserify.
My final browserified bundle includes every single file of the src folder and results in a 962K file size (compared to the original unbrowserified version at 885K).
Targeted build of the cloth example is:
- 580K (~44% smaller)
- 431K (~8% smaller)
Here is the bundle size breakdown: http://output.jsbin.com/yogoxawozu. Renderers take 40% of the bundle, and 10% of that is taken by the shaders library.
I think we could reduce the size of the bundle by:
- Removing instanceof checks - they require explicitly referencing modules even when they are not used. I see some of the classes already have type - we could use this across the library.
- Using glslify - it was brought up a couple of times, and it could definitely help. Ideally each component that requires shaders needs to explicitly depend on a shader.
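For the instanceof point, a type check keeps modules decoupled; a hypothetical sketch (names illustrative):

```javascript
// With instanceof, the renderer must require() DataTexture just to branch on
// it. A string `type` set by the constructor removes that hard dependency.
function DataTexture() { this.type = 'DataTexture'; }

function needsRawUpload(texture) {
  // no `texture instanceof DataTexture`, and so no require('./DataTexture')
  return !!texture && texture.type === 'DataTexture';
}
```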
git clone --depth 1 --branch commonjs https://github.com/anvaka/three.js.git
cd three.js
npm i
# build backward compatible three.js library from commonjs modules.
# The output will be saved into `build/three.min.js`. I'm using `.min.js` just
# to quickly verify examples. The actual file is not minified.
npm run build
# build cloth example
# the output is saved into ./examples/cjs/webgl_animation_cloth.bundle.js
npm run demo
it doesn't seem like es6 modules are going to be implemented in near future. Something to keep in mind.
That's true, but it's also important to realise that CommonJS module support will _never_ be implemented in browsers, so the choice is between:
- continuing with a non-modular architecture and an ad hoc build system, which has served Three.js well so far but acts as a brake on growth in the long run
- using CommonJS modules, which involves some trickery around cyclical dependencies and results in a larger build, or
- using ES6 modules, which are well suited to a codebase like Three.js that has cyclical dependencies, and which result in the smallest and most minifiable build possible. Eventually, browsers will support them natively, and the changes required to accommodate any unforeseen quirks in the loader spec will most likely be trivial compared to the effort involved in upgrading from a CommonJS codebase.
Libraries like D3 are adopting ES6 modules because they can already do everything CommonJS modules can do (except run natively in Node.js, which isn't really a concern for a library like Three.js), and result in smaller builds.
I've done some experimenting over at https://github.com/rollup/three-jsnext, and while it's not production ready (I need to spend a bit more time porting examples etc!) the UMD build it generates is actually _smaller_ than the current build.
I would agree with the note about es6 modules vs other systems. Whether or not they are an es standard yet, they are a community standard. And though they cannot run "natively" in node, they can look native with Babel hooks. I will be following up with your repo soonish.
Also, the fact that the build is smaller was something I had brought up earlier in this conversation. "THREE.Geometry" becomes "geometry", which can be minified to "a", for example.
Also, the solution to the instanceof checks is to remove them altogether. One module should not behave differently depending on what it was given, but rather defer to the provided thing to do as it needs to do. Then there are no instance or type checks.
Would CommonJS still result in a larger build with a codebase that doesn't have cyclical dependencies?
@cecilemuller Yes – see https://github.com/nolanlawson/rollup-comparison. With CommonJS modules you pay a per-module cost (each module needs to be wrapped in a function, and needs to redeclare imports that are shared throughout the bundle, so you're penalised for a more modular codebase), a per-bundle cost (it needs to simulate a Node.js environment), and other costs such as unminifiable object property names that would be minifiable variable names in ES6 modules. ES6 modules allow you to bundle with literally zero overhead.
Though there would be some overhead in transpiling them to ES5. At present, I use webpack with Babel, which adds very little. There is a per-module cost, as each module is also wrapped in a function. Dependencies are declared in the final code by calling a require-like function with an integer index, so it gets minified to something like "var a=f(5)" from what was originally 'import Geometry from "./geometry";'
Using generators also adds a bit more, but I don't imagine the structure of the code would be changing that much any time soon.
Though there would be some overhead in transpiling them to ES5
If you're only using import and export syntax to describe the relationship between modules, there's no need to transpile the code itself with Babel or anything similar. It's only when you start adding other ES6 features (like classes and block scoping and arrow functions etc) that transpiling becomes necessary, so there's zero overhead to using import and export. D3 and PouchDB are two examples of libraries that use import and export but are otherwise Babel-less ES5, and three-jsnext is done the same way.
Ok, we all had the same idea. It would be great to have a story like lodash.
I propose to create a _three-foo_ package for every component _foo_ (for instance three-vector2) that can be modularized, with almost no change in the code, so it can be imported in this repo with no impact.
People that publish on npm should play well and collaborate with @mrdoob since he is the creator of this great piece of software, so if he wants to centralize all packages again like the _babel_ author did (I mean all core packages in the same folder), the publishers should give him control of the npm namespace taken.
I will try to do that, for the packages I need. Let's see what happens.
There is only one big community :)
I didn't notice anybody suggesting to fully separate the library like lodash. Lodash is a collection of utilities under a common name; you can grab a single piece of it and use it. Three.js is not; it is a comprehensive library, most of which is useless without the rest. There are a few pieces which can be separated, such as the specific material types or specific geometry generators, but those are necessarily going to be very closely tied to the core, likely needing exact version matches. Considering their size, it would create a maintenance headache with no measurable gain.
Should mr doob approve a split of that nature, however, I do not think it
appropriate for anybody but an official maintainer to be claiming threejs-*
packages.
Regardless of the above, I think it prudent to get it working in a modular
environment before anything else. There were several projects with this
goal, but all seem to have floundered.
@mattdesl: for example, your three-shader-fxaa uses THREE.Vector2 and THREE.Texture; aside from three-effectcomposer, which I did not check, it would result in a really light build using the modularity approach proposed above.
@HMUDesign: I understand your doubts, but it still seems to me a good approach. I want to try, but I will listen to your advice, using npm packages from GitHub URLs, not publishing them on the holy registry.
I gave it a try, starting from TrackballControls, which depends on Vector2, Vector3, Quaternion and so on.
There are circular deps (for example Matrix4 depends on Vector3 and vice versa). It cannot be done if the lib (i.e. three.js) was started as a monolith.
It is a pity that the module pattern, with all its benefits, cannot be applied easily to important projects like this.
I tried also with other projects like svg.js, vvvvjs, even x3dom, but if the authors are not fully convinced of this radical choice it cannot be done.
Sorry for the spam, but I wanted to try proactively: by the way, I started with the three-trackballcontrols repo.
@fibo The ES6 module pattern shouldn't have problems with circular dependencies afaik. Aren't bindings set up before the module executes, just like hoisting in plain JS?
I'm positive you have seen this: https://github.com/kamicane/three-commonjsify He solved it in commonJS.
@drcmda really interesting, I will give it a try
+1 for moving to a modular architecture.
+1
+1
@drcmda is right. ES6 modules have an initialization step and an execution step, which allows circular references. However, as soon as you have circular dependencies used directly in the module execution context (in the top-level area of a module), the first one loaded at runtime will see undefined values for its dependencies. As long as the references are only used in a different context (for example inside functions that run later), circular dependencies are no problem.
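The hoisting analogy mentioned above can be shown in plain JS: both functions reference each other before either body has run, which mirrors how ES6 module bindings are created in the initialization step and only dereferenced during execution (plain-script sketch, not actual module code):

```javascript
// isEven refers to isOdd before isOdd's declaration is reached in
// source order; hoisting makes the binding exist up front, and the
// body only runs later - analogous to ES6 module init vs execution.
function isEven(n) { return n === 0 ? true : isOdd(n - 1); }
function isOdd(n)  { return n === 0 ? false : isEven(n - 1); }

console.log(isEven(10)); // true
console.log(isOdd(7));   // true
```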
I suggest to consider also webpack instead of browserify.
@gionkunz we have circular references in the initialization step because of the pattern where there's a closure to generate scratch variables
The beta of Webpack 2 has just been released (https://twitter.com/TheLarkInn/status/747955723003322368/photo/1), so es6 modules could also benefit from tree shaking when bundled.
@mrdoob
Has there been an official statement more recently? Like many, we abandoned ES5 and glue-concats a long time ago, and it is quite bad how much THREE falls out of line in a modern build system. We use maybe 10% of what it can do, yet it is the biggest dependency we ship.
This is maybe _the_ most favourite project on GitHub for me personally - I sincerely hope priorities will be re-considered.
Hmm, I would like to know more details about browser support. Which browsers do and which don't. For the browsers that don't, what are the workarounds and what are the performance penalties.
Actually, browser support becomes a non-issue (perhaps even less so than it is now). The build systems take that ES6 code and transpile it to es5 (sometimes taking up less space than the original ES5 would have). Certain kinds of transpiled things end up being large (primarily: generators and async functions), but if you avoid those, you won't have that penalty.
As @drcmda mentioned, the build system would still produce a monolithic output (and it would be very easy to customize exactly what is included in that output), but the individual modules could also be included in our own projects, thus only using the parts that we need. To take full advantage of that, inter-dependencies need to be adjusted, but that can happen over time. I think the main feature we want is to have it modularized with import/export. From your point of view, it would enable the use of classes over prototypes (they still use prototypes under the hood, so you can still mess with them as necessary).
There are a few build systems. My vote would be webpack (which uses Babel for the transpiling). With webpack, you can define custom loaders, so the chunking system you developed for shaders could be reduced to actual GLSL code with a #include extension (I actually do my shaders this way, and would be happy to contribute it to the project). This gets the same benefits of your system (no code duplication), but is very simple to use.
I would love to be part of the modularization project, but I do know that this will not be successful in any way without your support (and likely assistance). Many of us know how to use the library, but none of us know how it works internally to the extent that you do.
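A #include-style loader of the kind described is small at its core. The sketch below is hypothetical (the chunk registry, names, and regex are made up, not three.js's actual chunk system): a loader is just a function from source text to transformed source text.

```javascript
// Tiny string-level include resolver - the essence of a webpack-style
// GLSL loader. The chunk registry here is purely illustrative.
var chunks = {
  common: 'vec3 toLinear( vec3 c ) { return pow( c, vec3( 2.2 ) ); }'
};

function glslLoader(source) {
  // Replace every `#include <name>` with the registered chunk text;
  // unknown names are left untouched.
  return source.replace(/#include <(\w+)>/g, function (match, name) {
    return chunks[name] || match;
  });
}

var out = glslLoader('#include <common>\nvoid main() {}');
console.log(out.indexOf('toLinear') !== -1); // true
```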
Certain kinds of transpiled things end up being large (primarily: generators and async functions), but if you avoid those, you won't have that penalty.
How large?
Also, you didn't talk about performance penalty. Is that not an issue then?
As far as I can see, ES6 Imports are still not supported by any browser, so this module refactor would mainly be for build systems, right?
Don't forget about the benefits you get by using tools like rollupjs, this would automatically exclude all the exports a user doesn't use. (Which is default with JSPM)
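As an illustration of what that exclusion looks like, the flat output below is roughly what Rollup would emit when an app imports only square from a hypothetical two-export module (hand-written sketch, not actual Rollup output):

```javascript
// Input (math.js, hypothetical):
//   export function square(x) { return x * x; }
//   export function cube(x)   { return x * x * x; }
// The app only imports square, so cube never reaches the bundle:
function square(x) { return x * x; }
console.log(square(5)); // 25
```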
The babel-polyfill package, which is only necessary if you are using generators (which probably don't even make sense in this project) or async functions (which I don't really think would change much in the project either), adds around 50k to the final build. But again, this is optional.
As far as performance, it really depends on exactly which features you are using. For instance, arrow functions are a tiny bit slower, due to the underlying bind, and classes are a bit slower to create, though the instantiation time is the same. https://kpdecker.github.io/six-speed/
ES6 imports/exports are not supported by browsers, but since it goes
through a build system, that is a non-issue. The product output would be
usable exactly as it currently is (even being backwards compatible), but
would allow it to be integrated into our build systems, and make the
internal components reusable to us.
Another thing to note is final build size. Currently, things like Geometry, Material, Mesh, etc. are part of the THREE namespace. When minified, references to THREE.Geometry, THREE.Material, THREE.Mesh etc. remain in the code. With a modular system, each of those files would get something like var Geometry = require('./geometry'); and then have references to the variable Geometry later. Then at minification, Geometry and require are both switched to single characters, and the './geometry' is replaced with a number, resulting in quite a bit of savings. Napkin math: the minified build is 511,794 bytes and contains 2942 references to THREE\.[A-Z][a-zA-Z]+ . Replacing all of these with a single character results in an almost 10% file size reduction (down to 464,782). (The gzipped sizes are 117,278 and 110,460 respectively, a 6% reduction.) The build could likely be tuned to reduce this even further.
Rollup (which eliminates unused code from a final build) is the default with jspm, and will be the default with webpack 2 (and I believe it can be used with webpack today). If things are written modularly, I don't think this will be helpful, though. In any case, as long as the code can be transpiled with Babel, it can be used in any build system (the GLSL loader I mentioned before can also be made to work with webpack).
Not sure if this is terribly helpful, but this is the discussion thread for D3 regarding the same issue: https://github.com/d3/d3/issues/2220. D3 4.0 has adopted ES6 import/export to manage modules, but is still written in ES5 (https://github.com/d3/d3/issues/2220#issuecomment-111655235).
Very interesting @jpweeks!
So... with this import/export approach... what would stuff like object instanceof THREE.Mesh look like?
@mrdoob
import/export is just the way modules are declared and required. It will not affect/change the code defined within the modules at all:
src/Objects/Mesh.js
// Mesh class, stays the same as today (except the export part)
var Mesh = function ( geometry, material ) {
// ...
}
export default Mesh
src/Three.js
// Library entry point, exports all files using some bundling tech
// In a "THREE" namespace for browsers
// As import three from 'three' in node
import Mesh from './objects/Mesh'
export {Mesh} // All three objects, such as Geometry, Material etc..
Application.js
// In node
import {Mesh} from 'three'
var mesh = new Mesh(geo, mat)
console.log(mesh instanceof Mesh) // true
Client.js
// In a browser
var mesh = new THREE.Mesh(geo, mat)
console.log(mesh instanceof THREE.Mesh) // true
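Since the instanceof behaviour above hinges on both import styles resolving to the same constructor object, that identity can be checked directly. This is a plain-JS sketch of the bundled result (names follow the example above, not real three.js build output):

```javascript
// After bundling, THREE.Mesh and the imported Mesh are one and the
// same function object, so instanceof agrees in both styles.
var Mesh = function (geometry, material) {
  this.geometry = geometry;
  this.material = material;
};
var THREE = { Mesh: Mesh };

var mesh = new THREE.Mesh(null, null);
console.log(mesh instanceof Mesh);       // true
console.log(mesh instanceof THREE.Mesh); // true
console.log(THREE.Mesh === Mesh);        // true
```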
That's super helpful @GGAlanSmithee! Thanks!
I'm a visual person so pseudo-code examples convince me more than big chunks of text 😅
Right, so it will require a bit of refactoring...
Does anyone know if closure compiler is planning on supporting this?
Right, so it will require a bit of refactoring...
I got you! Since this thread got lively over the last couple of days I've been working a bit more on three-jsnext. It's a project that takes the existing Three.js codebase and turns it into ES modules automatically. Just wrangling with a couple of tricky cyclical dependencies (particularly around KeyframeTrack), but should have something to share real soon. As far as I can tell, all the examples continue to work, and the minified build is smaller than the current one (using Rollup to generate a UMD file), so it's all good news.
Ok, I've opened a pull request for this: #9310
@mrdoob
We have a library in production that more or less is structured like THREE. It works in browsers and modular environments. The codebase is ES6 but browsers are not your concern at all.
You would ship this on npm _as is_, all modules included + a compiled global-namespace browser monolith (three.js). Whoever needs to use single parts of it uses tools to create bundles.
Consider a structure like this:
/src
classA.js
classB.js
classC.js
/index.js
/browser.js
index.js simply re-exports all modules and functions in one file:
export { default as ClassA } from './src/classA';
export { default as ClassB } from './src/classB';
export { default as ClassC } from './src/classC';
So the end-user can npm install the lib and just use it without any further ado:
// all exports from index.js will be under: mylib.ClassA, etc.
import * as mylib from 'libname';
// selected exports from index.js
import { ClassA, ClassC } from 'libname';
// or, specific modules
import ClassB from 'libname/src/classB'
browser.js would be the only compiled part of the package. Usually transpiled to ES5 via Babel and exported into a global-namespace so it can be used as a script include. Rollup, Webpack, etc. can create this with ease.
@mrdoob its been a wonderful ride 🚀