I am working on a data visualization for the Harvard Art Museums, and I would like to show the entire collection of about 200,000 pictures. I have set up a PIXI.Container with PIXI.Sprites, optimized the PIXI.Application, and I am using pixi-viewport. So far the limit for smooth interaction is about 5,000 images; with 10,000 it is still acceptable but noticeably slower. The images are pretty small: each has a longest side of at most 500px and a file size between 15 KB and 70 KB, so I don't think they can be optimized much further.
Does anyone have any suggestions to get to 200,000 images?
Screenshot => https://github.com/rodighiero/Suprise-Machines/commit/077be559f1f597a8959bce05badd51ecbc4d02ed?short_path=4f3cbb8#diff-4f3cbb8a1d024a4d9ec7512c5417dbd3
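For context, a minimal sketch of roughly what a setup like this looks like. Names such as `collection` are placeholders, not the project's actual code, and the pixi-viewport options can differ between versions:

```js
import * as PIXI from 'pixi.js';
import { Viewport } from 'pixi-viewport';

// One sprite per image inside a pixi-viewport.
const app = new PIXI.Application({ resizeTo: window, antialias: false });
document.body.appendChild(app.view);

const viewport = new Viewport({
  screenWidth: window.innerWidth,
  screenHeight: window.innerHeight,
  worldWidth: 50000,
  worldHeight: 50000,
  interaction: app.renderer.plugins.interaction,
});
app.stage.addChild(viewport);
viewport.drag().pinch().wheel().decelerate();

// `collection` stands in for the 200k records (image url + layout position).
for (const { url, x, y } of collection) {
  const sprite = PIXI.Sprite.from(url); // one texture per image: this is what stops scaling
  sprite.position.set(x, y);
  viewport.addChild(sprite);
}
```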
At 200k images like that, each image is maybe only a pixel in size on screen. Or less.
You could use that to display them as monocolored particles instead, until you zoom in.
Because there's just no way you're rendering 200k different images all at once on commodity hardware. They won't even fit in video memory: if we assume they're all about 250x250 pixels, that's 250 × 250 × 4 bytes ≈ 250 KB each, so 200,000 of them would need roughly 50 GB.
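A rough sketch of that particle idea, assuming an average color has been precomputed per picture. `records` and `averageColor` are made-up names, and ParticleContainer tinting may need the `tint` property enabled depending on the Pixi version:

```js
// All dots share the same 1x1 white texture and differ only by tint and
// position, so a ParticleContainer can draw them in very few draw calls.
const dots = new PIXI.ParticleContainer(200000, { position: true, tint: true });
viewport.addChild(dots);

for (const { x, y, averageColor } of records) {
  const dot = new PIXI.Sprite(PIXI.Texture.WHITE);
  dot.tint = averageColor;    // e.g. 0x8a5a3c, precomputed per picture
  dot.width = dot.height = 4; // a few world units; real thumbnails only appear after zooming in
  dot.position.set(x, y);
  dots.addChild(dot);
}
```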
The idea is to rely on zooming, but I probably have to change approach at this point. I was thinking of combining a series of images at different resolutions, like Google Earth: when you zoom in, you retrieve a more pixel-dense image. Not sure if there is something already created in PixiJS.
P.S. I don't know if this helps, but all the images are static.
Since all the textures are static, would it be an option to render the entire scene as one bitmap? Thanks for helping!
Not sure if there is something already created in PixiJS.
Nope, you have to do it yourself.
Also, you are talking about file size, but what actually matters is the number of pixels: in video memory 1 pixel costs 4 bytes. You have to know how the browser and WebGL manage RGBA data. Here's one of my explanations: https://www.html5gamedevs.com/topic/45798-optimizing-memory-of-large-background-images-when-changing-maps/?do=findComment&comment=252643
You have to make your own manager that tracks all those images and their zoom levels, and decides which ones have to be removed from video memory to make room for others.
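Nothing like that ships with Pixi, so purely as an illustration of the idea (an LRU cap on how many full textures are alive at once; every name here is invented for the example):

```js
// Illustrative only: keep at most `maxLoaded` full textures alive, evict the
// least-recently-used ones so video memory stays bounded.
class TextureManager {
  constructor(maxLoaded = 2000) {
    this.maxLoaded = maxLoaded;
    this.loaded = new Map(); // url -> texture, kept in least-recently-used order
  }

  get(url) {
    let texture = this.loaded.get(url);
    if (texture) {
      this.loaded.delete(url); // refresh its LRU position
    } else {
      texture = PIXI.Texture.from(url);
      if (this.loaded.size >= this.maxLoaded) {
        const [oldestUrl, oldest] = this.loaded.entries().next().value;
        this.loaded.delete(oldestUrl);
        oldest.destroy(true); // true: also free the base texture from video memory
      }
    }
    this.loaded.set(url, texture);
    return texture;
  }
}
```

The idea would be that sprites only request a texture from the manager while they sit inside the visible bounds and above some zoom threshold; below that they fall back to the colored dots.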
Another thing: for good FPS, the overall number of distinct textures drawn in the scene has to be small. One texture drawn 100k times is no problem; 1,000 textures drawn once each is a problem. Your images are small, so you can combine them into an atlas at runtime (sketched below). Unfortunately, the plugin for runtime atlases is not yet ported to v4: https://github.com/gameofbombs/pixi-super-atlas
There's another plugin that uses runtime atlases and has a number of algorithms that would suit you, but it would have to be changed significantly: it was made for vector shapes, not for collecting a large number of regular images. https://github.com/gameofbombs/pixi-blit
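For images this small, a simplified hand-rolled version of the same idea is also possible without either plugin. A sketch, assuming thumbnails get baked into a shared RenderTexture (the `renderer.render` signature differs between Pixi versions, so treat this as pseudocode-with-APIs):

```js
// Bake many small thumbnails into one shared RenderTexture so that sprites
// built from its sub-regions batch into very few draw calls.
const ATLAS_SIZE = 4096;
const CELL = 64; // thumbnail size inside the atlas
const atlas = PIXI.RenderTexture.create({ width: ATLAS_SIZE, height: ATLAS_SIZE });

function addToAtlas(texture, index) {
  const cols = ATLAS_SIZE / CELL;
  const x = (index % cols) * CELL;
  const y = Math.floor(index / cols) * CELL;

  const temp = new PIXI.Sprite(texture);
  temp.position.set(x, y);
  temp.width = temp.height = CELL;
  // Older Pixi: renderer.render(temp, atlas, false); newer versions take an options object.
  app.renderer.render(temp, { renderTexture: atlas, clear: false });

  // A texture that points at just this cell of the atlas.
  return new PIXI.Texture(atlas.baseTexture, new PIXI.Rectangle(x, y, CELL, CELL));
}
```

Even then, 200,000 thumbnails at 64px need roughly 49 atlases of 4096 x 4096, which is on the order of 3 GB of video memory, so it only works combined with eviction or smaller cells.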
Hi Ivan,
Thanks for diving into this issue. As you probably noticed, there is a smaller community using PixiJS for data visualization.
Unfortunately, my skills don't allow me to do advanced programming, so I have to think of something simpler. I was wondering if a good compromise would be to create a large background canvas image (the size limit is around 32,000 x 32,000) and load the individual images at a certain level of zoom (a rough sketch of this idea appears below). The interface is indeed based on the viewport plugin (like this one: https://rodighiero.github.io/DH2020/). Here are three potential screenshots:
Do you think that is feasible?
Thanks,
Dario
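A hedged sketch of that compromise with pixi-viewport: one big pre-rendered overview bitmap while zoomed out, and individual sprites only for the visible region once the user zooms past a threshold. The threshold value, the `'zoomed'` event wiring, and `loadVisibleImages` are assumptions for illustration:

```js
// Show a pre-rendered overview bitmap when zoomed out; swap to real sprites
// for the visible region once zoomed in far enough.
const overview = PIXI.Sprite.from('overview.jpg'); // pre-rendered offline, kept under the GPU's max texture size
viewport.addChild(overview);

const detailLayer = new PIXI.Container();
viewport.addChild(detailLayer);

const ZOOM_THRESHOLD = 2; // viewport scale at which individual images become worth loading

viewport.on('zoomed', () => {
  const zoomedIn = viewport.scale.x > ZOOM_THRESHOLD;
  overview.visible = !zoomedIn;
  detailLayer.visible = zoomedIn;
  if (zoomedIn) {
    // hypothetical helper: adds/removes sprites for the images inside the visible world bounds
    loadVisibleImages(viewport.getVisibleBounds(), detailLayer);
  }
});
```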
Maybe I'll have time to look at it over the weekend.
I created an image measuring 13,998 x 13,998, rather small compared to the maximum allowed by the browser, but I cannot display it. Now I am out of ideas. Is there a pixel limit for loading PixiJS sprites?
13,998 x 13,998, rather small compared to the maximum allowed by the browser
Mobiles usually do 4k, PCs 8k. 16k is certainly big. Did you check the specific WebGL constant?
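For reference, that limit can be read directly from the WebGL context (assuming the WebGL renderer is in use):

```js
// The largest texture side the current GPU/driver accepts; anything bigger
// fails to upload even if the browser can decode the image file itself.
const gl = app.renderer.gl;
console.log(gl.getParameter(gl.MAX_TEXTURE_SIZE)); // commonly 4096 on mobile, 8192 or 16384 on desktop
```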