Loading Ply Files using FileAPI and WebWorkers

The embedded video shows a PLY file from the Stanford Repository being loaded by dragging and dropping the file into the browser window.
The PLY file contains 35,947 points in ASCII format. Loading and parsing initially happened in the main thread, but this made the browser completely unresponsive. I therefore moved the parsing into a WebWorker, which continuously informs the main thread about its progress.
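As a rough sketch of the worker-side parsing (the message shape and the positions-only parsing are my assumptions, not the exact potree code):

```javascript
// ply-parse-worker.js -- sketch only; parses just x/y/z from the ASCII body
// and reports progress back to the main thread in coarse steps.
function parseAsciiPlyBody(text, reportProgress) {
  var lines = text.split("\n");
  var positions = new Float32Array(lines.length * 3);
  var count = 0;
  for (var i = 0; i < lines.length; i++) {
    var tokens = lines[i].trim().split(/\s+/);
    if (tokens.length < 3) continue; // skip empty lines / header remnants
    positions[count * 3 + 0] = parseFloat(tokens[0]);
    positions[count * 3 + 1] = parseFloat(tokens[1]);
    positions[count * 3 + 2] = parseFloat(tokens[2]);
    count++;
    if (reportProgress && i % 10000 === 0) {
      reportProgress(i / lines.length);
    }
  }
  return positions.subarray(0, count * 3);
}

// Worker wiring (hypothetical message shape); only runs inside a WebWorker.
if (typeof self !== "undefined" && typeof importScripts === "function") {
  self.onmessage = function (e) {
    var positions = parseAsciiPlyBody(e.data, function (p) {
      self.postMessage({ type: "progress", value: p });
    });
    self.postMessage({ type: "done", positions: positions });
  };
}
```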

I was able to load point clouds with up to 5m points (~135 MB binary PLY) in Firefox. Chrome managed up to ~3.8m points, but loading 5m always failed. At the moment, the whole file is loaded at once into a single buffer. Splitting the loading into smaller parts might work around this limit.
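A chunked alternative could look like the following sketch (the 16 MB slice size and callback shape are assumptions, not the current implementation): `File.slice` lets each `FileReader` pass touch only one slice, so no single ArrayBuffer has to hold the entire file.

```javascript
// Pure helper: compute [start, end) byte ranges covering the whole file.
function sliceRanges(fileSize, chunkSize) {
  var ranges = [];
  for (var start = 0; start < fileSize; start += chunkSize) {
    ranges.push([start, Math.min(start + chunkSize, fileSize)]);
  }
  return ranges;
}

// Browser side: read one slice per FileReader pass (callback shape assumed).
function readInChunks(file, chunkSize, onChunk, onDone) {
  var ranges = sliceRanges(file.size, chunkSize);
  var index = 0;
  var reader = new FileReader();
  reader.onload = function () {
    onChunk(reader.result, ranges[index][0]); // buffer + byte offset
    index++;
    if (index < ranges.length) {
      next();
    } else {
      onDone();
    }
  };
  function next() {
    var r = ranges[index];
    reader.readAsArrayBuffer(file.slice(r[0], r[1]));
  }
  next();
}
// usage sketch: readInChunks(droppedFile, 16 * 1024 * 1024, parseChunk, finish);
```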

Procedural Surfaces

Here’s a demo showing customizable procedural surfaces:

And a video, in case the demo does not work:

Just change the code at the bottom and press the generate button to create a new surface.
Modify xStep and yStep to increase or decrease the sampling distance.
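Conceptually, the surface is sampled on a grid, so halving xStep and yStep roughly quadruples the number of points. A minimal sketch (function and parameter names are mine, not the demo's actual code):

```javascript
// Sample a height function f(x, y) on a regular grid.
// Smaller xStep/yStep => shorter sampling distance => more points.
function sampleSurface(f, xMin, xMax, yMin, yMax, xStep, yStep) {
  var points = [];
  for (var x = xMin; x <= xMax; x += xStep) {
    for (var y = yMin; y <= yMax; y += yStep) {
      points.push([x, y, f(x, y)]);
    }
  }
  return points;
}
```

With xStep = yStep = 0.5 on the unit square this yields a 3×3 grid of 9 samples.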

New pointcloud: tower

Found a nice pointcloud at the Robotic 3D Scan Repository:


points: ~137m
size: ~2 GB

Visibility calculations are quite heavy and since this cloud has a huge amount of nodes, I had to modify the process a little. For this particular cloud, visibility will be updated after every 5th frame.
I’m thinking about moving this task to a separate WebWorker thread. Visibility does not have to be up to date every frame. Instead, a WebWorker could check which nodes are visible and inform the main thread about changes once or twice per second.
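Such a worker could be sketched like this (the message shape, the 500 ms interval, and computeVisibleNodes are all assumptions about a possible design, not existing code):

```javascript
// visibility-worker.js sketch: the main thread posts the camera each frame,
// but the worker recomputes visibility at most every 500 ms (twice per second).
var lastUpdate = 0;
var updateInterval = 500; // ms

function shouldUpdate(nowMs) {
  if (nowMs - lastUpdate >= updateInterval) {
    lastUpdate = nowMs;
    return true;  // perform a visibility update
  }
  return false;   // skip; visibility stays slightly stale, which is fine
}

// Worker wiring; computeVisibleNodes is a hypothetical placeholder.
if (typeof self !== "undefined" && typeof importScripts === "function") {
  self.onmessage = function (e) {
    if (!shouldUpdate(Date.now())) return;
    var visibleNodes = computeVisibleNodes(e.data.camera); // assumed to exist
    self.postMessage({ type: "visibility", nodes: visibleNodes });
  };
}
```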

XYZ to potree converter


Here is a first take on a potree converter: PotreeConverter.jar
Converting the first 40m points of skatepark.xyz from xyz to the octree took me about 5m30s.

java -jar PotreeConverter.jar "path to xyz" "path to output directory" [max points]

java -jar PotreeConverter.jar "C:/dev/pointclouds/skatepark/skate/skate.xyz" "D:/temp/potree/skatepark/converterTest/02" 40000000

The xyz files are human readable with x,y,z and r,g,b data. For example:

123.000 534.123 534.143 255 255 123
54.000 67.123 12.143 10 20 30
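Parsing one such line is straightforward; a sketch (not the converter's actual code, which is Java):

```javascript
// Parse one "x y z r g b" line: floats for position, 0-255 ints for color.
function parseXyzLine(line) {
  var t = line.trim().split(/\s+/);
  return {
    x: parseFloat(t[0]), y: parseFloat(t[1]), z: parseFloat(t[2]),
    r: parseInt(t[3], 10), g: parseInt(t[4], 10), b: parseInt(t[5], 10)
  };
}
```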

You also have to add an entry for each cloud to potree/pointclouds.js
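The exact shape of such an entry depends on the potree version; the following is purely illustrative, and every property name in it is an assumption (check an existing entry in pointclouds.js for the real structure):

```javascript
// pointclouds.js -- ILLUSTRATIVE ONLY: the property names here are
// assumptions, not the actual potree entry format.
var pointclouds = pointclouds || {};
pointclouds["skatepark"] = {
  path: "resources/pointclouds/skatepark" // hypothetical converter output location
};
```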

I’ve experienced occasional problems with the converter, especially with very large pointclouds (>200m points). I’ll look into them over the next few weeks.

This demo shows the result of the new converter:
(You can switch to fullscreen mode by double-clicking anywhere.)


The quality does not quite match that of the old converter yet.
To compensate, I’ve increased the LOD modifier, which means more points will be displayed.
Also, nodes with invisible/unloaded children will be rendered with increased point size.
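The idea behind that point-size hack can be sketched like this (the factor 2 and the node fields are assumptions, not the actual implementation):

```javascript
// Nodes whose children are invisible or not yet loaded have to cover their
// children's area with fewer points, so render them with larger points
// to fill the holes until the children arrive.
function pointSizeFor(node, baseSize) {
  var allChildrenVisible = node.children.every(function (c) {
    return c.visible && c.loaded;
  });
  return allChildrenVisible ? baseSize : baseSize * 2; // factor 2 is assumed
}
```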


  • Increasing the point size for nodes with invisible children seems to improve quality quite well. I’d like to replace this quick hack with a more thought-out solution.
  • Add an interpolation filter. Potree had something called “Gaussian Splats” a while back and I want to reimplement them. This filter gets rid of artifacts caused by overlapping points that are close to each other. The results are amazing.
  • Converter improvements like support for point formats other than position&color.
potree.js

potree.js (formerly WebScanopy) is an open source, WebGL-based point cloud renderer targeted at huge datasets of up to a few gigabytes.
This is possible because potree.js only loads as much data as needed for a given point of view.

To the demo

You’re free to use the source code under the GPL3 license.
As of now, there is virtually no documentation and no instructions, but I hope to add both by the end of September.
If you’ve got questions, suggestions, or point clouds you’d like to share, contact me at mschuetz@potree.org.