
Model complexity metrics

Original Post by: ccbrianf Thu Sep 23 19:03:59 2010


Continuing from where this post left off:


Another example is the load information (NIS, NIX, NNL, NTC, NTX, NVT). As a class-level attribute they would take 39 bytes per model, but in the extended attribute file they would take 204 bytes per model. Unless most of your models share identical load parameters, this forces the RTP to load much more data. We still intend to keep using them, because they are the only load management information available about a model (short of loading the model itself), but we would prefer to keep them as a class-level attribute.
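
To make the footprint difference concrete, here is a rough back-of-the-envelope sketch (the 100,000-model count is a made-up illustration; the per-model byte sizes are the figures quoted above):

    # Rough storage-footprint comparison for the load attributes
    # (NIS, NIX, NNL, NTC, NTX, NVT), using the byte counts quoted above.
    CLASS_LEVEL_BYTES_PER_MODEL = 39     # per-model cost when kept as class-level attributes
    EXTENDED_ATTR_BYTES_PER_MODEL = 204  # per-model cost in the extended attribute file

    def attribute_footprint_mb(num_models, bytes_per_model):
        # Total megabytes the RTP must read just to obtain the load attributes.
        return num_models * bytes_per_model / 1e6

    num_models = 100_000  # hypothetical model count for a large database
    print(attribute_footprint_mb(num_models, CLASS_LEVEL_BYTES_PER_MODEL))    # ~3.9 MB
    print(attribute_footprint_mb(num_models, EXTENDED_ATTR_BYTES_PER_MODEL))  # ~20.4 MB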


Since the attribution above has been deprecated in CDB 3.1, what analogous attribution for conveying model complexity for RTP capacity/load management will be available in CDB 3.2?


Original Post by: ccbrianf Tue Oct 30 16:13:06 2012


These are my initial thoughts. I will follow up more after further reflection.


We are in the process of reviewing the handling of 3D model data for CDB. The currently envisaged approach is to bundle the 3D Models (OpenFlight and textures) into tile-LODs whose maximum size (in bytes) will be prescribed by the spec.


I hope this bundling only includes truly geo-specific models. Otherwise, you are once again inflating the size of the CDB and making client-device memory management more difficult, as was the case before geo-typical models were allowed to be referenced by the geo-specific dataset (in CDB 3.1). Furthermore, I hope that bytes has become only a secondary criterion for LOD, with significant size / real-world resolution still primary.


Also, bytes, especially if textures and geometry are combined (which I hope is not the plan), are not necessarily a good proxy for run-time client-device load, which in the visual case is dominated mostly by vertex count. Bytes are, however, a good way to ensure the unit size is pageable and bounded.
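
As a hypothetical illustration of why bytes alone can mislead, consider a simple cost estimate weighted toward geometry (the weights below are invented for the example and would need calibration on a real client device):

    # Hypothetical per-model load estimate for a visual client, where geometry
    # cost is driven mainly by vertex count rather than file size on disk.
    def estimated_render_cost(num_vertices, num_texels,
                              vertex_weight=1.0, texel_weight=0.001):
        # Illustrative weights only; a real client would calibrate them against
        # its own geometry and texture throughput.
        return vertex_weight * num_vertices + texel_weight * num_texels

    # Two models of similar byte size can differ enormously in render cost:
    textured_billboard = estimated_render_cost(num_vertices=4, num_texels=1024 * 1024)
    dense_untextured_mesh = estimated_render_cost(num_vertices=200_000, num_texels=0)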


I'm also still concerned about the choice of a chunked LOD structure for 3D models in the CDB, as this makes smooth LOD transitions on some client devices more difficult. It is generally much easier to notice an entire chunk (region) of features in transition (being added, morphed, or removed) than it is to notice those same feature transitions in isolation (for instance, a significant-size-based "ragged bow wake" of transitions rather than a tiled block transition). CDB's choice of tiled LOD partitioning, in the absence of other supporting metadata, makes an implementation of individual point-feature transitions complex, if possible at all.


I would suggest retaining enough metadata in the CDB to enable practical run-time reconstruction of the individual point-feature LODs. This also seems desirable for using CDB as a source database. To do that, in addition to knowing the exact significant size of a particular point-feature LOD (as opposed to just its allowable range), we also need to know at least whether a coarser LOD was present (i.e., is this a new or a refined feature?). Better yet is knowing the exact significant size of the coarser LOD.
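
A minimal sketch of the kind of per-feature decision such metadata would enable on a client device; the field names here (significant_size, coarser_lod_present, coarser_significant_size) are hypothetical, not actual CDB attributes:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class PointFeatureLod:
        # Hypothetical per-feature metadata retained in the CDB.
        significant_size: float    # exact significant size of this LOD, in meters
        coarser_lod_present: bool  # did a coarser LOD of this feature exist?
        coarser_significant_size: Optional[float] = None  # exact size of the coarser LOD, if known

    def transition_for(feature: PointFeatureLod) -> str:
        # Decide how an individual point feature should transition when this
        # tile-LOD is paged in, instead of popping the whole chunk at once.
        if not feature.coarser_lod_present:
            return "new feature: fade or grow it in individually"
        start = feature.coarser_significant_size or feature.significant_size
        return "refined feature: morph from the coarser LOD (from %.1f m)" % start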


This will ensure that client devices can deterministically access 3D Models; hence the need for the mentioned attributes (NIS, NIX, NNL, NTC, NTX, NVT) goes away, since the load is relatively fixed.


We certainly found NVT extremely valuable for algorithmically adapting published content to device capacity, and NIX and NIS to a lesser extent. Replacing these exact statistical measures with an arbitrary bound in "bytes" would, in our opinion, be a significant step backward. As above, retaining this metadata for those client devices that wish to use it would be highly recommended.
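
To make the capacity-adaptation point concrete, here is a minimal sketch of the sort of admission check we have in mind; it treats NVT as a per-model vertex count (as we use it above), and the budget figure is invented for the example:

    # Minimal sketch: capacity-driven content adaptation using a per-model
    # vertex count (NVT). The budget is illustrative, not from any spec.
    FRAME_VERTEX_BUDGET = 2_000_000  # hypothetical vertex budget for the device

    def select_models(candidates):
        # candidates: list of (priority, nvt, model_id); higher priority first.
        # Returns the model ids whose NVT fits the remaining vertex budget.
        remaining = FRAME_VERTEX_BUDGET
        selected = []
        for priority, nvt, model_id in sorted(candidates, key=lambda c: c[0], reverse=True):
            if nvt <= remaining:
                selected.append(model_id)
                remaining -= nvt
        return selected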


We are currently assessing whether this will be included in v3.2 or a follow-on version, v3.3. The write-up for this addendum is ready, however.


If the write-up is even moderately well formed, then I strongly encourage you to post it to allow reasonable time for industry comments and suggestions.

Original Post by: RyanFranz Tue Oct 30 19:03:12 2012


I would also like a criterion that is better than bytes for creating a tile-LOD. OpenFlight isn't the most space-optimized format, and adding additional groups/objects can increase file size, not to mention whether duplicate vertices are removed from the vertex palette. Thus, it will be very hard to determine the IG load of a 200 KB OpenFlight file (or what sounds like a 1 MB zip file that includes textures too). This sounds like a step backwards in the spec.


Maybe I am reading too much into the recent posts here, but are the geo-specific shapefiles and models disappearing? If so, it seems like it would be that much harder to scatter trees or generic buildings. If not, losing the load attributes (NIS, NIX, NNL, NTC, NTX, NVT) would take away a publisher's ability to tell whether it can properly load and display that shapefile LOD. Can I get some clarification on this before an actual spec appears?
