Wicked huge voxel files #35
LAS files are typically delivered by vendors in a tiled format, usually 500 m x 500 m or 1 km x 1 km. When we wrote lidar2dems, we had issues classifying returns along tile boundaries, so we reduced the number of boundaries by joining tiles into larger regions before classification (and all the steps that followed). The resulting LAS files were much larger and took on the shape of the region shapefiles that represented the different cover types (I think we called them "site shapefiles"). Those site-level LAS files were then carried through the DEM processing.
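For concreteness, here is a minimal sketch of that tile-joining step using PDAL's Python bindings. This is not the actual lidar2dems code; the tile names and the site-polygon WKT are placeholders.

import json
import pdal  # PDAL's Python bindings

# Hypothetical vendor tiles covering one site; real names will differ.
tiles = ["tile_0000_0000.las", "tile_0000_1000.las", "tile_1000_0000.las"]

pipeline_def = {"pipeline": tiles + [
    {"type": "filters.merge"},           # fuse the tiles into one point view
    {                                    # clip to the site footprint; the WKT
        "type": "filters.crop",          # below is a placeholder polygon
        "polygon": "POLYGON((0 0, 2000 0, 2000 2000, 0 2000, 0 0))",
    },
    {"type": "writers.las", "filename": "site_merged.las"},
]}

pdal.Pipeline(json.dumps(pipeline_def)).execute()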
If we changed the order somewhat (I'm not sure what it would do to processing times), we could classify each of the sites, combine all of the sites into a single LAS file, and then tile the large classified file, which should yield more manageable tiled LAS files. That might introduce new issues with missing data along tile boundaries. At the very least, it should be possible to recombine and retile the dataset at some point for easier management and data sharing.
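A hedged sketch of that reordered flow, again with PDAL: the choice of filters.smrf as the classifier and the 1 km tile size are assumptions for illustration, not lidar2dems defaults.

import json
import pdal

# 1) Classify ground returns for one site (repeat per site). filters.smrf is
#    one available PDAL classifier; the original workflow may have used another.
classify = {"pipeline": [
    "site_merged.las",
    {"type": "filters.smrf"},            # sets Classification=2 on ground hits
    {"type": "writers.las", "filename": "site_classified.las"},
]}
pdal.Pipeline(json.dumps(classify)).execute()

# 2) Merge every classified site, then cut the result into 1 km square tiles.
#    PDAL replaces '#' in the writer filename with a counter, one per tile.
retile = {"pipeline": [
    "site_classified.las",               # list one entry per classified site
    {"type": "filters.merge"},
    {"type": "filters.splitter", "length": 1000},
    {"type": "writers.las", "filename": "tiles/classified_#.las"},
]}
pdal.Pipeline(json.dumps(retile)).execute()

Because tiling happens after classification here, the classifier only ever sees whole sites, which is what avoids the boundary-classification problem in the first place.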
A schematic would show the processing steps a bit more clearly...
On Thu, Jul 13, 2017 at 12:39 PM, Bobby Braswell (Rob) wrote:
This isn't a problem on our compute cluster, but large files of 12-50 GB cause a severe bog-down when trying to do demonstration runs on a smaller system (a 16 GB laptop).
For example, the first step of the automated logging scripts is to read in the voxel file and perform a summary calculation across vertical levels, and this read will not complete on a maxed-out (8 GB) Linux VM on my MacBook.
From @F-Sullivan <https://github.com/f-sullivan> I think I've heard there is a natural way to divide or re-divide these files spatially via tiles, but I don't understand the implications yet.
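As an aside on the memory problem above: if the voxel file can be treated as a flat float32 grid (an assumption; the actual on-disk format may differ), a memory-mapped, slab-wise read keeps the vertical summary well under the 8 GB ceiling.

import numpy as np

# Hypothetical grid dimensions; 100 x 8000 x 8000 float32 is ~26 GB on disk,
# within the 12-50 GB range mentioned above. "voxels.bin" is a placeholder.
NZ, NY, NX = 100, 8000, 8000
vox = np.memmap("voxels.bin", dtype=np.float32, mode="r", shape=(NZ, NY, NX))

col_sum = np.zeros((NY, NX), dtype=np.float64)
rows = 256                               # slab height; ~0.8 GB resident at once
for y0 in range(0, NY, rows):
    slab = vox[:, y0:y0 + rows, :]       # memmap pages in only this slab
    col_sum[y0:y0 + rows, :] = slab.sum(axis=0)   # sum across vertical levels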