Global elevation data is available through NASA’s Shuttle Radar Topography Mission (SRTM) program. For those interested, this blog post compares the resolution of elevation slope maps derived from the SRTM project vs. LiDAR data, and provides a brief background on the two methods. LiDAR, in short, yields much higher-resolution Digital Elevation Models (DEMs). Here I provide an extensive example of generating contour lines from LiDAR point clouds with classified ground points.

  1. I have selected 4 adjacent LiDAR tiles of an area within Olympic National Park, Washington, downloaded from the Puget Sound Lidar Consortium:
  2. In order to download data from PSLC, you’ll need to register (free). Once you’ve downloaded your tiles of interest, the first step is filtering out only the points classified as ground. In forested settings, generating a DEM from an unfiltered .las tile would produce a Digital Surface Model (DSM), in contrast to a Digital Terrain Model (DTM). The filtering can be done using PDAL’s filters.range module, because ground points in a classified point cloud are given the classification value of 2. Note that not all point cloud .las files have classified points; methodologies for classifying unclassified points are beyond the scope of this article. Save a new text file with the following contents (substituting your own input and output filenames) and name it filter_ground.pipeline.json:
      {
          "pipeline": [
              {
                  "type": "readers.las",
                  "compression": "laszip",
                  "filename": "input_tile.laz"
              },
              {
                  "type": "filters.range",
                  "limits": "Classification[2:2]"
              },
              {
                  "type": "writers.text",
                  "keep_unspecified": "false",
                  "quote_header": "false",
                  "delimiter": ",",
                  "filename": "ground_points.txt"
              }
          ]
      }
  3. Run this file by running pdal pipeline filter_ground.pipeline.json. If you have multiple tiles, as I do in this example, you’ll need to run this filter for each file. You can simply modify the input and output file arguments and save over the same file.
  4. Import the output .txt point cloud files into SAGA GIS by following steps 4-8 from this post.
  5. From the Geoprocessing menu, select Shapes -> Point Clouds -> Tools -> Merge Point Clouds to merge the point clouds you have imported into SAGA.
    Merge Point Clouds
  6. From the Geoprocessing menu, select Grid -> Gridding -> Interpolation from Points -> Triangulation. Fill out the dialog with the information and point cloud you have created from the merge.
    • Set Attribute to Z.
    • Set Fit to cells.


  7. Click Okay. Running the gridding algorithm will likely take some time, potentially several hours, because the implementation does not leverage multiple processor cores. Once completed, you should see your generated grid, similar to the screenshot below.
  8. Now, from the Geoprocessing menu, select Shapes -> Grid -> Vectorization -> Contour Lines from Grid. Set Equidistance to the interval at which you would like contour lines rendered.
  9. After your contour lines are created, you can optionally label them via SAGA’s object properties dialog, and/or export the contour lines for use in outside programs.
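As an aside, the per-tile pipeline editing described in step 3 can be scripted rather than done by hand. Below is a minimal Python sketch of that loop; the tile filenames are placeholders of my own, and pdal is only invoked if it is actually found on the PATH:

```python
import json
import shutil
import subprocess
from pathlib import Path

# Placeholder tile names; substitute your downloaded PSLC tiles.
tiles = ["tile_a.las", "tile_b.las"]

for tile in tiles:
    out_txt = Path(tile).with_suffix(".ground.txt")
    # Same structure as filter_ground.pipeline.json, one file per tile.
    pipeline = {
        "pipeline": [
            {"type": "readers.las", "filename": tile},
            {"type": "filters.range", "limits": "Classification[2:2]"},
            {"type": "writers.text", "delimiter": ",", "filename": str(out_txt)},
        ]
    }
    json_path = Path(tile).with_suffix(".pipeline.json")
    json_path.write_text(json.dumps(pipeline, indent=4))
    if shutil.which("pdal"):  # only run the filter if PDAL is installed
        subprocess.run(["pdal", "pipeline", str(json_path)], check=True)
```

Each iteration writes a small pipeline JSON next to the tile, so the generated files can also be inspected or re-run individually later.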

Recent versions of SAGA GIS no longer support importing files in the .las format. However, a .las file can be converted using PDAL’s writers.text module into a text file format, which in turn can be imported into SAGA. For this example, we will begin with a .las file projected in a state plane CRS that is the output of my post Reprojecting LIDAR .las files to Latitude/Longitude to State Plane Example.

  1. Open a new text file named las2xyz.json in the same directory as your .las file and paste the following into it:
      {
          "pipeline": [
              {
                  "type": "readers.las",
                  "filename": "20131013_usgs_olympic_wa_10TDT310990.las"
              },
              {
                  "type": "writers.text",
                  "order": "X,Y,Z",
                  "keep_unspecified": "false",
                  "quote_header": "false",
                  "delimiter": ",",
                  "filename": "20131013_usgs_olympic_wa_10TDT310990.txt"
              }
          ]
      }
  2. Now run the pipeline. This may take a while. Once it completes, you should now see the converted .txt file in the working directory.
    pdal pipeline las2xyz.json
    ls -hs -1 
    total 1.1G
    227M 20131013_usgs_olympic_wa_10TDT310990.las
    148M 20131013_usgs_olympic_wa_10TDT310990.laz
    745M 20131013_usgs_olympic_wa_10TDT310990.txt
    4.0K las2xyz.json
    4.0K lat_lng_WA_S_FIPS_ft_reprojection.pipeline.json
  3. Let’s perform a quick sanity check on this new file:
    less 20131013_usgs_olympic_wa_10TDT310990.txt
  4. Open SAGA GIS. From the Geoprocessing menu, select File -> Tables -> Import -> Import Text Table with Numbers only
    Import Text Table from SAGA GIS Geoprocessing menu
  5. Select the .txt point cloud file that was created by PDAL, with the appropriate options as shown in the screenshot. Click Okay. Since the example text file used is nearly 1.0 GB in size, importing may take some time.
    SAGA GIS - Import text table with numbers only
  6. Once the text file has been imported as a table into SAGA, select Geoprocessing -> Shapes -> Point Clouds -> Conversion -> Point Cloud from Table.
    SAGA GIS - Point Cloud from Table Menu
  7. Select the table that was created in the previous step, and then set the values for X/Y/Z to these columns from the table.
    SAGA GIS - Point Cloud From Table
  8. Afterwards, the generated point cloud will appear in your Data sources.
    SAGA GIS - Point Cloud in Data sources
  9. From the Geoprocessing menu, select Visualization -> 3D Viewer -> Point Cloud Viewer. Select Z as the colored attribute.
    SAGA GIS - Geoprocessing 3d visualization
    SAGA GIS - Point Cloud Viewer Dialog
  10. Click Okay. The point cloud will be displayed in SAGA’s Point Cloud Viewer tool:
    SAGA - Point Cloud in Point Cloud Viewer
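As an aside, the quick sanity check in step 3 can also be automated. The short Python sketch below verifies that rows parse as exactly three comma-separated numbers; the sample rows are made up for illustration, and the X,Y,Z header line that writers.text emits will (correctly) fail the check and should be skipped:

```python
def looks_like_xyz(line):
    """True if a line parses as exactly three comma-separated floats."""
    parts = line.strip().split(",")
    if len(parts) != 3:
        return False
    try:
        [float(p) for p in parts]
        return True
    except ValueError:
        return False

# Made-up rows standing in for the real PDAL output; the header fails.
assert not looks_like_xyz("X,Y,Z")
assert looks_like_xyz("800114.62,932554.58,814.27")
assert looks_like_xyz("803498.24,935938.24,2715.87")
```

Reading the file line by line with this predicate catches truncated rows or a wrong delimiter before spending time on the ~1 GB SAGA import.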

  1. Download one or more lidar tiles:
  2. Note the source CRS (coordinate reference system, i.e. projection). This 2013 USGS Lidar: Olympic Peninsula (WA) Point Cloud dataset from NOAA uses EPSG 4269, meaning the point cloud coordinates are latitude/longitude, which is not well supported by many downstream GIS tools. Running lasinfo confirms that the x and y coordinates are in lat/lng:
    lasinfo (170203) report for 20131013_usgs_olympic_wa_10TDT310990.laz
    reporting all LAS header entries:
      file signature:             'LASF'
      file source ID:             0
      global_encoding:            1
      project ID GUID data 1-4:   00000000-0000-0000-6C4F-6D7900636970
      version major.minor:        1.2
      system identifier:          'LAStools (c) by rapidlasso GmbH'
      generating software:        'lasduplicate (160119) commercia'
      file creation day/year:     231/2014
      header size:                227
      offset to point data:       331
      number var. length records: 1
      point data format:          1
      point data record length:   28
      number of point records:    25204541
      number of points by return: 16034492 6849061 2147980 173008 0
      scale factor x y z:         0.0000001 0.0000001 0.001
      offset x y z:               0 0 0
      min x y z:                  -123.9223149 47.8406412 248.190
      max x y z:                  -123.9087939 47.8497435 827.800

    Thus, re-projecting the .las file is necessary before any further processing can be done. Common CRS choices include UTM-based projections as well as state plane systems. For this example, I’ve chosen to reproject from lat/lng to NAD 1983 StatePlane Washington South FIPS 4602 Feet, as that is what is used by data available through the Puget Sound Lidar Consortium.

  3. Re-projection can be done with PDAL’s filters.reprojection. Create a new JSON pipeline file named lat_lng_WA_S_FIPS_ft_reprojection.pipeline.json with the following contents:

      {
          "pipeline": [
              {
                  "type": "readers.las",
                  "compression": "laszip",
                  "filename": "20131013_usgs_olympic_wa_10TDT310990.laz"
              },
              {
                  "type": "filters.reprojection",
                  "in_srs": "EPSG:4326",
                  "out_srs": "EPSG:102749"
              },
              {
                  "type": "writers.las",
                  "compression": "laszip",
                  "scale_x": "0.01",
                  "scale_y": "0.01",
                  "scale_z": "0.01",
                  "offset_x": "auto",
                  "offset_y": "auto",
                  "offset_z": "auto",
                  "filename": "20131013_usgs_olympic_wa_10TDT310990.las"
              }
          ]
      }

    Now save the file and run the pipeline. This will probably take a while, as it involves decompressing the input .laz file, reprojecting every point in the point cloud, and writing the output file to disk. Examining the output, reprojected .las file with lasinfo 20131013_usgs_olympic_wa_10TDT310990.las will show the min/max values in the new projection, as well as the z (elevation) values converted into feet:

    lasinfo 20131013_usgs_olympic_wa_10TDT310990.las 
    lasinfo (170203) report for 20131013_usgs_olympic_wa_10TDT310990.las
    reporting all LAS header entries:
      file signature:             'LASF'
      file source ID:             0
      global_encoding:            0
      project ID GUID data 1-4:   00000000-0000-0000-0000-000000000000
      version major.minor:        1.2
      system identifier:          'PDAL'
      generating software:        'PDAL 1.5.0 (Releas)'
      file creation day/year:     211/2017
      header size:                227
      offset to point data:       690
      number var. length records: 3
      point data format:          3
      point data record length:   34
      number of point records:    25204541
      number of points by return: 16034492 6849061 2147980 173008 0
      scale factor x y z:         0.01 0.01 0.01
      offset x y z:               800114.623206489603035 932554.57963061647024 814.270025000000487
      min x y z:                  800114.62 932554.58 814.27
      max x y z:                  803498.24 935938.24 2715.87
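As a quick sanity check on the output above, the new Z range is just the original metric range converted to feet. A short Python computation (assuming the projection uses the US survey foot, as NAD83 state plane "feet" definitions do) reproduces lasinfo’s numbers:

```python
METERS_PER_US_SURVEY_FOOT = 1200 / 3937  # exact definition of the US survey foot

def m_to_ft(meters):
    """Convert metres to US survey feet."""
    return meters / METERS_PER_US_SURVEY_FOOT

# Original lat/lng file reported min z 248.190 m and max z 827.800 m.
print(round(m_to_ft(248.190), 2))  # 814.27, matching the reprojected min z
print(round(m_to_ft(827.800), 2))  # 2715.87, matching the reprojected max z
```

Agreement to two decimal places is a cheap way to confirm the vertical unit conversion happened as expected.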

In the past, I’ve written about different remote desktop products/solutions, including NoMachine for remoting into traditional desktop machines, but also a few alternatives for accessing and controlling Android devices from a computer. Today I’d like to mention TeamViewer, a product free for non-commercial use that supports computer to computer, Android to computer, and computer to Android connections.

The following screenshots were taken on a macOS Sierra machine remotely logged onto a rooted Samsung Galaxy S4 Android tablet.

  1. Download the SDelete program provided by Microsoft onto your Windows Guest machine.
  2. Clean up unnecessary files and programs. One useful utility is WinDirStat.
  3. Run the following command in the terminal to write zeroes over unused disk storage in your Windows guest:
    sdelete64 -z c:
  4. Shut down your VM and compact your Windows Guest VDI by running the following command (note the --compact flag):
    VBoxManage modifyhd /misc/virtualbox-vms/Windows10/Windows10LargeDisk.vdi --compact
  5. Create a new VDI with the new desired maximum size:
    VBoxManage createhd --filename /misc/virtualbox-vms/Windows10/Windows10Disk.vdi --size 92160
  6. Clone the existing VDI into the newly allocated one:
    VBoxManage clonehd /misc/virtualbox-vms/Windows10/Windows10LargeDisk.vdi /misc/virtualbox-vms/Windows10/Windows10Disk.vdi --existing

Tree heights can be approximated with a fairly high degree of precision using a laser rangefinder and a technique known as the laser sine method, presuming that one obtains accurate measurements.

Nikon Forestry Pro

While some rangefinders made specifically for forestry purposes may have a built in height calculator, I personally own a more inexpensive model that does not have this built in.
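For reference, the sine-method arithmetic itself is simple: each rangefinder shot gives a straight-line distance and an inclination angle, and distance × sin(angle) is the height of that point relative to eye level (angles below horizontal being negative). A minimal Python sketch, with hypothetical argument names of my own choosing:

```python
import math

def tree_height(dist_top, angle_top_deg, dist_base, angle_base_deg):
    """Laser sine method: height of the top above eye level minus the
    (negative) height of the base below eye level."""
    top = dist_top * math.sin(math.radians(angle_top_deg))
    base = dist_base * math.sin(math.radians(angle_base_deg))
    return top - base

# e.g. top shot: 60 m at +40 degrees; base shot: 25 m at -10 degrees
print(round(tree_height(60, 40, 25, -10), 1))  # 42.9
```

Because each shot uses its own measured slope distance, the method does not require standing a known horizontal distance from the trunk, unlike the simpler tangent method.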

I am building a very simple Android app that performs these calculations for the few other tree enthusiasts out there who would like to know how tall a tree is at the time of measurement. The apk can be downloaded here, and the source code for this project is open source on GitHub.

Happy tree hunting!

A bit of background for the curious

Redwoods are the world’s current tallest trees, though it is a subject of debate whether this was historically the case, as there are many records from the logging days claiming Douglas-fir heights of up to 465 feet (see also Redwoods: Only The Tallest Because the Rest Have Been Logged).

The current tallest non-redwood tree in North America is a Douglas-fir located in southwestern Oregon. Nearly 330 feet tall, the Doerner Fir is still over 50 feet shorter than the tallest redwood, though perhaps more interesting to note is that there are at least ~4 known redwoods over 370 ft tall. As you go down into the 360-370 ft range and below, the number of known redwoods in each height range increases exponentially.

There are likely many “undiscovered” Douglas-firs exceeding 300 ft.

The percentage of remaining old-growth forest in Oregon and Washington state (and British Columbia, for that matter) is roughly comparable to the percentage of remaining old-growth redwood. However, the distribution of old growth over this non-redwood region, in particular the segments that might contain exceptionally tall trees, is much more scattered than the areas known to contain the tallest redwoods; Douglas-fir have a gigantic range in comparison. Combined with the fact that relatively little effort has been expended on finding these trees, since they would not even be potential candidates for “the” world’s tallest, this leads me to conclude that the number of undiscovered (i.e. not recognized with an official height measurement) tall trees is probably very high.

I have built this app to aid me on a trip next weekend to an area known to contain some of the tallest, largest, and best-preserved groves of trees in the region. I will report back my findings.

Over the past two years I have searched for tall trees in Mt. Rainier National Park (including the Grove of the Patriarchs) in WA and Cathedral Grove in BC. This led me to measure a tree slightly over 200 ft tall, which I named after the creek along which it grew. While those areas contain some of the greatest old growth in the region, they did not contain any exceptionally tall trees relative to the tallest known still standing today. Thus, I search elsewhere.

A useful native feature of Chrome is its ‘Task Manager’, which allows one to monitor all Chrome processes and their resource consumption:
Google Chrome Task Manager

Firefox has been slow to adopt parallel functionality, though this is mostly because multi-process mode in Firefox didn’t become available until recently (at least in non-Nightly releases). One still needs to install this Task Manager Firefox add-on, though I would not be surprised if this became built into Firefox sometime in the near future.

Firefox Task Manager

  1. Enroll in Apple’s Beta Software Program.
  2. Download a fresh copy of VMware Workstation Player for Windows or Linux from the official VMware site. The free trial of this product has no expiration if used for non-commercial purposes.
  3. You’ll need to unlock your installation of VMware to use the Mac operating system as a guest by following these instructions (external link).
  4. On a computer running an official/genuine instance of OS X, download macOS Sierra from the App Store:
    App Store - macOS Sierra
  5. Once the download completes, open the Terminal application, and either save a new file with the following contents,
    # Mount the installer image
    hdiutil attach /Applications/Install\ macOS\ -noverify -nobrowse -mountpoint /Volumes/install_app
    # Create the macOS-Sierra Blank ISO Image of 7316mb with a Single Partition - Apple Partition Map
    hdiutil create -o /tmp/macOS-Sierra.cdr -size 7316m -layout SPUD -fs HFS+J
    # Mount the macOS-Sierra Blank ISO Image
    hdiutil attach /tmp/macOS-Sierra.cdr.dmg -noverify -nobrowse -mountpoint /Volumes/install_build
    # Restore the Base System into the macOS-Sierra Blank ISO Image
    asr restore -source /Volumes/install_app/BaseSystem.dmg -target /Volumes/install_build -noprompt -noverify -erase
    # Remove Package link and replace with actual files
    rm /Volumes/OS\ X\ Base\ System/System/Installation/Packages
    cp -rp /Volumes/install_app/Packages /Volumes/OS\ X\ Base\ System/System/Installation/
    # Copy macOS Sierra installer dependencies
    cp -rp /Volumes/install_app/BaseSystem.chunklist /Volumes/OS\ X\ Base\ System/BaseSystem.chunklist
    cp -rp /Volumes/install_app/BaseSystem.dmg /Volumes/OS\ X\ Base\ System/BaseSystem.dmg
    # Unmount the installer image
    hdiutil detach /Volumes/install_app
    # Unmount the macOS-Sierra ISO Image
    hdiutil detach /Volumes/OS\ X\ Base\ System/
    # Convert the macOS-Sierra ISO Image to ISO/CD master (Optional)
    hdiutil convert /tmp/macOS-Sierra.cdr.dmg -format UDTO -o /tmp/macOS-Sierra.iso
    # Rename the macOS-Sierra ISO Image and move it to the desktop
    mv /tmp/macOS-Sierra.iso.cdr ~/Desktop/macOS-Sierra.iso

    or run

    curl > /tmp/

    and then execute it by running sudo chmod +x /tmp/ && /tmp/. Once completed, you should see a file named macOS-Sierra.iso on your desktop.

  6. Now boot up VMware Player and create a new virtual machine using the File dialog. Using the ISO disk image we just created, and with macOS 10.12 selected as the guest OS, finish the setup for your new image.
    Guest Operating System - Select macOS 10 12
  7. You’ll need to erase the virtual hard disk medium as seen in the following screenshots.
    DiskUtility - VMware Virtual SATA Hard Drive
    DiskUtility - Erase Hard Drive confirmation
  8. Now you will be able to proceed with the installation. Once completed, you should be all ready to enjoy your new macOS Sierra vm!
    Installation macOS Sierra
    macOS Sierra Screenshot

This basically involves two steps:

  1. Client-side support, aka adding the required decoding libraries on the local computer. See official documentation: Enabling the H.264 codec on the NoMachine client host. Here is an example of following the steps on my Macbook Air:
    brew update
    brew install ffmpeg && brew upgrade ffmpeg
    cd /usr/local/Cellar/ffmpeg/3.0.2/lib
    sudo cp libavcodec.dylib /Applications/NoMachine/Contents/Frameworks/lib
    sudo cp libavutil.dylib /Applications/NoMachine/Contents/Frameworks/lib

Setting a custom resolution to that of the guest monitor

Say you have two monitors physically connected to the server, supporting by default maximum resolutions of 1440x900 and 1600x900. If you are logging in to the remote server from a machine that has a larger display, it may be difficult to add the new resolution. I have attempted to follow dozens of similar instructions I found online, with little luck. The only thing that worked for me seems like a hack, but it works.

➜  xrandr  | grep -i primary 
DVI-D-0 connected primary 1440x900+0+0 (normal left inverted right x axis y axis) 410mm x 257mm

We’ll be using the scale option of xrandr to change our resolution in this case, i.e. (x resolution) × (x scale factor) = (desired x resolution), and (y resolution) × (y scale factor) = (desired y resolution). Use at least a few decimal places when the scale factor is a non-terminating decimal:

xrandr --output DVI-D-0 --scale 1.3333333x1.33333333

Afterwards, you should see this represented if you run the initial xrandr command above.

➜ xrandr  | grep -i primary 
DVI-D-0 connected primary 1920x1200+0+0 (normal left inverted right x axis y axis) 410mm x 257mm
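The scale factors above come straight from the formula: desired resolution divided by current resolution, per axis. A tiny Python helper (the function name is my own) computes them:

```python
def xrandr_scale(current, desired, places=7):
    """Per-axis xrandr scale factor: desired / current, rounded."""
    return round(desired / current, places)

# 1440x900 panel driven at a virtual 1920x1200:
print(xrandr_scale(1440, 1920))  # 1.3333333
print(xrandr_scale(900, 1200))   # 1.3333333
```

Since both axes here scale by 4/3, the aspect ratio is preserved; unequal factors would stretch the desktop.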