2017

Notepad++ is undoubtedly one of the most popular code/text editors available for Windows. While it does not run natively on Linux (or macOS), many have run it successfully under Wine. There is also a clone of Notepad++ called Notepadqq that runs natively on Linux, written using the well-known cross-platform UI library Qt.
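For those who want to try the Wine route, here is a minimal sketch, assuming Wine is already installed; the installer filename and install path below are examples and may differ on your system:

```shell
# Run the downloaded Windows installer under Wine (filename is an example;
# use the installer you actually downloaded from notepad-plus-plus.org).
wine npp.installer.exe

# Afterwards, launch Notepad++ from the default Wine prefix (the exact
# path depends on your Wine version and whether the prefix is 32- or 64-bit).
wine "$HOME/.wine/drive_c/Program Files/Notepad++/notepad++.exe"
```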

  1. Download a fresh copy of VMware Workstation Player for Windows or Linux from the official VMware site. The product is free for non-commercial use, with no expiration.
  2. You’ll need to unlock your installation of VMware to run macOS as a guest operating system by following these instructions (external link).
  3. On a computer running an official/genuine instance of OS X, download macOS High Sierra from the App Store:
    App Store - macOS High Sierra
  4. Once the download completes, open the Terminal application, and either save a new file with the following contents,
    #!/bin/bash
    # Mount the installer image
    hdiutil attach /Applications/Install\ macOS\ High\ Sierra.app/Contents/SharedSupport/InstallESD.dmg -noverify -mountpoint /Volumes/macOS-High-Sierra
    
    # Create the macOS-HighSierra Blank ISO Image of 7316mb with a Single Partition - Apple Partition Map
    hdiutil create -o /tmp/macOS-HighSierra-Base.cdr -size 7316m -layout SPUD -fs HFS+J
    
    # Mount the macOS High Sierra Blank ISO Image
    hdiutil attach /tmp/macOS-HighSierra-Base.cdr.dmg -noverify -mountpoint /Volumes/install_build
    
    # Restore the Base System into the macOS High Sierra Blank ISO Image
    asr restore -source /Applications/Install\ macOS\ High\ Sierra.app/Contents/SharedSupport/BaseSystem.dmg -target /Volumes/install_build -noprompt -noverify -erase
    
    # Remove Package link and replace with actual files
    rm /Volumes/OS\ X\ Base\ System/System/Installation/Packages
    cp -R /Volumes/macOS-High-Sierra/Packages /Volumes/OS\ X\ Base\ System/System/Installation
    
    # Unmount the installer image
    hdiutil detach /Volumes/OS\ X\ Base\ System/
    
    # Unmount the macOS High Sierra ISO Image
    hdiutil detach /Volumes/macOS-High-Sierra/
    
    mv /tmp/macOS-HighSierra-Base.cdr.dmg /tmp/BaseSystem.dmg
    
    # Restore the macOS High Sierra Installer's BaseSystem.dmg into file system and place custom BaseSystem.dmg into the root
    hdiutil create -o /tmp/macOS-HighSierra.cdr -size 8965m -layout SPUD -fs HFS+J
    hdiutil attach /tmp/macOS-HighSierra.cdr.dmg -noverify -mountpoint /Volumes/install_build
    asr restore -source /Applications/Install\ macOS\ High\ Sierra.app/Contents/SharedSupport/BaseSystem.dmg -target /Volumes/install_build -noprompt -noverify -erase
    cp /tmp/BaseSystem.dmg /Volumes/OS\ X\ Base\ System
    hdiutil detach /Volumes/OS\ X\ Base\ System/
    
    # Convert the macOS High Sierra image to ISO/CD master format
    hdiutil convert /tmp/macOS-HighSierra.cdr.dmg -format UDTO -o /tmp/macOS-HighSierra.iso
    
    # Rename the macOS High Sierra ISO image and move it to the desktop
    mv /tmp/macOS-HighSierra.iso.cdr ~/Desktop/macOS-HighSierra.iso
    rm /tmp/macOS-HighSierra.cdr.dmg

    or run

    curl https://www.nickmcummins.com/make-macos-high-sierra-iso.sh > /tmp/make-macos-high-sierra-iso.sh

    and then execute it by running sudo chmod +x /tmp/make-macos-high-sierra-iso.sh && /tmp/make-macos-high-sierra-iso.sh. Once completed, you should see a file named macOS-HighSierra.iso on your desktop.

  5. Now boot up VMware Player and create a new virtual machine using the File dialog. Using the ISO disk image we just created, and with macOS 10.13 selected as the guest OS, finish the setup for your new virtual machine.
    Guest Operating System - Select macOS 10 13
  6. You’ll need to erase the virtual hard disk medium as seen in the following screenshots.
    DiskUtility - VMware Virtual SATA Hard Drive
    DiskUtility - Erase Hard Drive confirmation
  7. Now you will be able to proceed with the installation. Once completed, you should be all ready to enjoy your new macOS High Sierra VM!
    Installation macOS Sierra
    macOS Sierra Screenshot

Screenshot tools for Linux and macOS generally produce image files in the .png format. PNG is a lossless compression format, meaning image data is compressed without any loss of quality, unlike the commonly used (and lossy) JPG format. Three command-line tools that can be used to compress PNG files further are pngout, optipng, and advpng.

➜  pngout terminator-terminal-window.png 
 In:   11405 bytes               terminator-terminal-window.png /c6 /f5
Out:    8790 bytes               terminator-terminal-window.png /c2 /f5
Chg:   -2615 bytes ( 77% of original)
➜  optipng -o7 terminator-terminal-window.png 
** Processing: terminator-terminal-window.png
800x266 pixels, 3x8 bits/pixel, RGB+transparency
Input IDAT size = 8715 bytes
Input file size = 8790 bytes

Trying:
  zc = 9  zm = 9  zs = 0  f = 0		IDAT size = 8140
  zc = 9  zm = 8  zs = 0  f = 0		IDAT size = 8140
                               
Selecting parameters:
  zc = 9  zm = 8  zs = 0  f = 0		IDAT size = 8140

Output IDAT size = 8140 bytes (575 bytes decrease)
Output file size = 8215 bytes (575 bytes = 6.54% decrease)

➜  advpng -z4 terminator-terminal-window.png 
        8215        7232  88% terminator-terminal-window.png
        8215        7232  88%
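To apply all three tools to a batch of screenshots, the commands above can be wrapped in a simple loop. A sketch, assuming all three tools are on your PATH; each tool rewrites a file in place only when it finds a smaller encoding:

```shell
# Run pngout, optipng, and advpng in sequence over every PNG in the
# current directory, using the same flags as above.
for png in *.png; do
    pngout "$png" || true   # pngout may exit non-zero when it cannot shrink a file
    optipng -o7 "$png"
    advpng -z4 "$png"
done
```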

Global elevation data is available through NASA’s Shuttle Radar Topography Mission (SRTM) program. For those interested, this blog post compares the resolution of elevation slope maps derived from the SRTM project vs. LiDAR data, and provides a brief background on the two methods. LiDAR, in short, provides much higher-resolution Digital Elevation Models (DEMs). Here I provide an extensive example of generating contour lines from LiDAR point clouds with classified ground points.

  1. I have selected 4 adjacent LiDAR tiles of an area within Olympic National Park, Washington, downloaded from the Puget Sound LIDAR Consortium:
  2. In order to download data from PSLC, you’ll need to register (free). Once you’ve downloaded your tiles of interest, the first step is filtering out only the points classified as ground. In forested settings, generating a DEM from the unfiltered .las tile would produce a Digital Surface Model (DSM), in contrast to a Digital Terrain Model (DTM). The filtering can be done using PDAL’s filters.range module, since ground points in a classified point cloud are given the classification value 2. Note that not all .las point cloud files have classified points; methodologies for classifying unclassified points are beyond the scope of this article. Save a new text file with the following contents and name it filter_ground.pipeline.json:
    {
      "pipeline":[
        {
          "type":"readers.las",
          "compression": "laszip", 
          "filename":"q47123g8207.laz"
        },
        {
          "type":"filters.range",
          "limits": "Classification[2:2]"
        },
        {
          "type":"writers.text",
          "order":"X,Y,Z",
          "keep_unspecified": "false",
          "quote_header": "false",
          "delimiter": ",",
          "filename":"q47123g8207_Ground.txt"
        }
      ]
    }
    
  3. Run the pipeline with pdal pipeline filter_ground.pipeline.json. If you have multiple tiles, as I do in this example, you’ll need to run this filter for each file. You can simply modify the input and output filename arguments and save over the same file.
  4. Import the resulting .txt point cloud files into SAGA GIS by following steps 4-8 from this post.
  5. From the Geoprocessing menu, select Shapes -> Point Clouds -> Tools -> Merge Point Clouds to merge the point clouds you have imported into SAGA.
    Merge Point Clouds
  6. From the Geoprocessing menu, select Grid -> Gridding -> Interpolation from Points -> Triangulation. Fill out the dialog with the information and point cloud you have created from the merge.
    • Set Attribute to Z.
    • Set Fit to cells.

    Triangulation

  7. Click Okay. Running the gridding algorithm will likely take some time, potentially several hours, because the implementation does not leverage multiple processor cores. Once completed, you should see your generated grid, similar to the screenshot below.
  8. Now, from the Geoprocessing menu, select Shapes -> Grid -> Vectorization -> Contour Lines from Grid. Set Equidistance to the interval to which you would like contour lines rendered.
  9. After your contour lines are created, you can optionally have them labeled via SAGA’s object properties dialog, and/or export the contour lines for use in outside programs.
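As an aside on step 3: rather than editing the JSON for each tile, pdal pipeline can override individual stage options from the command line, so the same pipeline file can be reused across all your tiles. A sketch, where the second tile name is a placeholder for whichever tiles you downloaded:

```shell
# Reuse filter_ground.pipeline.json for several tiles by overriding the
# reader and writer filenames with pdal pipeline's
# --<stage>.<option>=<value> overrides.
# q47123g8207 is the tile named in the pipeline above; the rest are placeholders.
for tile in q47123g8207 q47123g8206; do
    pdal pipeline filter_ground.pipeline.json \
        --readers.las.filename="${tile}.laz" \
        --writers.text.filename="${tile}_Ground.txt"
done
```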

Recent versions of SAGA GIS no longer support importing files in the .las format. However, a .las file can be converted using PDAL’s writers.text module into a text file format, which in turn can be imported into SAGA. For this example, we will begin with a .las file projected in a state plane CRS, the output of my post Reprojecting LIDAR .las files to Latitude/Longitude to State Plane Example.

  1. Open a new text file named las2xyz.json in the same directory as your .las file and paste the following into it:
    {
      "pipeline": [
        {
          "type": "readers.las",
          "filename": "20131013_usgs_olympic_wa_10TDT310990.las"
        },
        {
          "type": "writers.text",
          "order": "X,Y,Z",
          "keep_unspecified": "false",
          "quote_header": "false",
          "delimiter": ",",
          "filename": "20131013_usgs_olympic_wa_10TDT310990.txt"
        }
      ]
    }
  2. Now run the pipeline. This may take a while. Once it completes, you should see the converted .txt file in the working directory.
    pdal pipeline las2xyz.json
    ls -hs -1 
    total 1.1G
    227M 20131013_usgs_olympic_wa_10TDT310990.las
    148M 20131013_usgs_olympic_wa_10TDT310990.laz
    745M 20131013_usgs_olympic_wa_10TDT310990.txt
    4.0K las2xyz.json
    4.0K lat_lng_WA_S_FIPS_ft_reprojection.pipeline.json
    
  3. Let’s perform a quick sanity check on this new file:
    less 20131013_usgs_olympic_wa_10TDT310990.txt
    X,Y,Z
    803169.892,932562.842,2093.730
    803170.060,932566.158,2093.070
    803178.839,932573.593,2054.360
    803170.851,932564.554,2093.700
    803225.508,932562.372,2043.070
    803232.723,932560.928,2040.090
    803232.361,932562.222,2041.790
    803232.443,932564.117,2041.470
    803234.863,932568.358,2030.210
    803235.004,932568.206,2034.440
    803236.294,932567.895,2028.310
    
    
  4. Open SAGA GIS. From the Geoprocessing menu, select File -> Tables -> Import -> Import Text Table with Numbers only.
    Import Text Table from SAGA GIS Geoprocessing menu
  5. Select the .txt point cloud file that was created by PDAL, with the appropriate options as shown in the screenshot. Click Okay. Since the example text file is 745 MB, importing may take some time.
    SAGA GIS - Import text table with numbers only
  6. Once the text file has been imported as a table into SAGA, select Geoprocessing -> Shapes -> Point Clouds -> Conversion -> Point Cloud from Table.
    SAGA GIS - Point Cloud from Table Menu
  7. Select the table that was created in the previous step, and then set the values for X/Y/Z to these columns from the table.
    SAGA GIS - Point Cloud From Table
  8. Afterwards, the generated point cloud will appear in your Data sources.
    SAGA GIS - Point Cloud in Data sources
  9. From the Geoprocessing menu, select Visualization -> 3D Viewer -> Point Cloud Viewer. Select Z as the colored attribute.
    SAGA GIS - Geoprocessing 3d visualization
    SAGA GIS - Point Cloud Viewer Dialog
  10. Click Okay. The point cloud will be displayed in SAGA’s Point Cloud Viewer tool:
    SAGA - Point Cloud in Point Cloud Viewer
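The sanity check in step 3 can also be made quantitative: writers.text emits one header row plus one line per point, and lasinfo reports 25204541 point records for this tile, so the two counts should agree. A sketch:

```shell
# The text writer produces a header row (X,Y,Z) followed by one line per
# point, so the line count should equal the LAS point-record count plus one.
expected=$((25204541 + 1))   # point records reported by lasinfo, plus header
actual=$(wc -l < 20131013_usgs_olympic_wa_10TDT310990.txt)
[ "$actual" -eq "$expected" ] && echo "line count matches point-record count"
```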

  1. Download one or more lidar tiles:
    wget https://coast.noaa.gov/htdata/lidar1_z/geoid12b/data/5008/20131013_usgs_olympic_wa_10TDT310990.laz
  2. Note the source CRS (coordinate reference system, i.e. projection). This 2013 USGS Lidar: Olympic Peninsula (WA) Point Cloud dataset from NOAA uses EPSG 4269. This means the point cloud coordinates are in latitude/longitude, which is not well supported by many downstream GIS tools. Running lasinfo 20131013_usgs_olympic_wa_10TDT310990.laz on the downloaded tile confirms that the x and y coordinates are in lat/lng:
     
    lasinfo (170203) report for 20131013_usgs_olympic_wa_10TDT310990.laz
    reporting all LAS header entries:
      file signature:             'LASF'
      file source ID:             0
      global_encoding:            1
      project ID GUID data 1-4:   00000000-0000-0000-6C4F-6D7900636970
      version major.minor:        1.2
      system identifier:          'LAStools (c) by rapidlasso GmbH'
      generating software:        'lasduplicate (160119) commercia'
      file creation day/year:     231/2014
      header size:                227
      offset to point data:       331
      number var. length records: 1
      point data format:          1
      point data record length:   28
      number of point records:    25204541
      number of points by return: 16034492 6849061 2147980 173008 0
      scale factor x y z:         0.0000001 0.0000001 0.001
      offset x y z:               0 0 0
      min x y z:                  -123.9223149 47.8406412 248.190
      max x y z:                  -123.9087939 47.8497435 827.800
    

    Thus, re-projecting the .las file is necessary before any further processing can be done. Common CRS choices include UTM-based projections as well as state plane systems. For this example, I’ve chosen to reproject from lat/lng to NAD 1983 StatePlane Washington South FIPS 4602 Feet, as that is what is used by data available through the Puget Sound Lidar Consortium.

  3. Re-projection can be done with PDAL’s filters.reprojection. Create a new JSON pipeline file named lat_lng_WA_S_FIPS_ft_reprojection.pipeline.json with the following contents:

    {
      "pipeline": [
        {
          "type": "readers.las",
          "compression": "laszip",
          "filename": "20131013_usgs_olympic_wa_10TDT310990.laz"
        },
        {
          "type": "filters.reprojection",
          "in_srs": "EPSG:4326",
          "out_srs": "EPSG:102749"
        },
        {
          "type": "writers.las",
          "compression": "laszip",
          "scale_x": "0.01",
          "scale_y": "0.01",
          "scale_z": "0.01",
          "offset_x": "auto",
          "offset_y": "auto",
          "offset_z": "auto",
          "filename": "20131013_usgs_olympic_wa_10TDT310990.las"
        }
      ]
    }
    

    Now save the file and run the pipeline with pdal pipeline lat_lng_WA_S_FIPS_ft_reprojection.pipeline.json. This will probably take a while, as it involves decompressing the input .laz file, reprojecting every point in the point cloud, and writing the output file to disk. Examining the output .las file with lasinfo 20131013_usgs_olympic_wa_10TDT310990.las shows the min/max coordinates in the new projection, as well as the z (elevation) values converted into feet:

    lasinfo 20131013_usgs_olympic_wa_10TDT310990.las 
    lasinfo (170203) report for 20131013_usgs_olympic_wa_10TDT310990.las
    reporting all LAS header entries:
      file signature:             'LASF'
      file source ID:             0
      global_encoding:            0
      project ID GUID data 1-4:   00000000-0000-0000-0000-000000000000
      version major.minor:        1.2
      system identifier:          'PDAL'
      generating software:        'PDAL 1.5.0 (Releas)'
      file creation day/year:     211/2017
      header size:                227
      offset to point data:       690
      number var. length records: 3
      point data format:          3
      point data record length:   34
      number of point records:    25204541
      number of points by return: 16034492 6849061 2147980 173008 0
      scale factor x y z:         0.01 0.01 0.01
      offset x y z:               800114.623206489603035 932554.57963061647024 814.270025000000487
      min x y z:                  800114.62 932554.58 814.27
      max x y z:                  803498.24 935938.24 2715.87
    

In the past, I’ve written about different remote desktop products/solutions, including NoMachine for remoting into traditional desktop machines, as well as a few alternatives for accessing and controlling Android devices from a computer. Today I’d like to mention TeamViewer, a product free for non-commercial use that supports computer-to-computer, Android-to-computer, and computer-to-Android connections.

The following screenshots were taken on a macOS Sierra machine remotely logged onto a rooted Samsung Galaxy S4 Android tablet.