{"id":703,"date":"2018-03-21T13:20:33","date_gmt":"2018-03-21T12:20:33","guid":{"rendered":"http:\/\/jenacopterlabs.de\/?page_id=703"},"modified":"2018-03-22T22:04:58","modified_gmt":"2018-03-22T21:04:58","slug":"python-data-processing-in-pci-ii-cloud-masking-haze-removal-terrain-derivates-atcor","status":"publish","type":"page","link":"https:\/\/jenacopterlabs.de\/?page_id=703","title":{"rendered":"Satellite Data Preprocessing with Python\/PCI II &#8211; Cloud Masking, Haze Removal, Terrain Derivates  &#8211; ATCOR"},"content":{"rendered":"<p style=\"text-align: justify;\"><strong>GEO214 \/409 \/402 Python Data Preprocessing Code &#8211; Section II<\/strong><\/p>\n<p>22.03.2018: \u00a0Section II: MASKING\/HAZEREM\/TERSETUP\/ATCOR\/RESAMP processing<\/p>\n<p>Batch multispectral data pre processing for RapidEye and Sentinel-2 data (L1C) in PCI Geomatica 2017. A\u00a0reader for two of the sessions in GEOG214 and GEO402\/GEOG409 at FSU Jena (BSc Geography \/ MSc Geoinformatics).<\/p>\n<p><a href=\"http:\/\/jenacopterlabs.de\/?page_id=659\"><strong>Section I: NITF file import and reprojection in a given directory:<\/strong><\/a><\/p>\n<ol>\n<li>imports NITF RE L1B data to PIX files by also importing all metadata there is<\/li>\n<li>reprojects the files to UTM WGS84 CC,<\/li>\n<li>resamples to low resolution copies of the data files with 30 and 100m and<\/li>\n<li>reads header information and georef to text report<\/li>\n<li>reads simple band statistics to text report<\/li>\n<\/ol>\n<p><strong>Section II: \u00a0\u00a0<\/strong><strong>Cloud masking\/haze removal\/terrain Analysis &#8211; full processing in Atmospheric Correction package ATCOR.<\/strong><strong>\u00a0<\/strong><\/p>\n<ol>\n<li>Calculate a cloud, haze and water mask<\/li>\n<li>Remove the haze from satellite date<\/li>\n<li>Calculate DEM derivates: slope\/aspect\/shadow-cast layer\n<ol>\n<li>Optional: lowpass filter the DEM<\/li>\n<li>slope\/aspect\/shadow cast layer 
calculation<\/li>\n<\/ol>\n<\/li>\n<li>Atmospheric correction on the haze-corrected input, calculating the local incidence angle and outputting 16-bit scaled, topographically normalized reflectance.<\/li>\n<\/ol>\n<p><strong>Section III: \u00a0R statistics integration using rpy2 and numpy2<\/strong><\/p>\n<p><strong>Section IV: \u00a0 Integration of GRASS binaries into workflows\u00a0<\/strong><\/p>\n<p><strong>Section V: \u00a0 \u00a0 Integration of AGISOFT<\/strong><\/p>\n<p>&nbsp;<\/p>\n<p style=\"text-align: justify;\"><span style=\"text-decoration: underline;\"><strong>1. Loading libraries and defining the locale environment<\/strong><\/span><\/p>\n<p style=\"text-align: justify;\">Here we load the PCI libraries, define the locale environment and specify what information the user must provide on the CLI (data path and DEM).<\/p>\n<pre class=\"brush: python; title: ; notranslate\" title=\"\">\r\nprint &quot;******************* ATCOR Processing with RE Data **************************&quot;\r\n# \r\n# by SHese 22.3.2018 RE ATCOR processing \r\n# Script to loop through the files in a folder and apply the full atmospheric correction workflow:\r\n# - masking creates the cloud\/haze\/water masks\r\n# - hazerem uses the cloud\/haze masks for haze removal\r\n# - tersetup creates the terrain derivative files \r\n# - atcor runs with the tersetup\/hazerem files\r\n# ---------------------------------------------------------------------------\r\n# Import the required algorithms \r\n# ---------------------------------------------------------------------------\r\n# usage: py-atcor.py \/full\/directory\/to\/datafiles \/fulldir\/path\/andfilename-DEM.pix\r\nprint &quot;1. Importing PCI Libraries ... 
&quot;\r\n\r\nfrom pci.fun import *\r\nfrom pci.lut import *\r\nfrom pci.pcimod import *\r\nfrom pci.exceptions import *\r\nfrom pci.ortho import *\r\nfrom pci.fimport import fimport\r\nfrom pci.resamp import *\r\nfrom pci.masking import *\r\nfrom pci.hazerem import *\r\nfrom pci.tersetup import * # does not support multithreading, unfortunately\r\nfrom pci.atcor import * # uses large temp files in the temp directory - can potentially crash\r\nfrom sys import argv \r\n\r\nimport sys # These libraries are required to extract file names from folders\r\nimport os\r\nimport fnmatch\r\n\r\n# The following locale settings are used to ensure that Python is configured\r\n# the same way as PCI's C\/C++ code. \r\nimport locale\r\nlocale.setlocale(locale.LC_ALL, &quot;&quot;)\r\nlocale.setlocale(locale.LC_NUMERIC, &quot;C&quot;)\r\n\r\nscript, InFolder, DEMFile = argv # the user provides these on the CLI \r\n# usage: py-atcor.py \/full\/directory\/to\/datafiles \/fulldir\/path\/andfilename-DEM.pix\r\n# ------------------------------------------------------------------------\r\n# Set File variables\r\n# ------------------------------------------------------------------------\r\nprint &quot;2. Processing in folder path: &quot; + InFolder\r\n\r\n# InFolder = raw_input(&quot;Please enter full input folder path: &quot;) \r\n# interactive input would also work, but we don't want this here\r\n<\/pre>\n<p><span style=\"text-decoration: underline;\"><strong>2. Listing the input files to be processed:<\/strong><\/span><br \/>\nThis section collects the file names we want into our input file list. 
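The wildcard matching used here can be tried in isolation. A small sketch with made-up file names; note that output names such as ORTHO + 'MASKING.pix' no longer end in 'ORTHO.pix', so a rerun will not pick up earlier results:

```python
import fnmatch

# hypothetical folder contents - only names ending in 'ORTHO.pix' survive
names = ['scene1_ORTHO.pix', 'scene1_ORTHO.pixMASKING.pix', 'readme.txt']
matches = fnmatch.filter(names, '*ORTHO.pix')
print(matches)  # ['scene1_ORTHO.pix']
```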
It's important to get the file_filter right here so that the correct input files populate our for-loop later.<\/p>\n<pre class=\"brush: python; title: ; notranslate\" title=\"\">\r\n# ----------------------------------------------------\r\n# Create list of files to loop through from InFolder\r\n# ----------------------------------------------------\r\nprint &quot;3. Creating list of valid files ...&quot;\r\n# This line is a wildcard filter. In this case only ORTHO.pix files will be used\r\nfile_filter = &quot;*ORTHO.pix&quot; \r\n\r\n# This list will be populated with the full pathname of each matching file in InFolder\r\ninput_files_list = &#x5B;] \r\n\r\n# os.walk searches a directory tree and produces the file name of any file it encounters\r\n# r is the main directory\r\n# d is any subdirectory within r\r\n# f is the list of file names within r\r\nfor r,d,f in os.walk(InFolder):\r\n for file in fnmatch.filter(f, file_filter): # fnmatch.filter keeps only files matching file_filter\r\n print &quot;4. Found valid input file: &quot;, file\r\n input_files_list.append(os.path.join(r,file)) # The file name and full path are added to input_files_list\r\n<\/pre>\n<p><span style=\"text-decoration: underline;\"><strong>3. Cloud Masking using MASKING \u00a0<\/strong><\/span><br \/>\nNow we start the cloud masking algorithm from PCI: &#8220;MASKING&#8221; &#8211; the basic parameters are derived from expected TOA values that we can measure using the C0\/C1 calibration coefficients. This isn't extremely accurate, but it allows masking to avoid overcorrection at a later stage.<\/p>\n<pre class=\"brush: python; title: ; notranslate\" title=\"\"> \r\n for ORTHO in input_files_list:\r\n print &quot;5. Beginning Cloudmasking ...&quot;\r\n fili = ORTHO\r\n filo = ORTHO + &quot;MASKING.pix&quot;\r\n visirchn = &#x5B;1,3,5,0] # RE channels: blue, red, NIR; SWIR missing\r\n asensor = &quot;RapidEye&quot; # sensor name is RapidEye 
here - should be changed as appropriate\r\n cfile = &quot;\/opt\/geomatica_2017\/atcor\/cal\/rapideye_m1\/rapideye_mode1.cal&quot; # The path and file name of the text \r\n # file that contains, for each band, the calibration coefficients \r\n # used to transform the values from the image to absolute radiance values.\r\n znangle = &#x5B;] # taken from the metadata\r\n hazecov = &#x5B;15] # default is 50%, which is usually too much \r\n clthresh = &#x5B;] \r\n wuthresh = &#x5B;]\r\n srcbgd = &quot;&quot;\r\n masking(fili, srcbgd, asensor, visirchn, cfile, znangle, hazecov, clthresh, wuthresh, filo)\r\n<\/pre>\n<p>cfile = &#8220;\/opt\/geomatica_2017\/atcor\/cal\/rapideye_m1\/rapideye_mode1.cal&#8221;<br \/>\nThe path and file name of the text file that contains, for each band, the calibration coefficients used to transform the values from the image to absolute radiance values.<\/p>\n<p><span style=\"text-decoration: underline;\"><strong>4. Haze Removal using HAZEREM<\/strong><\/span><br \/>\nWith the cloud and haze masks stored as bitmap layers, we now apply the haze correction using HAZEREM.<\/p>\n<pre class=\"brush: python; title: ; notranslate\" title=\"\"> \r\n # ----------------------------------------------------------------------------------\r\n # Use HAZEREM algorithm with the cloud mask from MASKING to avoid overcorrection\r\n # ----------------------------------------------------------------------------------\r\n print &quot;6. 
Beginning HAZEREM ...&quot;\r\n fili = ORTHO # we use the original ORTHO output as input here, but don't mix it up with maskfili\r\n fili_pan = &quot;&quot; # we don't have a panchromatic band for RapidEye\r\n srcbgd = &quot;&quot;\r\n asensor = &quot;Rapideye&quot; # sensor name for RapidEye\r\n visirchn = &#x5B;] # this is taken from ASENSOR automatically\r\n chanopt = &quot;p,p,p,c,c&quot; # apply haze removal to channels 1-3 and copy the NIR channels unchanged\r\n maskfili = ORTHO + &quot;MASKING.pix&quot; \r\n maskseg = &#x5B;2,3,4] # hazerem expects the haze mask in segment 2, the cloud mask in 3 and the water mask in segment 4 \r\n hazecov = &#x5B;15] # we go as low as possible here - overcorrection should be avoided\r\n hazeflsz = &#x5B;]\r\n filo = ORTHO + &quot;MASKING-HAZEREM.pix&quot; \r\n filo_pan = &quot;&quot;\r\n ftype = &quot;PIX&quot;\r\n foptions = &quot;&quot;\r\n hazerem( fili, fili_pan, srcbgd, asensor, visirchn, chanopt, maskfili, maskseg, hazecov, hazeflsz, filo, filo_pan, ftype, foptions )\r\n<\/pre>\n<p><span style=\"text-decoration: underline;\"><strong>5. TERSETUP creates the DEM derivatives<\/strong><\/span><br \/>\nTo perform topographic normalization we need a DEM, slope\/aspect layers and illumination angles to calculate the local incidence angle. ATCOR does this on the fly, provided you supply the aspect\/slope layers - which is exactly what TERSETUP produces. ILLUMCAST can also calculate the local incidence angle, but it is not needed when we use TERSETUP. Note, however, that you might need an additional lowpass filtering run to reduce artefacts that often originate from edges in your DEM. 
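Conceptually, such an AVERAGE low-pass run simply replaces every DEM cell with the mean of its n x n neighbourhood. A minimal pure-Python sketch of the idea (my own illustration, not PCI code, assuming the DEM is a plain 2-D list):

```python
# n x n average (low-pass) filter on a 2-D grid; edge cells use
# only the neighbours that actually exist
def average_filter(dem, n=3):
    h = n // 2
    rows, cols = len(dem), len(dem[0])
    out = []
    for r in range(rows):
        row = []
        for c in range(cols):
            vals = [dem[i][j]
                    for i in range(max(0, r - h), min(rows, r + h + 1))
                    for j in range(max(0, c - h), min(cols, c + h + 1))]
            row.append(sum(vals) / float(len(vals)))
        out.append(row)
    return out

dem = [[0, 0, 0],
       [0, 9, 0],
       [0, 0, 0]]
print(average_filter(dem))  # the central spike of 9 is smoothed down to 1.0
```

Each additional pass smooths further, which is why a few runs can flatten sharp DEM edge artefacts.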
A few additional median or average low-pass filter runs can be helpful then:<\/p>\n<pre class=\"pre codeblock exampleCode\">From PCI online help: \r\n<a href=\"http:\/\/www.pcigeomatics.com\/geomatica-help\/references\/pciFunction_r\/python\/P_fme.html\">http:\/\/www.pcigeomatics.com\/geomatica-help\/references\/pciFunction_r\/python\/P_fme.html<\/a>\r\n\r\nfrom pci.fme import *\r\n\r\nfile	=	'input.pix'\r\ndbic	=	[10]	# elevation data channel\r\ndboc	=	[11]	# output channel\r\nflsz	=	[5,5]	# 5x5 filter\r\nmask	=	[]	# process entire image\r\nbgrange	=	[]\r\nfailvalu	=	[]\r\nbgzero	=	''\r\n\r\nfme( file, dbic, dboc, flsz, mask, bgrange, failvalu, bgzero )<\/pre>\n<p>or use an average filter:<\/p>\n<p><a href=\"http:\/\/www.pcigeomatics.com\/geomatica-help\/references\/pciFunction_r\/python\/P_fav.html\">http:\/\/www.pcigeomatics.com\/geomatica-help\/references\/pciFunction_r\/python\/P_fav.html<\/a><\/p>\n<pre class=\"pre codeblock exampleCode\">from pci.fav import fav\r\n\r\nfile	=	'input.pix'\r\ndbic	=	[1]	# elevation data channel\r\ndboc	=	[6]	# output channel\r\nflsz	=	[5,5]	# use a 5x5 filter\r\nmask	=	[]	# bitmap mask\r\nbgrange	=	[0]	# remove background values\r\nfailvalu	=	[]\r\nbgzero	=	''	# default, set background to 0\r\n\r\nfav( file, dbic, dboc, flsz, mask, bgrange, failvalu, bgzero )<\/pre>\n<pre class=\"brush: python; title: ; notranslate\" title=\"\">\r\n        # ----------------------------------------------------------------------------------\r\n        # Use TERSETUP algorithm to prepare terrain derivatives: slope, aspect, and skyview rasters from an input DEM.\r\n        # ----------------------------------------------------------------------------------\r\n        print &quot;7. 
Beginning TERSETUP ...&quot;\r\n        filedem	=	DEMFile	# input DEM file from the CLI, e.g. ASTER GDEM or SRTM 30 m\r\n        dbec	=	&#x5B;1]	# the input DEM is expected in channel 1 here\r\n        terfile	=	&quot;tersetup.pix&quot;	# output terrain image file - we use a temp file here\r\n        backelev	=	&#x5B;]	# search for the No Data value in the metadata\r\n        elevref	=	&quot;ELLIPS&quot;	# elevation reference: ellipsoid (alternative: MSL, Mean Sea Level)\r\n        elevunit	=	&quot;METER&quot;	# default, METER\r\n        elfactor	=	&#x5B;]	# default, &#x5B;0.0,1.0]\r\n        tersetup( filedem, dbec, terfile, backelev, elevref, elevunit, elfactor )\r\n<\/pre>\n<p>ILLUMCAST\u00a0is not needed by ATCOR when TERSETUP\u00a0is run &#8211; ATCOR calculates the local incidence angle on the fly from the TERSETUP output.<\/p>\n<p><span style=\"text-decoration: underline;\"><strong>6. ATCOR &#8211; Atmospheric Correction &#8211; calculating scaled reflectance<\/strong><\/span><br \/>\nNow we have all the input layers for ATCOR. MASKING and HAZEREM provided the cloud\/haze\/water masks and the haze-corrected dataset, and TERSETUP created the DEM derivatives needed for topographic normalization.<\/p>\n<pre class=\"brush: python; title: ; notranslate\" title=\"\">\r\n        # ----------------------------------------------------------------------------------\r\n        # ATCOR algorithm \r\n        # ----------------------------------------------------------------------------------\r\n        print &quot;8. 
Beginning ATCOR ...&quot;  \r\n        \r\n        fili	=	ORTHO + &quot;MASKING-HAZEREM.pix&quot;\r\n        dbic	=	&#x5B;1,2,3,4,5]\r\n        srcbgd	=	&quot;&quot;\r\n        asensor	=	&quot;Rapideye Mode1&quot;\r\n        cfile	=	&quot;\/opt\/geomatica_2017\/atcor\/cal\/rapideye_m1\/rapideye_mode1.cal&quot; # this path is Linux-specific - change it to match your setup\r\n        maskfili	= ORTHO + &quot;MASKING.pix&quot; \r\n        terfile	=	&quot;tersetup.pix&quot;\r\n        illufile	=	&quot;&quot;  \r\n        meanelev	=	&#x5B;10]\r\n        vistype	=	&quot;constant,25.0&quot;    \r\n        visfilo	=	&quot;&quot;\r\n        atmdef	=	&quot;Urban&quot;             # needs modification for every project\r\n        atmcond	=	&quot;summer&quot;            # needs modification for every project\r\n        satilaz	=	&#x5B;]                  # taken from the metadata by ATCOR - should be left empty\r\n        sazangl	=	&#x5B;]                  # also taken from the metadata by ATCOR, \r\n                                            # note: solar zenith = 90 degrees - solar elevation\r\n        adjacen	=	&quot;ON,5&quot;               \r\n        brdffun	=	&#x5B;]\r\n       \r\n        terrefl	=	&#x5B;]             \r\n        # Optionally specifies the number of iterations used for the terrain reflectance correction. 
use 0 to switch it off\r\n        outunits	=	&quot;16bit_Reflectance&quot; # we want 16-bit reflectance to keep it simple \r\n        filo	=	ORTHO + &quot;MASKING-HAZEREM-ATCOR.pix&quot;\r\n        ftype	=	&quot;PIX&quot;\r\n        foptions=	&quot;&quot;\r\n        try: atcor( fili, dbic, srcbgd, asensor, cfile, maskfili, terfile, illufile, meanelev, vistype, visfilo,\\\r\n        atmdef, atmcond, satilaz, sazangl, adjacen, brdffun, terrefl, outunits, filo, ftype, foptions )\r\n\r\n        except PCIException, e: print e\r\n        except Exception, e: print e\r\n        \r\n        print &quot;   Beginning RESAMP 100m ...&quot;        \r\n        fili = ORTHO + &quot;MASKING-HAZEREM-ATCOR.pix&quot;\r\n        filo = ORTHO + &quot;MASKING-HAZEREM-ATCOR-RESAMP100m.pix&quot;\r\n        dbic	=	&#x5B;1,2,3,4,5]\r\n        dbsl	=	&#x5B;]\r\n        sltype	=	&quot;ALL&quot;\r\n        ftype	=	&quot;PIX&quot;\r\n        foptions	= ' '\r\n        pxszout	=	&#x5B;100, 100]	# output pixel size of 100 m\r\n        resample	=	&quot;CUBIC&quot;	# Cubic Convolution method\r\n\r\n        try: resamp(fili, dbic, dbsl, sltype, filo, ftype, foptions, pxszout, resample )\r\n        except PCIException, e: print e\r\n        except Exception, e: print e\r\nprint &quot;END: ATCOR processing finished&quot;\r\n        \r\n\r\n<\/pre>\n<p style=\"text-align: justify;\"><strong>\u00a0Some comments that I removed from the script for better readability (partly taken from the PCI Online Help):<\/strong><\/p>\n<p style=\"text-align: justify;\">ATCOR receives as input a haze-free image; the cloud, cloud-shadow, water, ice\/snow and saturated-pixel masks; and the terrain derivatives and illumination map for each scene to process. From these it computes the visibility map and applies the atmospheric LUT to the scene, so that the atmospheric addition to the DN of each pixel can be removed. 
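Schematically, the radiometric chain starts from the per-band calibration coefficients in the .cal file: radiance L = c0 + c1 * DN, and top-of-atmosphere reflectance is then pi * L * d^2 / (Esun * cos(solar zenith)). A toy sketch with made-up coefficients (not RapidEye calibration values, and only the TOA step - ATCOR itself goes further and removes the atmospheric terms):

```python
import math

def toa_reflectance(dn, c0, c1, esun, sun_zenith_deg, d=1.0):
    # radiance from the per-band calibration coefficients (c0, c1)
    radiance = c0 + c1 * dn
    # top-of-atmosphere reflectance; d is the earth-sun distance in AU
    return math.pi * radiance * d * d / (esun * math.cos(math.radians(sun_zenith_deg)))

# made-up numbers, purely for illustration
r = toa_reflectance(dn=1000, c0=0.0, c1=0.01, esun=1550.0, sun_zenith_deg=30.0)
print(round(r, 4))  # 0.0234
```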
ATCOR allows users to prepare their data for analysis, such as GCP collection, segmentation, classification, or extraction of vegetation indices. ATCOR depends heavily on the quality of the visibility map and the accuracy of the atmospheric tables (based on the MODTRAN code), as well as on the quality of the haze-free scene and terrain derivatives.<\/p>\n<p style=\"text-align: justify;\"><strong>maskfili = ORTHO + &#8220;MASKING.pix&#8221;<\/strong><br \/>\nspecifies the name of the input file that contains the haze and cloud masks produced by MASKING for the reference image. The specified file must exactly match the geocoding of the input image and must contain the haze and cloud masks. ATCOR assumes the first bitmap found to be the haze mask, and the second to be the cloud mask.<\/p>\n<p style=\"text-align: justify;\"><strong>illufile = &#8220;&#8221;<\/strong><br \/>\nOptionally specifies the name of the input illumination file that contains the illumination map and topographic cast shadow bitmap produced by ILLUMCAST for the input image. If the illumination file is not provided, the information is computed on-the-fly when a Terrain Derivatives file (TERFILE) is provided.<\/p>\n<p style=\"text-align: justify;\"><strong>adjacen = &#8220;ON,5&#8221;<\/strong><br \/>\nOptionally specifies whether a correction for the adjacency effect is required. This parameter is composed of a keyword and an optional filter size. Available options are:<br \/>\nOFF: no adjacency effect correction is applied,<br \/>\nON,n: the correction for adjacency is applied using an n&#215;n kernel size, where n is an odd number between 3 and 39,<br \/>\nON: the correction for adjacency is applied using a 3&#215;3 kernel size.<br \/>\nThe adjacency effect is radiation reflected from the neighborhood and scattered into the viewing direction. 
The effect is a result of atmospheric scattering; it depends on the reflectance contrast between a target pixel and its large-scale neighborhood and decreases with wavelength. It reduces the apparent surface contrast by decreasing the top-of-the-atmosphere radiance over bright pixels and increasing the brightness of dark pixels. The adjacency effect causes a certain amount of blurring, known as crosstalk. This effect is often noticeable at the boundary of adjacent features; for example, roads passing through a forest may appear blurred. The adjacency effect is especially important for sensors of high spatial resolution, such as RapidEye, KazEOSat-2 and SPOT, and is usually negligible for low spatial resolution sensors such as AVHRR and Landsat.<\/p>\n<p style=\"text-align: justify;\"><strong>brdffun = []<\/strong><br \/>\nThe parameter takes the form [FUNCTION, THRSANG, LBOUND] where:<\/p>\n<p style=\"text-align: justify;\">FUNCTION: a number between 0 and 4, each representing a different correction function<\/p>\n<p style=\"text-align: justify;\">THRSANG: the threshold illumination angle (by default, set as the solar zenith angle)<\/p>\n<p style=\"text-align: justify;\">LBOUND: indicates the lowest correction applied. 
By default, set to 0.25, meaning that the pixel values will be changed at most by 75%, in order to avoid overcorrection.<\/p>\n<p style=\"text-align: justify;\">FUNCTION can take one of five values, which select the following correction functions:<\/p>\n<p style=\"text-align: justify;\">0: no BRDF correction is applied,<\/p>\n<p style=\"text-align: justify;\">1: sets the correction factor as a linear function based on the illumination angle (i), defined as cos(i)\/cos(THRSANG),<\/p>\n<p style=\"text-align: justify;\">2: sets the correction factor as an exponential function, defined as sqrt[cos(i)\/cos(THRSANG)],<\/p>\n<p style=\"text-align: justify;\">3: sets the correction factor as a linear function based on the illumination and exitance angles, defined as cos(i)*cos(e)\/cos(THRSANG),<\/p>\n<p style=\"text-align: justify;\">4: sets the correction factor as an exponential function based on the illumination and exitance angles, defined as sqrt[cos(i)*cos(e)\/cos(THRSANG)]. This function is relevant only for tilt sensors.<\/p>\n<p style=\"text-align: justify;\">When a Terrain Derivatives (TERFILE) file is specified, this parameter is set to 2 by default. The effect of the threshold illumination angle is scene-dependent and varies with the local topography. 
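The five FUNCTION settings can be written out as a small illustration (my own sketch, not PCI code; the LBOUND clamp keeps the factor from dropping below the lowest allowed correction):

```python
import math

def brdf_factor(function, i_deg, thrsang_deg, e_deg=0.0, lbound=0.25):
    # correction factor for FUNCTION 0-4; clamped from below by LBOUND
    ci = math.cos(math.radians(i_deg))        # illumination angle i
    ce = math.cos(math.radians(e_deg))        # exitance angle e
    ct = math.cos(math.radians(thrsang_deg))  # threshold angle THRSANG
    if function == 0:
        return 1.0                  # no BRDF correction
    elif function == 1:
        g = ci / ct
    elif function == 2:
        g = math.sqrt(ci / ct)
    elif function == 3:
        g = ci * ce / ct
    else:                           # function 4, tilt sensors only
        g = math.sqrt(ci * ce / ct)
    return max(g, lbound)

print(brdf_factor(1, 89.0, 45.0))  # extreme slope: clamped to the LBOUND of 0.25
```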
If the topographic correction must be increased on the steepest slopes, the following rules are recommended for defining the threshold angle:<br \/>\nif the solar zenith angle is less than 20 degrees, THRSANG = Solar Zenith + 20 degrees,<br \/>\nif the solar zenith angle is between 20 and 45 degrees, THRSANG = Solar Zenith + 15 degrees,<br \/>\nif the solar zenith angle is higher than 45 degrees, THRSANG = Solar Zenith + 10 degrees.<\/p>\n<p>&nbsp;<\/p>\n<p>tbc<\/p>\n","protected":false},"excerpt":{"rendered":"<p>GEO214 \/409 \/402 Python Data Preprocessing Code &#8211; Section II 22.03.2018: \u00a0Section II: MASKING\/HAZEREM\/TERSETUP\/ATCOR\/RESAMP processing Batch multispectral data pre processing for RapidEye and Sentinel-2 data (L1C) in PCI Geomatica 2017. A\u00a0reader for two of the sessions in GEOG214 and GEO402\/GEOG409 at FSU Jena (BSc Geography \/ MSc Geoinformatics). Section I: NITF file import and reprojection in a given directory: imports&#46;&#46;&#46;<\/p>\n","protected":false},"author":1,"featured_media":0,"parent":18,"menu_order":6,"comment_status":"closed","ping_status":"closed","template":"","meta":{"_uag_custom_page_level_css":"","footnotes":""},"class_list":["post-703","page","type-page","status-publish","hentry"],"uagb_featured_image_src":{"full":false,"thumbnail":false,"medium":false,"medium_large":false,"large":false,"1536x1536":false,"2048x2048":false,"post-thumbnail":false,"cd-small":false,"cd-medium":false,"cd-standard":false},"uagb_author_info":{"display_name":"S\u00f6ren Hese","author_link":"https:\/\/jenacopterlabs.de\/?author=1"},"uagb_comment_info":0,"uagb_excerpt":"GEO214 \/409 \/402 Python Data Preprocessing Code &#8211; Section II 22.03.2018: \u00a0Section II: MASKING\/HAZEREM\/TERSETUP\/ATCOR\/RESAMP processing Batch multispectral data pre processing for RapidEye and Sentinel-2 data (L1C) in PCI Geomatica 2017. A\u00a0reader for two of the sessions in GEOG214 and GEO402\/GEOG409 at FSU Jena (BSc Geography \/ MSc Geoinformatics). 
Section I: NITF file import and reprojection&hellip;","_links":{"self":[{"href":"https:\/\/jenacopterlabs.de\/index.php?rest_route=\/wp\/v2\/pages\/703","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/jenacopterlabs.de\/index.php?rest_route=\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/jenacopterlabs.de\/index.php?rest_route=\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/jenacopterlabs.de\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/jenacopterlabs.de\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=703"}],"version-history":[{"count":48,"href":"https:\/\/jenacopterlabs.de\/index.php?rest_route=\/wp\/v2\/pages\/703\/revisions"}],"predecessor-version":[{"id":762,"href":"https:\/\/jenacopterlabs.de\/index.php?rest_route=\/wp\/v2\/pages\/703\/revisions\/762"}],"up":[{"embeddable":true,"href":"https:\/\/jenacopterlabs.de\/index.php?rest_route=\/wp\/v2\/pages\/18"}],"wp:attachment":[{"href":"https:\/\/jenacopterlabs.de\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=703"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}