
DISP-S1 Beta Acceptance Testing Instructions


This page contains instructions for performing Acceptance Testing for the DISP-S1 Beta delivery from the OPERA-ADT team. These instructions assume the user has access to the JPL FN-Artifactory, and has Docker installed on their local machine.

Acquiring the DISP-S1 Beta Docker Image

The image is currently hosted on JPL FN-Artifactory, which requires JPL VPN access and JPL credentials. You may also need to be added to the gov.nasa.jpl.opera.adt organization.

Once you have access, the container tarball delivery is available under general/gov/nasa/jpl/opera/adt/disp_s1/r3/dockerimg_disp_s1_beta.tar. Sample inputs and outputs are also available under general/gov/nasa/jpl/opera/adt/disp_s1/r3/delivery_data_small.tar. (Note: delivery_data_full.tar should not be used for Acceptance Testing due to its long runtime.)

Documentation for the delivery is under general/gov/nasa/jpl/opera/adt/disp_s1/r3/documents/.

Download both the image tarball and the sample data archive to a location on your local machine. This location will be referred to throughout these instructions as <DISP_S1_DIR>. Note that the sample data is almost 4 gigabytes, so the download from Artifactory may take some time.
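
If you prefer the command line to the Artifactory web UI, the files can also be fetched with curl; a sketch, assuming the FN-Artifactory base URL below is correct for your environment (you will be prompted for your JPL password):

curl -u <JPL_USERNAME> -O "https://artifactory-fn.jpl.nasa.gov/artifactory/general/gov/nasa/jpl/opera/adt/disp_s1/r3/dockerimg_disp_s1_beta.tar"

curl -u <JPL_USERNAME> -O "https://artifactory-fn.jpl.nasa.gov/artifactory/general/gov/nasa/jpl/opera/adt/disp_s1/r3/delivery_data_small.tar"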

Loading the image into Docker

The first step in running the DISP-S1 image is to load it into Docker via the following command:

docker load -i <DISP_S1_DIR>/dockerimg_disp_s1_beta.tar

This should add the Docker image to your local repository with the name opera/disp-s1 and the tag 0.2.
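
To confirm the load succeeded, you can list the image; the repository and tag shown should match the docker run commands used later in these instructions:

docker images opera/disp-s1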

Preparing the test data

Once the delivery_data_small.tar file is downloaded to your local machine, unpack it to <DISP_S1_DIR>:

tar -xvf delivery_data_small.tar

This will create a delivery_data_small directory within <DISP_S1_DIR> containing the following directories and files:

  • config_files/
  • dynamic_ancillary_files/
    • average_coherence/
    • ionosphere_files/
    • ps_files/
    • static_layers/
    • troposphere_files/
    • dem.tif
    • watermask.flg.aux
    • watermask.flg.rsc
  • golden_output/
    • forward/
      • compressed_slcs/
    • historical/
      • compressed_slcs/
  • input_slcs/

In order to execute the SAS, the input file directory, the runconfig, and an output location will be mounted into the container instance as Docker volumes. To help streamline this process, we recommend making the following changes to the delivery_data_small directory:

Create a directory named runconfig just under delivery_data_small, and move the existing runconfig YAML files into it:

mkdir -p <DISP_S1_DIR>/delivery_data_small/runconfig

mv <DISP_S1_DIR>/delivery_data_small/config_files/*.yaml <DISP_S1_DIR>/delivery_data_small/runconfig/

NOTE: There will be two runconfig files: one for the historical mode and one for the forward mode. These will be used as inputs for separate runs of the container.
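
After the move, the runconfig/ directory should contain the two YAML files referenced by the run commands in the sections below:

ls <DISP_S1_DIR>/delivery_data_small/runconfig

runconfig_forward.yaml    runconfig_historical.yaml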

Executing the DISP-S1 container on the sample datasets

Running the Forward case

Change directory into the delivery_data_small/ directory.

cd <DISP_S1_DIR>/delivery_data_small

NOTE: This documentation is based on runs made on macOS, where changes to the number of threads used were required. For the forward case, the threads_per_worker setting in the runconfig had to be changed from 16 to 8.
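
If your copy of the runconfig uses the same key formatting, an in-place edit such as the following can make that change (a sketch; the exact key/value spacing in the YAML is an assumption, so verify the file afterwards):

sed -i.bak 's/threads_per_worker: 16/threads_per_worker: 8/' runconfig/runconfig_forward.yaml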

We're now ready to execute the 'forward' DISP-S1 interface. Run the following command to kick off execution with the test assets:

NOTE: the relative path to the runconfig file must be specified in the docker run command

docker run --rm --user $(id -u):$(id -g) \
   --volume <DISP_S1_DIR>/delivery_data_small:/work \
   opera/disp-s1:0.2 disp-s1 runconfig/runconfig_forward.yaml

The docker container will output progress messages as it runs, e.g.:

[2023-10-30 22:31:15] INFO Found SLC files from 2 bursts main.py:77

...

Execution time for the small test case on macOS was 20 minutes.

When the docker run is finished, scratch/forward/ and output/forward/ directories will be created.

The output/forward directory will contain the product file:

-rw-r--r-- 1 jehofman staff 25540418 Oct 30 15:51 20221119_20221213.unw.nc

There will also be a matching .png file and a compressed_slcs/ directory:

-rw-r--r-- 1 jehofman staff 2609 Oct 30 15:50 20221119_20221213.unw.png

drwxr-xr-x 4 jehofman staff 128 Oct 30 15:52 compressed_slcs

The compressed_slcs/ directory contains compressed .h5 files:

-rw-r--r-- 1 jehofman staff 136952199 Oct 30 15:51 compressed_t042_088905_iw1_20221119_20221213.h5

-rw-r--r-- 1 jehofman staff 136953988 Oct 30 15:52 compressed_t042_088906_iw1_20221119_20221213.h5
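
As a quick sanity check before formal validation, you can list the top-level groups of the generated product with the Python environment inside the image; a sketch, assuming h5py is available in the image (the bundled validation script appears to use it) and that the container's working directory is /work, as the relative paths in the run commands imply:

docker run --rm --volume <DISP_S1_DIR>/delivery_data_small:/work \
    opera/disp-s1:0.2 python -c \
    "import h5py; print(list(h5py.File('output/forward/20221119_20221213.unw.nc', 'r')))"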

Running the Historical case

Change directory into the delivery_data_small/ directory.

cd <DISP_S1_DIR>/delivery_data_small

NOTE: For the historical case, further changes to the runconfig file were required on macOS. As in the previous case, threads_per_worker had to be changed from 16 to 8; additionally, n_workers had to be reduced from 4 to 2.
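
As with the forward case, these edits can be applied in place; a sketch, assuming the same key/value formatting in the YAML (the .bak backup suffix keeps sed portable between macOS and GNU sed):

sed -i.bak -e 's/threads_per_worker: 16/threads_per_worker: 8/' \
    -e 's/n_workers: 4/n_workers: 2/' runconfig/runconfig_historical.yaml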

We're now ready to execute the 'historical' DISP-S1 Interface.

NOTE: the relative path to the runconfig file must be specified in the docker run command

docker run --rm --user $(id -u):$(id -g) \
    --volume <DISP_S1_DIR>/delivery_data_small:/work \
    opera/disp-s1:0.2 disp-s1 runconfig/runconfig_historical.yaml

The docker container will output progress messages as it runs, e.g.:

[2023-10-31 17:28:45] INFO Found SLC files from 2 bursts main.py:77

...

Execution time for the small test case on macOS was a little over 11 minutes.

When the docker run is finished, scratch/historical/ and output/historical/ directories will be created.

The output/historical directory will contain two product files:

-rw-r--r-- 1 jehofman staff 13758104 Oct 31 10:39 20221119_20221201.unw.nc
-rw-r--r-- 1 jehofman staff 13758104 Oct 31 10:39 20221119_20221213.unw.nc

There will also be matching .png files and a compressed_slcs/ directory:

-rw-r--r-- 1 jehofman staff     2609 Oct 31 10:38 20221119_20221201.unw.png
-rw-r--r-- 1 jehofman staff     2609 Oct 31 10:39 20221119_20221213.unw.png
drwxr-xr-x 4 jehofman staff      128 Oct 30 15:52 compressed_slcs

The compressed_slcs/ directory contains two compressed .h5 files:

-rw-r--r-- 1 jehofman staff 136952199 Oct 31 10:39 compressed_t042_088905_iw1_20221119_20221213.h5
-rw-r--r-- 1 jehofman staff 136953988 Oct 31 10:40 compressed_t042_088906_iw1_20221119_20221213.h5

Running the Quality Assurance test

Now that we've successfully executed the SAS container and generated outputs, the last step is to perform a QA check against the expected outputs.

A Python program to compare DISP-S1 products generated by the DISP-S1 SAS against expected outputs ("golden datasets") is included in the Docker image. The script validate_product.py accepts two input files: the golden dataset and the test dataset.

The docker command to run this is:

docker run \
    --rm \
    --volume <local host directory>:/work \
    opera/disp-s1:0.2 \
    python /disp-s1/scripts/release/validate_product.py \
    --golden <path to golden dataset> \
    --test <path to test dataset>

For example, if the SAS was run using the example command above and the results are in the output/ directory, the validation program can be run as follows:

docker run --rm --volume <DISP_S1_DIR>/delivery_data_small:/work \
    opera/disp-s1:0.2 python \
    /disp-s1/scripts/release/validate_product.py \
    --golden golden_output/forward/20221119_20221213.unw.nc --test output/forward/20221119_20221213.unw.nc
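
The historical outputs can be validated the same way, once per product file; a sketch, assuming the golden historical products carry the same filenames as the generated ones:

docker run --rm --volume <DISP_S1_DIR>/delivery_data_small:/work \
    opera/disp-s1:0.2 python \
    /disp-s1/scripts/release/validate_product.py \
    --golden golden_output/historical/20221119_20221201.unw.nc \
    --test output/historical/20221119_20221201.unw.nc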

Currently, the small test case does not pass validation due to how embedded strings are compared: the runconfig text stored in the product metadata differs in length between the golden and test products, which surfaces as a dtype mismatch (see the traceback below). This error should be considered benign for this release.

Small-size test case validation output:

[2023-11-02 20:31:12] INFO     Comparing HDF5 contents...                           validate_product.py:481
                      INFO     Checking connected component labels...               validate_product.py:192
[2023-11-02 20:31:13] INFO     Test unwrapped area: 3100556/62045298 (4.997%)       validate_product.py:223
                      INFO     Reference unwrapped area: 3100556/62045298 (4.997%)  validate_product.py:224
                      INFO     Intersection/Reference: 3100556/3100556 (100.000%)   validate_product.py:225
                      INFO     Intersection/Union: 3100556/3100556 (100.000%)       validate_product.py:226
Traceback (most recent call last):
  File "/disp-s1/scripts/release/validate_product.py", line 521, in <module>
    compare(args.golden, args.test, args.data_dset)
  File "/disp-s1/scripts/release/validate_product.py", line 483, in compare
    compare_groups(hf_g, hf_t)
  File "/disp-s1/scripts/release/validate_product.py", line 68, in compare_groups
    compare_groups(
  File "/disp-s1/scripts/release/validate_product.py", line 77, in compare_groups
    _compare_datasets_attr(golden_dataset, test_dataset)
  File "/disp-s1/scripts/release/validate_product.py", line 115, in _compare_datasets_attr`
    raise ComparisonError(
ComparisonError: /metadata/pge_runconfig dtypes do not match: |S7269 vs |S7268