How to handle and process air pressure data for loading effects

Current issues (Feb. 2010)

The archive with most of the tar-files of ECMWF surface pressure is on

If data sets are not found under ./gap${year}, aploading would fetch them via ftp.
We don't netmount the disks any more.
Thus, the retrieval method must be changed to scp.
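A minimal dry-run sketch of what the scp-based retrieval could look like. The host name here is a placeholder (the real archive host is not given above); the remote directory is taken from the archive-disk list further down.

```shell
# Sketch: build the scp fetch commands for one year's monthly ECMWF tar files.
# HOST is a hypothetical placeholder; echo makes this a dry run only.
HOST=archive-host
REMOTE_DIR=/GNSS_archive/misc/hgs2/ECMWF
year=2006
yy=$(printf '%02d' $((year % 100)))
CMDS=$(for mm in 01 02 03 04 05 06 07 08 09 10 11 12; do
    f="GE${yy}${mm}.tar"
    # fetch only what is not already under gap${year}
    [ -f "gap${year}/$f" ] || echo "scp ${HOST}:${REMOTE_DIR}/$f gap${year}/"
done)
echo "$CMDS"
```

Dropping the echo would execute the transfers; keeping it lets one inspect the file list first.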

A command line that worked 2010-Feb-15:
cd ~/Apload
aploading -ecmwf -v KUUS -l apls/aploading.log -d apls/ -O \'-s /home/hgs/Oload/smhi/itrf_97may25.stacov \' 2006 jan

News (rather old news)


PROBLEM: All WWW/*.ts still have the offset unremoved for all of 2003.
cd Apload
tsf2ts -x -i WWW -o WWW site [site ... ]
setenv OFFS_RANGE "From 2003 01 01 0 0 0 0 To 2003 12 31 19 0 0 0"
# check that WWW/ exists and is o.k.
ts-sub-offset -i WWW -o WV site [site ... ]
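For orientation, what ts-sub-offset conceptually does can be illustrated with plain awk on the ascii .tsf format (the real tool works on the binary .ts files; the offset values here are made up):

```shell
# Illustration only, with hypothetical offsets: subtract a constant 3-D offset
# from all samples whose time tag falls inside the OFFS_RANGE window.
out=$(awk -v from=2003010100 -v to=2003123119 -v dx=0.01 -v dy=0 -v dz=0 '
BEGIN { n = split("jan feb mar apr may jun jul aug sep oct nov dec", m)
        for (i = 1; i <= n; i++) mon[m[i]] = sprintf("%02d", i) }
/^#/ { # build a sortable yyyymmddhh key from tags like 2003sep01:00
       key = substr($2,1,4) mon[substr($2,5,3)] substr($2,8,2) substr($2,11,2)
       if (key + 0 >= from && key + 0 <= to) { $3 -= dx; $4 -= dy; $5 -= dz }
       printf "%s %s %8.5f %8.5f %8.5f\n", $1, $2, $3, $4, $5 }' <<EOF
#1   2003sep01:00   0.03312  -0.00481   0.01018
#2   2004sep01:06   0.03296  -0.00477   0.01022
EOF
)
echo "$out"
```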

2000-MAR-10   > apl_reduce_stacov

To reduce (unmapped) stacov files for air pressure loading effects

2000-JAN-28.. > aploading

    aploading     job command lines must include one of the options
    -smhi, -ecmwf or -noaa (see below).
These options tell aploading how to unpack the data:
SMHI: each gzipped file GEyymm.asc.gz contains one ascii file per month
ECMWF: each tar file GEyymm.tar contains typically 120 files GEyymmddqq.ASC
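The two unpack modes can be demonstrated with dummy files; the field contents below are fake, only the GEyymm / GEyymmddqq naming follows the text above:

```shell
# Demonstration of the two unpack modes on made-up data.
mkdir -p apdemo && cd apdemo

# SMHI: one gzipped ascii file per month
echo 'smhi field' > GE9501.asc
gzip -f GE9501.asc                       # leaves GE9501.asc.gz
gunzip -c GE9501.asc.gz > GE9501.asc     # the -smhi unpack step

# ECMWF: a tar file holding the 6-hourly GEyymmddqq.ASC sets
echo 'ecmwf field' > GE00010100.ASC
tar cf GE0001.tar GE00010100.ASC && rm GE00010100.ASC
tar xf GE0001.tar                        # the -ecmwf unpack step
ls GE9501.asc GE00010100.ASC
```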

The BIFROST standard job with ECMWF-data is now:

      nice aploading -ecmwf -s -v OSKA -d apls/ -O \'$* -s Bifrost.stacov \' 2000 jun

(implying    nice run_olfg -s Bifrost.stacov -ts apls/ 2000 jun 01    etc.)

      fgrep 'STA' $REFF.stacov > Bifrost.stacov

2000-JAN-01.. > Get data from our new archives

   There are two archive disks with air pressure data:
      /GNSS_archive/misc/hgs1  (/SMHI)   (only needed for GE95??.asc.gz )
      /GNSS_archive/misc/hgs2  (/SMHI)  (/ECMWF)



   This is a typical job to run a MARS retrieval for a succession of months:

   tcsh (it works with history!)
   cd $SCRATCH
   foreach dt ( 199803 199804 199805 199806 199807 199808 199809 199810 199811 199812 )
      source ~/p/m-aixrq.env
   end

   and when the jobs of the loop are ready,
   foreach dt ( 199803 199804 199805 199806 199807 199808 199809 199810 199811 199812 )
      source ~/p/m-geplrq.env
   end
   gzip GEyymm*.ASC

   tar cfv GEyymm.tar GEyymm*.ASC.gz
      scp GEyymm.tar
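The yyyymm succession for the foreach list can also be generated instead of typed by hand. A small helper in sh syntax (under tcsh, paste just the awk part):

```shell
# Generate n consecutive yyyymm tags starting at a given year and month.
months() {
    awk -v y="$1" -v m="$2" -v n="$3" 'BEGIN {
        for (i = 0; i < n; i++) {
            printf "%04d%02d ", y, m
            if (++m > 12) { m = 1; y++ }   # roll over into the next year
        }
        printf "\n" }'
}
months 1998 3 10
```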

   Transfer by ftp: when the whole job is complete, start ftp on the local host:
   cd $MYAR/ECMWF  # or wherever you want to store the results

   mget GEyy??.tar

  (use mget because the mput function is not working at ecgate1/ftp; is that remark still accurate?)

My user name at ECMWF: sub
My home directory: /home/ms/se/sub

P.M.: ECMWF cert: the fettlind dream of 1

Retrieve time series from observations

Edit script, run mars job and unpack BUFR
  emacs -nw ~/p/m-aixrq-obs
  llsubmit ~/p/m-aixrq-obs
# that one can take a long time. Adjust hard CPU limit.

  ln -s $filename input_datafile
  touch obs.dat

  more time_series

Compile a program
cd ~/p
  f77 $ECLIB $EMOSLIB bfpress.f
  mv a.out ~/bin/bfpress.x

Loading jobs

    Older jobs can be seen in the shell scripts
     nice aploading -ecmwf -s -v WESTFORD -d new-vlbi-apl-s/ -O \
          \'$* -lc core-vlbi-apl/local.pressure.2001 -nocmc \
               -v /home/hgs/Calc/\' \
               2001 aug sep oct nov dec

    Look for a recently updated  run_aploading_4*  file!

    Observe that the local pressure is collected using the -lc option in the option string
    that is conveyed to run_olfg.
    Add option -k after -ecmwf to preserve the 6h ascii data sets. The space requirement is
    large  though!

    Examples for newer jobs (year 2000...) that worked are supposed to be collected in
     aploading -ecmwf -v KUUS -l /home/hgs/Oload/smhi/apls/aploading.log \
               -d apls/ -O \'-s itrf_97may25.stacov\' 2000 oct nov dec

  (VLBI: NO! rather check run_aploading_4*)

    The loading effects to the ECMWF-fields are supposed to be collected in subdirectories

    etc., where the trailing "s" reflects the fact that the data was once
    decomposed into spherical harmonics. It is also surface pressure rather
    than sea level pressure; fields reduced to sea level are coded with an "r".

    We have basically two kinds of loading jobs, BIFROST-GPS and VLBI.
    The associated site files are

    Wouter van der Wal did a couple of SLR jobs. We have to relocate the results.

 Time series files (output by aploading / run_olfg)

    The time series output by aploading are a bit special:
#1   1993sep01:00   0.03312  -0.00481   0.01018
#2   1993sep01:06   0.03296  -0.00477   0.01022
#3   1993sep01:12   0.03225  -0.00483   0.01025
#4   1993sep01:18   0.03248  -0.00473   0.01025
#1   1993sep02:00   0.03253  -0.00472   0.01006
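Before the binary route, the special time tags can be turned into regular numeric columns with plain awk (this is only a sketch, not part of the tslist/tsf2ts machinery):

```shell
# Convert tags like 1993sep01:00 into "year month day hour" columns,
# keeping the three displacement values; sample data from above.
out=$(awk 'BEGIN { split("jan feb mar apr may jun jul aug sep oct nov dec", m)
                   for (i = 1; i <= 12; i++) mon[m[i]] = i }
/^#/ { t = $2
       printf "%s %02d %s %s  %s %s %s\n", substr(t,1,4), mon[substr(t,5,3)],
              substr(t,8,2), substr(t,11,2), $3, $4, $5 }' <<EOF
#1   1993sep01:00   0.03312  -0.00481   0.01018
#2   1993sep01:06   0.03296  -0.00477   0.01022
EOF
)
echo "$out"
```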

They are intended to be read by tslist and stored in binary form (tsf2ts), which also removes duplicates.
However, in order to keep the process short and obtain regular, numerical time columns, you may use the recipes below.

Collect ascii data and form binary mc-files (also 24h-decimated)

If we have all in one directory, the first job is easy: cd to that dir and
  tsf2ts -x . site site site ...

If we have annual directories and want to collect multi-year binaries,
  touch start-help.tsf # empty file
  tsf2ts -x -sth -I -o apl_out -C 'apl_[12][90]??' apl_temp site site site ...

  tsf2ts -x -sth -I -o apl_out -C 'apl_[12][90]??' apl_temp

for all sites. The data will first be collected in ascii in apl_temp.

  24h-decimate -B apl_out site site site...

where -B signifies binary data and that the input directory equals the output directory, apl_out . See 24h-decimate -h for help.
You need a tse-file  ap_decimate.tse
DECIMATE 4 0 G=0.5 6
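Under an assumed reading of DECIMATE 4 (collapse every four 6-hourly samples into one daily value), the effect resembles a plain group average; the real tool applies a Gaussian weighting (G=0.5), which this sketch ignores:

```shell
# Illustration only, assumed semantics: average every 4 samples of column 2.
out=$(awk '{ s += $2; if (NR % 4 == 0) { printf "%.5f\n", s / 4; s = 0 } }' <<EOF
1 0.04
2 0.02
3 0.00
4 0.02
EOF
)
echo "$out"
```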

Determining site coefficients for vertical

Here is a basic scheme. You need to make sure you get simultaneous time series into the two files.

     awk '/WETTZELL/{print $5}' ./core-vlbi-apl/local.pressure.2001 > ser_loc.txt

     tslist .../WETTZELL_apl.ts -Ff10.4 -Bc2001,1,1 | fgrep -v '>' |\
            awk '{print $2}' > ser_apl.txt

     paste ser_loc.txt ser_apl.txt | fitxym.out -SXY -Ey0.001 -Ex0.1

If the awk above returns double samples, the following might be useful:
     fgrep 'WETTZELL' ./core-vlbi-apl/local.pressure.2001 |\
       tslist - -g'(5x,a9,1x,i2,t31,f10.0)' -d'(i4,i3,i2)' -k1 -Ff10.4 |\
          awk '\!/>/{print $2}' > ser_apl.txt
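The fit performed by fitxym.out can be roughed out as an ordinary least-squares slope; this awk sketch shows the arithmetic on two pasted columns (local pressure in hPa, loading in m) using made-up numbers:

```shell
# Ordinary least-squares slope of column 2 against column 1 (dummy data).
out=$(awk '{ n++; sx += $1; sy += $2; sxx += $1 * $1; sxy += $1 * $2 }
END { printf "slope %.4f\n", (n * sxy - sx * sy) / (n * sxx - sx * sx) }' <<EOF
1000.0 0.010
1010.0 0.014
1020.0 0.018
EOF
)
echo "$out"
```

Unlike fitxym.out this ignores the error weighting (-Ey, -Ex); it is only meant to show what the coefficient represents.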

Determining offsets

> compute the average difference of two time series
      setenv SITE ONSA
      tslist apl/${SITE}.ts -U1997,10,1 -Ediff.tse,DIFF] |\
         awk '\!/>/{a+=$2; print $0}; END{print "mean:",a/NR}'

   with diff.tse:
      OPEN 22 ^ apls/${SITE}_apl.ts
      22, 'BIN]',-99999.0,'L:RAD', 0,  0,  'N', -1.d0

   or using a shell script for a lot of sites
      ts-offsets apls apl >! apls/
      ts-offsets '-U1997,10,1' new-vlbi-apl-s new-vlbi-apl-r >! new-vlbi-apl-s/
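When both series are already available as plain two-column ascii (time value), the average difference can also be formed with paste; dummy data for illustration:

```shell
# Mean difference of two aligned two-column series (made-up values).
printf '1 0.10\n2 0.12\n' > ser_a.txt
printf '1 0.07\n2 0.09\n' > ser_b.txt
out=$(paste ser_a.txt ser_b.txt |
      awk '{ d += $2 - $4 } END { printf "mean: %.5f\n", d / NR }')
echo "$out"
```

Note this assumes the two files have identical, simultaneous sampling; the tse/ts-offsets route above handles the general case.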

Using offsets between the SMHI and ECMWF sets

> first, you must create ts files from tsf files:
      tsf2ts apls
> before you can use ts-join97.
> join two time series and compensate offset:
      ts-join97 apl apls aplj ONSA

   (uses tsfedit with control file apls/join.tse and the apls/ file from above.)

> If you want to work with tsf files, e.g.
> to append the VLBI ftp data base

     uncompressdir $MYFTP/apload/core-vlbi-apl
     tsf-sub-offset -a -d 2000jan new-vlbi-apl-s $MYFTP/apload/core-vlbi-apl
     compressdir $MYFTP/apload/core-vlbi-apl
     cp -Rf $MYFTP/apload/core-vlbi-apl $MYAR/apload/

The whole job for BIFROST:

     aploading -ecmwf -v KUUS -l /home/hgs/Oload/smhi/apls/aploading.log \
      -d apls/ -O \'-s itrf_97may25.stacov\' 2000 oct nov dec
     tsf2ts apls
     ts-join97 apl apls aplj
     24h_decimate -B aplj

Making EPHEDIS records

Use /home/bin/EPHEDIS -h
This script uses the -TEPH option in tslist and adds summary statistics to the file header.

Long-term-mean pressure fields

sorry, doc still missing

Producing an archive of daily pressure loading displacements in geocentric XYZ


For some time there was a timing problem: the daily hour tags ran 18 00 06 12 instead of 00 06 12 18, while the sequence of samples itself was correct.
Here is an awk command to correct the problem:

   awk '{t=substr($0,16,2)+6; if (t>=24){t=t-24}; u=t/6+1; printf "#%s%s%2.2i %s\n",u,substr($0,3,13),t,substr($0,18)}' core-vlbi-apl/ALGOPARK_apl.tsf > core-vlbi-apl/ALGOPARK_apl.tsg


> Proceed with CMC
  olfg applies a physical factor (motion specifies CMC in metres).
  From 1997may01 on, the file names written by olfg are
  The file names
  are reserved for COM parameters in pressure units. The conversion
  to CMC is with the factor   (-G rhow a /g) * (100/g/rhow) =
 -4.4635E-02 (m/mWL) * 9.8968E-03 (mWL/hPa) = -4.4174E-04 m/hPa
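The quoted factor can be checked with a one-liner:

```shell
# Verify: -4.4635E-02 (m/mWL) * 9.8968E-03 (mWL/hPa) = -4.4174E-04 m/hPa
out=$(awk 'BEGIN { printf "%.4e\n", -4.4635e-02 * 9.8968e-03 }')
echo "$out"
```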

  The files *noib.cmc and *ocib.cmc are produced with apmean/apmeanm.f
  (Obviously, one can extend the geocenter files and the mean field
  at the same time.) apmean can be programmed to skip the accumean part;
  this avoids adding any field more than once if all goes well.

    apmean -smhi -O \'-m 1993,08,01\' -o apmean_ \
           -c topo.flg geoc/ocib.tsf geoc/noib.tsf \
           apmean_upto970430.dat \
           1997 may jun jul aug sep oct

  That job had some special options:
  -O passes options to apmeanm.out, which are -m date
  Reason: A bug fix; the old file apmean_upto970430.dat had an illegal
  date table. -m creates one from the date for nm fields ahead.
  nm would normally be specified after an -n option, but in this case
  nm was still readable from the file.


> Repair/rescale/reorder the {no|oc}ib.cmc time-series:

    apltsm.out -DC -eAPL]_Rwd -s 0.001 -i airp_ocib.cmc -o geoc/ocib.tsf

  It involves the (default) TSF-edit command file ./apl.tse
  which for the purpose of rescaling from [hPa] to [m] units contains

  MUL -0.4418 From 1993 01 01 0 0 0 0  To  1997 04 30 0 0 0 0

  ~/Oload/smhi/m/apltsm.f  might be the beginning of a versatile
  3-D time-series editor.  apltsm -h  gives more information.


  For the ib.cmc time-series, the data was [mm] in the file covering
  1993 to Apr 1997, and [m] in apl/airp_ib.cmc
  Append and rescale using
    apltsm.out -DC -eIB$ -i airp_ib.cmc -o test/ib.tsf
  with ./apl.tse containing
  MUL 0.001 From 1993 01 01 0 0 0 0 To  1997 04 30 0 0 0 0
  OPEN 22 B ~/Oload/smhi/apl/airp_ib.cmc
  22, 'TOP',-99999.0,'(2x,a12,1x,i2,t18,e12.4)',1,0,'A.',1.d0

  and equivalently for X and Y (format code t18 -> t30, t42)


> use NOAA/NCAR spectral data

   The fields we need are

   Copy the data set for year YYYY into /geo/hgs/Oload/smhi/gapYYYY
       aploading_request NRAO20 ALGOPARK
       aploading -noaa -s -v NRAO20 -d vlbi-apl-noaa/ \
          -O \'-lc vlbi-apl-noaa/local.pressure.YYYY -nocmc \
          -v\' \
          YYYY may jun jul aug sep oct

   This routine calls  /geo/hgs/Oload/noaa/noaa-aps2bin.out

   noaa-aps2bin.out takes parameters:
    (1) input data set name
    (2) YYYY - year
    (3) month (decimal only)
    (4) (optional)  day range
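The parameter order is easy to get wrong; a hypothetical convenience wrapper (not an existing script) that only echoes the command it would run:

```shell
# Document the noaa-aps2bin.out argument order; echo makes this a dry run.
noaa_unpack() {
    dataset=$1; year=$2; month=$3; days=$4    # days is optional
    echo noaa-aps2bin.out "$dataset" "$year" "$month" $days
}
out=$(noaa_unpack input.dat 1998 3)
echo "$out"
```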


> analyze annual waves etc.

   There is a version of apmeanm.f called ~hgs/Oload/p/mtt/apmeanhm.f
   that computes harmonic responses to S2 and Sa tides. It can easily
   be configured to analyze different or additional harmonics.
   Variable names for fields are zz_S2, arguments z_S2, numbers n_S2 etc.

   The program name ~/bin/apmeanhm.out must be defined in ./apmean
   Run a whole-year apmean job with

      nice apmean -smhi -FB -o apmz -O \'-Z zh-file.dat \' -U \'-m \' 1994

   i.e. passing the -Z option to apmeanhm.out and, since it's an old
   data set, the -m (mag.tape) option to  aploading -p

   The zh-file.dat is iterated. It contains cmplx*16 data. To convert and
   normalize for unit amplitude use

      cnvczm.out zh-file.dat zhs-file.dat

   The loading effects can be computed by gotsalm.out using ./gotsalm.ins
   (check that you use the correct tide symbol and file names!)

   Graphic display using
      zd zd-aphl.ins '>Sa>'a


Source codes

olfg-v2-2-2.f     ~/Oload/afor/p/mpp    binary: ~/bin/olfg -> ~/bin/olfg-v2-2-2   subroutines: olfg-v2-2-2-lib.list

smhi-apf2bin.f    ~/Oload/afor/p/m      binary: ~/bin/smhi-apf2bin.out            subroutines: smhi-apf2bin-lib.list

JDC.f             ~/sas/p/mt            binary: ~/bin/JDC                         subroutines: JDC-lib.list

calrect.f         ~/util/afor/p/m       binary: ~/bin/calrect                     subroutines: calrect-lib.list

Subroutine locations:

~/Oload/afor/p  ~/util/afor/p  ~/math/afor/p  ~/sas/p

Libraries and include-files:

/usr/local/lib/pgplot/lib*.a   * = texx  notexx  grex  nogrex  pgplot
                               grex.fh  keys.fh

~/pgplot522/pgplot/         libpgplot.a   (makefile!)
~/pgplot522/pgplot/4mw/     libgrex.a     addgrex.f bmwg.f keydefs.f pause.f setmode.f timer.f inkey.f grex.f curctr.f
                            libnogrex.a   nogrex.f pause.f setmode.f keydefs.f
~/util/afor/c               libtexx.a     scrprmpt.f textmode.f prpause.f
~/util/afor/c               libcurse.a    t_iniscr.c getcch.c
~/util/afor/p               libnotexx.a   notexx.f

(Bold names: the -h option gives help. Gray names: no help available.)

aploading     ~/Apload
run_olfg      ~/Apload
monthnum      ~/bin
month3l       ~/bin                returns 3-letter month names, given the month number
lost          ~/bin
clost         ~/bin
fromto        ~/bin
the_days_of   ~/bin
ecq           ~/bin                a test function for csh
beak          ~/bin                returns first character of a string
bill          ~/bin                bill -length= n returns first n characters of a string
fcmd         Main program's source dir
fcmb          ~/bin
fcs           ~/bin

Ancillary data
~/Apload/topo.flg -> topo30.flg