To reduce (unmapped) stacov files for air pressure loading effects
2000-JAN-28.. > aploading
aploading job
command lines must include one of the options
-smhi
-ecmwf
-noaa
These options are used to unpack the data in the appropriate way.
SMHI: gzipped files GEyymm.asc.gz, one ascii file per month
ECMWF: tar files GEyymm.tar, each typically containing 120 files GEyymmddqq.ASC
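The GEyymm part of both names follows directly from year and month; a small sh sketch of the naming (the month-abbreviation mapping is my assumption, only the GEyymm convention is from the note above):

```shell
# Derive the SMHI/ECMWF archive file names for a given year and month.
# GEyymm naming as described above; the mapping itself is a sketch.
year=2000; mon=jun
yy=$(printf '%02d' $((year % 100)))
case $mon in
  jan) mm=01;; feb) mm=02;; mar) mm=03;; apr) mm=04;;
  may) mm=05;; jun) mm=06;; jul) mm=07;; aug) mm=08;;
  sep) mm=09;; oct) mm=10;; nov) mm=11;; dec) mm=12;;
esac
smhi_file=GE${yy}${mm}.asc.gz     # SMHI: one gzipped ascii file per month
ecmwf_file=GE${yy}${mm}.tar       # ECMWF: tar of ~120 GEyymmddqq.ASC files
echo "$smhi_file $ecmwf_file"
```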
The BIFROST standard job with ECMWF-data is now:
nice aploading -ecmwf -s -v OSKA -d apls/ -O \'$* -s Bifrost.stacov \' 2000 jun
(implying: nice run_olfg -s Bifrost.stacov -ts apls/ 2000 jun 01 etc.)
after
fgrep 'STA' $REFF.stacov > Bifrost.stacov
2000-JAN-01.. > Get data from our new archives
there are two archive disks with air pressure data:
/GNSS_archive/misc/hgs1
(/SMHI)
(only needed for GE95??.asc.gz )
/GNSS_archive/misc/hgs2
(/SMHI)
(/ECMWF)
This is a typical job to run a MARS retrieval for a succession of months (here ten, 199803-199812):
ssh sub@ecaccess.ecmwf.int
tcsh                       (so that command history works)
cd $SCRATCH
foreach dt ( 199803 199804 199805 199806 199807 199808 199809 199810 199811 199812 )
   source ~/p/m-aixrq.env
end
and when the jobs of the loop are ready:
foreach dt ( 199803 199804 199805 199806 199807 199808 199809 199810 199811 199812 )
   source ~/p/m-geplrq.env
end
gzip GEyymm*.ASC
tar cfv GEyymm.tar GEyymm*.ASC.gz
scp GEyymm.tar hgs@holt.oso.chalmers.se:/home/hgs/Apload/gapyear
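The gzip/tar packing can be wrapped in the same kind of per-month loop as the retrieval; a minimal sh sketch (the GE9803 test files below are made up to keep it self-contained, they are not real fields):

```shell
# Pack each month's ASC files into one tar, as in the steps above.
mkdir -p /tmp/gapdemo && cd /tmp/gapdemo
touch GE98030100.ASC GE98030106.ASC        # dummy stand-ins for real fields
for dt in 199803; do
  yymm=${dt#19}                            # 199803 -> 9803
  gzip -f GE${yymm}*.ASC
  tar cf GE${yymm}.tar GE${yymm}*.ASC.gz
done
ls GE9803.tar
```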
Transfer by ftp: when the whole job is complete, start ftp on the local host:
cd $MYAR/ECMWF             # or wherever you want to store the results
sftp sub@ecaccess.ecmwf.int
usrid #..#
cd ECSCRATCH
bin
prompt
mget GEyy??.tar
(mget, because mput is not working at ecgate1/ftp -- is that remark still valid?)
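The interactive session above can also be scripted with sftp's -b batch mode; a sketch only (host and paths are the ones from this note, and whether batch mode is permitted on ecaccess is an assumption):

```shell
# Write an sftp batch file reproducing the interactive commands above.
cat > /tmp/get-ge.batch <<'EOF'
cd ECSCRATCH
mget GEyy??.tar
EOF
# Would then be run non-interactively as:
#   sftp -b /tmp/get-ge.batch sub@ecaccess.ecmwf.int
head -1 /tmp/get-ge.batch
```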
My user name at ECMWF: sub
My home directory: /home/ms/se/sub
Older jobs can be seen in the shell scripts run_aploading_4*, like
nice aploading -ecmwf -s -v WESTFORD -d new-vlbi-apl-s/ -O \
     \'$* -lc core-vlbi-apl/local.pressure.2001 -nocmc \
     -v /home/hgs/Calc/VLBI99.xyz\' \
     2001 aug sep oct nov dec
Look for a recently updated run_aploading_4* file!
Observe that the local pressure is collected using the -lc option in the
option string that is conveyed to run_olfg.
Add option -k after -ecmwf to preserve the 6h ascii data sets.
The space requirement is large though!
Examples for newer jobs (year 2000 and later) that worked are supposed to be
collected in aploading.cmd, like
aploading -ecmwf -v KUUS -l /home/hgs/Oload/smhi/apls/aploading.log \
     -d apls/ -O \'-s itrf_97may25.stacov\' 2000 oct nov dec
(VLBI: NO! rather check run_aploading_4*)
The loading effects for the ECMWF fields are supposed to be collected in
subdirectories
     new-vlbi-apl-s/
     apls/
etc., where the trailing "s" reflects the fact that the data was once
decomposed into spherical harmonics. It is also surface pressure rather than
sea-level pressure, for which we code an "r" (reduced to sea level).
We have basically two kinds of loading jobs, BIFROST-GPS and VLBI.
The associated site files are
     ./itrf_97may25.stacov
     /home/hgs/Calc/VLBI99.xyz
respectively.
Wouter van der Wal did a couple of SLR jobs. We have to relocate the results.
The time series output by aploading are a bit special. Example:
#1 1993sep01:00 0.03312 -0.00481 0.01018
#2 1993sep01:06 0.03296 -0.00477 0.01022
#3 1993sep01:12 0.03225 -0.00483 0.01025
#4 1993sep01:18 0.03248 -0.00473 0.01025
#1 1993sep02:00 0.03253 -0.00472 0.01006
The idea is that they are read by tslist and stored in binary form (tsf2ts);
duplicates are deleted in that step.
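The duplicate deletion that tsf2ts performs can be mimicked for a quick check with a one-line awk filter keyed on the time tag (field 2 after the #n label); a sketch on lines in the format shown above:

```shell
# Keep only the first occurrence of each time tag (field 2); this is
# a stand-in for the duplicate removal done by tslist/tsf2ts.
n=$(printf '%s\n' \
      '#4 1993sep01:18 0.03248 -0.00473 0.01025' \
      '#1 1993sep02:00 0.03253 -0.00472 0.01006' \
      '#1 1993sep02:00 0.03253 -0.00472 0.01006' |
    awk '!seen[$2]++' | wc -l | tr -d ' ')
echo "$n"
```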
However, in order to make the process short and get regular, numerical time
columns, you may use nicetsf.pl
awk '/WETTZELL/{print $5}' ./core-vlbi-apl/local.pressure.2001 > ser_loc.txt
tslist .../WETTZELL_apl.ts -Ff10.4 -Bc2001,1,1 | fgrep -v '>' |\
awk '{print $2}' > ser_apl.txt
paste ser_loc.txt ser_apl.txt | fitxym.out -SXY -Ey0.001 -Ex0.1
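fitxym.out is a local program; the fit it performs on the pasted columns can be sketched with plain awk as an ordinary least-squares line y = a + b*x (the -Ey/-Ex error weights are ignored in this sketch, and the data below are made up):

```shell
# Unweighted least-squares fit of column 2 against column 1 of a paste.
printf '1\n2\n3\n4\n' > /tmp/ser_loc.txt
printf '2.1\n4.0\n6.1\n8.0\n' > /tmp/ser_apl.txt
fit=$(paste /tmp/ser_loc.txt /tmp/ser_apl.txt |
  awk '{n++; sx+=$1; sy+=$2; sxx+=$1*$1; sxy+=$1*$2}
       END{b=(n*sxy-sx*sy)/(n*sxx-sx*sx); a=(sy-b*sx)/n;
           printf "a=%.3f b=%.3f", a, b}')
echo "$fit"
```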
If the awk above returns double samples, the following might be useful:
fgrep 'WETTZELL' ./core-vlbi-apl/local.pressure.2001 |\
tslist - -g'(5x,a9,1x,i2,t31,f10.0)' -d'(i4,i3,i2)' -k1 -Ff10.4 |\
awk '\!/>/{print $2}' > ser_apl.txt
> compute the average difference of two time series
setenv SITE ONSA
tslist apl/${SITE}.ts -U1997,10,1 -Ediff.tse,DIFF] |\
awk '\!/>/{a+=$2; print $0}; END{print "mean:",a/NR}'
with diff.tse:
TSF EDIT DIFF
OPEN 22 ^ apls/${SITE}_apl.ts
22, 'BIN]',-99999.0,'L:RAD', 0, 0, 'N', -1.d0
END
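The awk tail of that pipeline (skip '>' header lines, accumulate column 2, print the mean) is easy to check on synthetic data. Note that the \! escaping is only needed under tcsh, and that dividing by NR as in the note also counts the '>' header lines; a separate counter avoids that:

```shell
# Same awk idea as above, sh quoting (no backslash before !), with a
# dedicated counter n so header lines do not dilute the mean.
mean=$(printf '%s\n' '> header' 'a 1.0' 'a 2.0' 'a 3.0' |
       awk '!/>/{a+=$2; n++} END{print a/n}')
echo "$mean"
```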
or using a shell script for a lot of sites:
ts-offsets apls apl >! apls/ts-offsets.ren
ts-offsets '-U1997,10,1' new-vlbi-apl-s new-vlbi-apl-r >! new-vlbi-apl-s/ts-offsets.ren
> first, you must create ts files from tsf files:
tsf2ts apls
> before you can use ts-join97.
> join two time series and compensate offset:
ts-join97 apl apls aplj ONSA
(uses tsfedit with control file apls/join.tse and the apls/ts-offset.ren file from above.)
> If you want to work with tsf files, e.g. to append the VLBI ftp data base:
uncompressdir $MYFTP/apload/core-vlbi-apl
tsf-sub-offset -a -d 2000jan new-vlbi-apl-s $MYFTP/apload/core-vlbi-apl
compressdir $MYFTP/apload/core-vlbi-apl
cp -Rf $MYFTP/apload/core-vlbi-apl $MYAR/apload/
aploading -ecmwf -v KUUS -l /home/hgs/Oload/smhi/apls/aploading.log \
     -d apls/ -O \'-s itrf_97may25.stacov\' 2000 oct nov dec
tsf2ts apls
ts-join97 apl apls aplj
24h_decimate -B aplj
sorry, doc still missing;
cf http://holt.oso.chalmers.se/hgs/hgs.man/apl_reduce_stacov.html
awk '{t=substr($0,16,2)+6; if (t>=24){t=t-24}; u=t/6+1;
printf "#%s%s%2.2i %s\n",u,substr($0,3,13),t,substr($0,18)}' \
core-vlbi-apl/ALGOPARK_apl.tsf > core-vlbi-apl/ALGOPARK_apl.tsg
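The time arithmetic in that awk script (shift the hour by 6, wrap at 24, derive the #u quarter-of-day index) can be seen in isolation, without the fixed-column substr layout of the real tsf lines:

```shell
# t = (hour + 6) mod 24 ; u = t/6 + 1, as in the awk script above.
table=$(awk 'BEGIN{for(h=0;h<24;h+=6){t=(h+6)%24;
        printf "%02d->t=%02d,u=%d\n", h, t, t/6+1}}')
echo "$table"
```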
---------------------------------------------------------------------------
> Proceed with CMC
olfg applies a physical factor (motion specifies CMC in metres).
From 1997may01 on, the file names written by olfg are
$subdir/geoc_ib.tsf
The file names ./airp_*ib.cmc are reserved for COM parameters in pressure
units. The conversion to CMC uses the factor (-G rhow a /g) * (100/g/rhow) =
-4.4635E-02 (m/mWL) * 9.8968E-03 (mWL/hPa) = -4.4174E-04 m/hPa
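The two factors indeed multiply to the quoted value; a quick awk check (numbers taken from the note):

```shell
# (-4.4635E-02 m/mWL) * (9.8968E-03 mWL/hPa) = -4.4174E-04 m/hPa
fac=$(awk 'BEGIN{printf "%.4e", -4.4635e-02 * 9.8968e-03}')
echo "$fac"
```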
The files *noib.cmc and *ocib.cmc are produced with apmean/apmeanm.f
(Obviously, one can extend the geocenter files and the mean field at the
same time.) apmean can be programmed to skip the accumean part; this avoids
adding any field more than once if all goes well.
apmean -smhi -O \'-m 1993,08,01\' -o apmean_ \
     -c topo.flg geoc/ocib.tsf geoc/noib.tsf \
     apmean_upto970430.dat \
     1997 may jun jul aug sep oct
That job had some special options:
-O passes options to apmeanm.out, here -m date.
Reason: a bug fix; the old file apmean_upto970430.dat had an illegal date
table. -m creates one from the given date for nm fields ahead. nm would
normally be specified after an -n option, but in this case nm was still
readable from the file.
----------------------------------------------------------------
> Repair/rescale/reorder the {no|oc}ib.cmc time-series:
apltsm.out -DC -eAPL]_Rwd -s 0.001 -i airp_ocib.cmc -o geoc/ocib.tsf
It involves the (default) TSF-edit command file ./apl.tse which, for the
purpose of rescaling from [hPa] to [m] units, contains
TSF EDIT APL
MUL -0.4418 From 1993 01 01 0 0 0 0 To 1997 04 30 0 0 0 0
END
~/Oload/smhi/m/apltsm.f might be the beginning of a versatile 3-D
time-series editor. apltsm -h gives more information.
---
For the ib.cmc time-series, the data was [mm] in the file covering 1993 to
Apr 1997, and [m] in apl/airp_ib.cmc
Append and rescale using
apltsm.out -DC -eIB$ -i airp_ib.cmc -o test/ib.tsf
with ./apl.tse containing
TSF EDIT IBZ
MUL 0.001 From 1993 01 01 0 0 0 0 To 1997 04 30 0 0 0 0
OPEN 22 B ~/Oload/smhi/apl/airp_ib.cmc
22, 'TOP',-99999.0,'(2x,a12,1x,i2,t18,e12.4)',1,0,'A.',1.d0
END
and equivalently for X and Y (format code t18 -> t30, t42)
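The Fortran tab codes t18, t30, t42 simply mean fixed 1-based start columns for the three e12.4 fields; the equivalent column cut in awk, on a sample line constructed here for illustration (not real data):

```shell
# Three 12-character fields starting at columns 18, 30, 42, as the
# format codes t18/t30/t42 in the tse file imply.
line='  ONSA          1 -0.1234E-02  0.5678E-03  0.9999E-04'
xyz=$(printf '%s\n' "$line" |
      awk '{print substr($0,18,12), substr($0,30,12), substr($0,42,12)}')
echo "$xyz"
```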
----------------------------------------------------------------
> use NOAA/NCAR spectral data
http://www.cdc.noaa.gov/cdc/data.nmc.reanalysis.html#spectral
ftp://ftp.cdc.noaa.gov/Datasets/nmc.reanalysis/spectral/
The fields we need are
Copy the data set for year YYYY into /geo/hgs/Oload/smhi/gapYYYY
Run
aploading_request temp.xyz NRAO20 ALGOPARK
aploading -noaa -s -v NRAO20 -d vlbi-apl-noaa/ \
     -O \'-lc vlbi-apl-noaa/local.pressure.YYYY -nocmc \
     -v temp.xyz\' \
     YYYY may jun jul aug sep oct
This routine calls /geo/hgs/Oload/noaa/noaa-aps2bin.out, which takes the
parameters:
(1) input data set name
(2) YYYY - year
(3) month (decimal only)
(4) (optional) day range
----------------------------------------------------------------
> analyze annual waves etc.
There is a version of apmeanm.f called ~hgs/Oload/p/mtt/apmeanhm.f that
computes harmonic responses to the S2 and Sa tides. It can easily be
configured to analyze different or additional harmonics.
Variable names for fields are zz_S2, arguments z_S2, numbers n_S2 etc.
The program name ~/bin/apmeanhm.out must be defined in ./apmean
Run a whole-year apmean job with
nice apmean -smhi -FB -o apmz -O \'-Z zh-file.dat \' -U \'-m \' 1994
i.e. passing the -Z option to apmeanhm.out and, since it's an old data set,
the -m (mag.tape) option to aploading -p
The zh-file.dat is iterated. It contains cmplx*16 data. To convert and
normalize for unit amplitude use
cnvczm.out zh-file.dat zhs-file.dat
The loading effects can be computed by gotsalm.out using ./gotsalm.ins
(check that you use the correct tide symbol and file names!)
Graphic display using
zd zd-aphl.ins '>Sa>'a
---------------------------------------------------------------------------
Programs
Source codes:
smhi-apf2bin.f   ~/Oload/afor/p/m   binary: ~/bin/smhi-apf2bin.out   subroutines: smhi-apf2bin-lib.list
JDC.f            ~/sas/p/mt         binary: ~/bin/JDC                subroutines: JDC-lib.list
calrect.f        ~/util/afor/p/m    binary: ~/bin/calrect            subroutines: calrect-lib.list
Subroutine locations:
Scripts
(bold names: the -h option gives help; gray names: no help available)