Python version of tad programme for IDSLs, by Alessandro Annunziato, Joint Research Centre (c) 2022
The programmes in the prog folder represent the suite of programmes used on the Inexpensive Device for Sea Level measurements (IDSL) devices. Their objective is to process the data in real time and to provide an alert when an anomalous wave, originated by a tsunami or any other cause, is detected.
The programme to be run on the IDSL can be launched with the command:
python3 tad.py [ -c ]
However, for testing purposes, it is possible to use the scrape.py programme to read the sea level from available sea level repositories and have the quantities calculated on the fly.
To test the calculation procedure you can use the command below. Suppose that you have to analyse a tide gauge from the GLOSS Sea Level Facility: use as the -code parameter the station code from this list: https://www.ioc-sealevelmonitoring.org/list.php
If you want to analyse Ierapetra, in Greece, the code is iera. Any signal available in the list above can be used.
python3 scrape.py -code iera -n300 200 -n30 50 -mult 4 -add 0.1 -th 0.08 -mode GLOSS -sensors rad -out ./temp/iera
where:

| Parameter | Meaning |
| --- | --- |
| -code | the code of the device, as from the list indicated above |
| -n300 | the long-term number of intervals; the length depends on the interval between two points in the dataset |
| -n30 | the short-term number of intervals; the length depends on the interval between two points in the dataset |
| -mult | RMS multiplication factor |
| -add | the quantity added to the RMS |
| -th | threshold to be exceeded |
| -mode | type of sea level network (GLOSS, NOAA, BIG_INA, ...) |
| -out | optional; indicates where to write the output |
| -sensors | optional except for GLOSS. In the case of GLOSS you need to specify which sensors are to be read, comma separated, e.g. rad,enc |
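To illustrate how these parameters could combine in a single detection step, here is a minimal Python sketch. The window averaging and the alert condition shown are assumptions made for illustration (the real tad.py algorithm may combine -mult, -add and -th differently), and the function name detect is hypothetical:

```python
from statistics import mean, pstdev

def detect(levels, n300, n30, mult, add, th):
    """Sketch of one detection step on a list of sea-level samples.

    Returns (fore30, fore300, rms, alert). The alert condition is an
    assumption: the short-term forecast is compared with the long-term
    forecast plus a band built from the RMS and the threshold.
    """
    fore300 = mean(levels[-n300:])   # long-term forecast (background level)
    fore30 = mean(levels[-n30:])     # short-term forecast
    rms = pstdev(levels[-n300:])     # variability over the long window
    anomaly = abs(fore30 - fore300)
    alert = anomaly > (mult * rms + add) and anomaly > th
    return fore30, fore300, rms, alert

# A quiet signal produces no alert; a sudden level jump can trigger one.
print(detect([0.2] * 8, 8, 2, 4, 0.1, 0.08))
print(detect([0.2] * 6 + [1.0, 1.0], 8, 2, 1, 0.05, 0.08))
```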
The output will be in the form of a list of the data analysed by the detection algorithm. If the command is repeated, only the data that are new since the last run will be considered. The response is the following; the quantities in the middle represent the sea level, the forecast short term and the forecast long term:
iera,08/06/2022,00:10:00,0.0,0.0,0.237,0.241,0.238,0.00250,0.002,0,0.00,,,
The quantities displayed are generated according to this definition:

logData='$IDdevice,$DATE,$TIME,$TEMP,$PRESS,$LEV,$FORE30,$FORE300,$RMS,$ALERT_LEVEL,$ALERT_SIGNAL,$V1,$V2,$V3'

TEMP, PRESS and V1, V2 and V3 are not relevant and always keep constant values; the remaining quantities are the ones considered.
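A record in this format can be split into named fields with a few lines of Python. This is a minimal sketch: the field names come from the logData definition above, while the helper parse_record itself is hypothetical:

```python
# Field names taken from the logData definition quoted above.
FIELDS = ["IDdevice", "DATE", "TIME", "TEMP", "PRESS", "LEV",
          "FORE30", "FORE300", "RMS", "ALERT_LEVEL", "ALERT_SIGNAL",
          "V1", "V2", "V3"]

def parse_record(line):
    """Map one comma-separated output record to a dict of named fields."""
    values = line.strip().split(",")
    return dict(zip(FIELDS, values))

rec = parse_record(
    "iera,08/06/2022,00:10:00,0.0,0.0,0.237,0.241,0.238,"
    "0.00250,0.002,0,0.00,,,"
)
print(rec["LEV"], rec["FORE30"], rec["FORE300"])
```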
The plot of these quantities is the following:
If you would like to analyse one of the NOAA sea levels, according to the list of sea levels contained in this page: https://tidesandcurrents.noaa.gov/sltrends/sltrends_us.html
then, clicking on one specific area, e.g. Oregon: https://tidesandcurrents.noaa.gov/sltrends/sltrends_states.html?gid=1234
the number appearing as station ID is what is needed. For Charleston it is 9435380.
So to analyse this station you should call:
python3 scrape.py -code 9435380 -n300 100 -n30 20 -mult 4 -add 0.1 -th 0.08 -mode NOAA -out /tmp/Charleston
You can note that the long-term forecast does not follow the level signal well. The reason is that the chosen number of points, 100, is too large: this signal has only one point every 6 minutes, so 100 points represent 10 h, which is too much. The maximum should be on the order of 2 h, so the n300 value should be around 20. The short-term forecast n30 should be 2 or 3. This procedure works well with a rather dense series of points, not one as coarse as in this case.
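The conversion from a desired window duration to a point count can be sketched as follows. The helper window_points is hypothetical; the 2-hour window and the 6-minute sampling interval are the figures from the paragraph above:

```python
def window_points(window_minutes, sample_interval_minutes):
    """Number of samples needed to cover a window of the given duration."""
    return max(1, round(window_minutes / sample_interval_minutes))

# NOAA series: one point every 6 minutes.
n300 = window_points(120, 6)  # ~2 h long-term window
n30 = window_points(15, 6)    # short window of a few points
print(n300, n30)
```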
The SeaLevelMachine is the software contained in the test application https://slm.azurewebsites.com, which allows performing on-the-fly calculations for a huge number of sea level stations and verifying the effect of changing the parameters. The application is very primitive, but it exists only to test the routines and to show how they can be implemented in other programmes.