On this page, I describe the steps involved in extracting long waves from tide gauge records.
Preliminary data processing involves four steps: detiding, degapping, despiking, and denoising.
Removing the tide from the signal involves forecasting the tide and subtracting it from the record. The tide consists of a number of constituents (up to 600), each related to a particular astronomical phenomenon. The phenomenon governs the constituent's period, but each constituent also has an amplitude and a phase, corresponding to its strength and timing. We determine the amplitude and phase of each tidal constituent by analysing a long period of historical tide gauge data. Once we have the constituents, we can forecast the tide to within a few centimetres at any time in the future or the past.
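As an illustration, the forecast itself is just harmonic synthesis: a sum of cosines, one per constituent. In the sketch below the periods are the real M2 and S2 semidiurnal periods, but the amplitudes and phases are placeholders; real values would come from harmonic analysis of the historical record.

```python
import numpy as np

def predict_tide(t_hours, constituents):
    """Sum harmonic constituents to forecast the tide.

    constituents: list of (amplitude_m, period_hours, phase_rad).
    """
    tide = np.zeros_like(t_hours, dtype=float)
    for amp, period, phase in constituents:
        tide += amp * np.cos(2.0 * np.pi * t_hours / period - phase)
    return tide

# Placeholder amplitudes/phases for two major constituents (M2, S2);
# only the periods (12.42 h and 12.00 h) are real.
consts = [(1.0, 12.42, 0.3), (0.3, 12.00, 1.1)]
t = np.arange(0.0, 48.0, 0.25)      # 15-minute samples over two days
tide = predict_tide(t, consts)
```

Subtracting `tide` from the gauge record then gives the detided residual used in the later steps.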
Tide gauge records occasionally have gaps, from a few minutes up to a few days, caused by instrument malfunction or downtime during maintenance. Gaps are undesirable but inevitable, so we minimise them as much as possible. Many data processing routines, however, require a continuous record, so the gaps must be dealt with, at least for the calculations. Often we remove the gaps, do the calculations, then reintroduce the gaps so that we do not misinterpret the results. The procedure for degapping sea-level records is to remove the tide from the record (see detiding) and linearly interpolate across the gap. If necessary, the tide can be added back in later.
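A minimal sketch of the degapping step, assuming gaps are marked as NaN in the detided residual:

```python
import numpy as np

def degap(residual):
    """Linearly interpolate across NaN gaps in a detided residual."""
    x = np.asarray(residual, dtype=float)
    good = ~np.isnan(x)
    idx = np.arange(x.size)
    filled = x.copy()
    # Fill each missing sample from the straight line joining its
    # nearest good neighbours on either side.
    filled[~good] = np.interp(idx[~good], idx[good], x[good])
    return filled

r = np.array([0.1, 0.2, np.nan, np.nan, 0.5, 0.4])
print(degap(r))   # gap filled with 0.3 and 0.4
```

Because interpolation happens on the detided residual, the smooth tidal signal does not distort the fill; the forecast tide can be added back afterwards if needed.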
Spikes occur as a result of instrument error, either in the sensor or in the telemetry system that transmits the data. In sea-level records, a spike is usually a single point: the record will be going along nicely when suddenly it registers a zero or a very large number. A single spike can completely mask the waves of interest. Detecting spikes by eye is easy; in fact, the human eye is the best spike-detecting instrument there is. Detecting spikes automatically and robustly requires sophisticated mathematical methods such as those described here (187 KB). For sea-level records, the algorithm we have found works best uses wavelet analysis to decompose the detided residual, then looks for points that exceed a threshold within a particular time window. Spikes are replaced by gaps, which are later removed by degapping. The threshold and the length of the window differ from site to site and must be estimated from historical data.
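The wavelet-based detector in the linked report is beyond a short example, but a simplified stand-in illustrates the idea of flagging points that exceed a threshold within a time window. Here a rolling median and median absolute deviation play the role of the wavelet decomposition, and `window` and `n_mad` are the site-specific parameters that would be tuned from historical data:

```python
import numpy as np

def despike(residual, window=25, n_mad=5.0):
    """Flag points deviating from a local median by more than n_mad
    robust standard deviations; replace them with NaN (gaps).
    Simplified stand-in for the wavelet-based detector in the text."""
    x = np.asarray(residual, dtype=float)
    out = x.copy()
    half = window // 2
    for i in range(x.size):
        seg = x[max(0, i - half): i + half + 1]
        med = np.median(seg)
        mad = np.median(np.abs(seg - med)) or 1e-9   # guard against zero
        if abs(x[i] - med) > n_mad * 1.4826 * mad:   # 1.4826: MAD -> sigma
            out[i] = np.nan   # spike becomes a gap for later degapping
    return out
```

The median-based statistics are robust to the spike itself, so a single bad point does not inflate its own detection threshold.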
Noise occurs in a tide gauge record as a result of instrument errors and
improper sampling of swell waves (aliasing). Usually, noise affects only the long waves;
however, the methods described here can be applied to any process.
A detailed report on denoising sea-level records is available from here (239 KB).
In brief, denoising involves three stages: analysing the sea-level record to characterise the background noise during a quiescent period when there are no significant waves; using the information from that period to estimate how much of the signal is noise; and removing that proportion of the signal at times when there are significant waves. The technique uses wavelet analysis to decompose the sea-level record and separate noise from real data. Each tide gauge has different noise characteristics, so each site must be calibrated individually, and the noise-reduction strategy will differ from site to site. For some sites, noise reduction is very important (Frenchman Island near Marsden Point is an example); for others (e.g., Timaru), denoising has essentially no effect on the results. There is no way of telling in advance which will be the case: you simply have to analyse the data.
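As a much-simplified sketch of the idea, here is one-level Haar wavelet shrinkage: the detail coefficients are soft-thresholded at a level that stands in for the noise floor measured during the quiescent period. The real method uses a deeper decomposition and per-site calibration.

```python
import numpy as np

def haar_denoise(x, threshold):
    """One-level Haar transform, soft-threshold the detail
    coefficients, and invert. `threshold` plays the role of the
    site-specific noise level calibrated from a quiet period.
    Record length must be even."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients (noise lives here)
    d = np.sign(d) * np.maximum(np.abs(d) - threshold, 0.0)   # soft threshold
    out = np.empty_like(x)                 # inverse Haar transform
    out[0::2] = (a + d) / np.sqrt(2)
    out[1::2] = (a - d) / np.sqrt(2)
    return out
```

Soft thresholding shrinks every small detail coefficient towards zero, which is what "removing that proportion of the signal" amounts to in practice.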
Once the detided, degapped, despiked, denoised residual has been evaluated,
we can derive the long-wave record by high-pass filtering.
This removes any left-over tide after detiding, as well as long-period effects
like storm surge and long-term variations in sea level.
For the records shown here, we used orthogonal wavelet analysis. Its advantage over other high-pass filtering methods is that it is localised in time, so it accommodates non-stationary data (i.e., data with changing variance).
Orthogonal wavelet analysis involves taking a mother wavelet and fitting it to the data. The wavelet is then doubled in timescale (dilated) and fitted again. This process is repeated six times to yield six components (called wavelet details) with timescales between 3 and 96 minutes. Adding these together produces the long-wave record (Plot A in the figures).
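A minimal sketch of this high-pass step using the Haar wavelet (any orthogonal mother wavelet would do): subtracting the level-6 approximation, which for Haar is simply the block mean over 2^6 = 64 samples, leaves the sum of the six details. The record length is assumed to be a multiple of 64 samples; with 1.5-minute sampling the six details would span roughly 3 to 96 minutes.

```python
import numpy as np

def longwave(residual, levels=6):
    """Sum the first `levels` Haar wavelet details of the residual by
    subtracting the level-`levels` smooth (block mean over 2**levels
    samples). Record length must be a multiple of 2**levels."""
    x = np.asarray(residual, dtype=float)
    a = x.copy()
    for _ in range(levels):
        a = (a[0::2] + a[1::2]) / 2.0        # halve: pairwise means
    smooth = np.repeat(a, 2 ** levels)       # level-`levels` approximation
    return x - smooth                        # details = data - smooth
```

Because the smooth carries everything slower than the coarsest detail, the result is free of residual tide, storm surge, and long-term sea-level variation.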
The long-wave record can be processed as if it were a wind-wave record, except that it is continuous rather than recorded in bursts. Individual waves were reckoned as occurring between successive downward zero-crossings. Figures 1B and 1C show the raw wave heights and periods. For significant wave height and period (Figures 2B and 2C), we used a 6-hour moving window with a 4-hour overlap, producing parameters at 2-hourly intervals.
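A sketch of the zero-downcrossing bookkeeping, assuming the long-wave record is a regularly sampled NumPy array:

```python
import numpy as np

def zero_downcrossing_heights(eta):
    """Split a long-wave record at downward zero-crossings and return
    the crest-to-trough height of each individual wave."""
    eta = np.asarray(eta, dtype=float)
    # A downcrossing: non-negative sample followed by a negative one.
    down = np.where((eta[:-1] >= 0) & (eta[1:] < 0))[0] + 1
    heights = [eta[a:b].max() - eta[a:b].min()
               for a, b in zip(down[:-1], down[1:])]
    return np.array(heights)

def significant_height(heights):
    """Significant wave height: mean of the highest third of the waves."""
    h = np.sort(heights)[::-1]
    n = max(1, len(h) // 3)
    return h[:n].mean()
```

Applying `significant_height` to the waves falling inside each 6-hour window, stepped every 2 hours, reproduces the moving-window scheme described above.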
The long-wave record was first decomposed using the continuous wavelet method described by Torrence and Compo (1998), with the Mexican hat mother wavelet and four intervals within each dyadic scale. We then reconstituted the signal to give a series of continuous wave records at various dilation scales. At this point, we had a large array of sea levels as a function of time and dilation scale; the original long-wave record could be recovered from this array by integrating over the dilation scales at each time step. To display the distributed data, we smoothed the wave records over time by calculating the significant wave height (the average of the highest third of the waves) in a 6-hour moving window with a 4-hour overlap, giving values at 2-hourly intervals for each dilation scale. To obtain a representative period for each scale, we calculated the average period of the significant waves over the timespan. The results are presented in Figures 2D as contour maps of significant wave height as a function of time and period (equivalent to dilation scale).
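The full Torrence and Compo machinery is documented in the paper; the following sketch shows only its core, a Mexican hat CWT evaluated at four sub-scales per octave, implemented by direct convolution in NumPy. Normalisation conventions and edge effects (the cone of influence) are glossed over.

```python
import numpy as np

def mexican_hat(t, s):
    """Mexican hat (Ricker) wavelet at scale s."""
    u = t / s
    return (1.0 - u**2) * np.exp(-u**2 / 2.0)

def cwt_mexhat(x, scales, dt=1.0):
    """Continuous wavelet transform of x at the given scales.
    Returns an array of shape (len(scales), len(x))."""
    x = np.asarray(x, dtype=float)
    out = np.empty((len(scales), x.size))
    for i, s in enumerate(scales):
        n = int(10 * s / dt) | 1                  # odd support, about +/-5 scales
        t = (np.arange(n) - n // 2) * dt
        w = mexican_hat(t, s)
        out[i] = np.convolve(x, w, mode="same") * dt / np.sqrt(s)
    return out

# Four sub-scales per dyadic octave, here two octaves from 4 to 16 samples.
scales = 4.0 * 2.0 ** (np.arange(9) / 4.0)
```

Summing (integrating) the rows of the returned array over scale, with the appropriate reconstruction factors from the paper, recovers the original record, which is the property the text relies on.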
Torrence, C.; Compo, G.P. 1998: A practical guide to wavelet analysis. Bulletin of the American Meteorological Society, 79(1): 61-78.