Time Scales in Satellite Geodesy

 

In general there are three different time systems that are used in space geodesy:

- Dynamical Time
- Atomic Time
- Solar and Sidereal Time

See Figure 1 for a diagrammatic representation of the relationship between the time systems.

Dynamical Time

 

Dynamical time is required to describe the motion of bodies in a particular reference frame and according to a particular gravitational theory. Today, General Relativity and an inertial (non-accelerating) reference frame are fundamental concepts. The most nearly inertial reference frame to which we have access through gravitational theory has its origin located at the centre-of-mass of the solar system (the barycentre). Dynamical time measured in this system is called Barycentric Dynamical Time (TDB -- the abbreviation for this and most other time scales reflects the French order of the words). A clock fixed on the earth will exhibit periodic variations as large as 1.6 milliseconds with respect to TDB due to the motion of the earth in the sun's gravitational field. However, in describing the orbital motion of near-earth satellites we need not use TDB, nor account for these relativistic variations, since both the satellite and the earth itself are subject to essentially the same perturbations.

For satellite orbit computations it is common to use Terrestrial Dynamical Time (TDT), which represents a uniform time scale for motion within the earth's gravity field and which has the same rate as that of an atomic clock on the earth, and is in fact defined by that rate (see below). In the terminology of General Relativity, TDB corresponds to Coordinate Time, and TDT to Proper Time. The predecessor of TDB was known as Ephemeris Time (ET).

Figure 1 shows the relationship between this and other time scales.

Atomic Time

 

The fundamental time scale for all the earth's time-keeping is International Atomic Time (TAI). It results from analyses by the Bureau International des Poids et Mesures (BIPM) in Sèvres, France, of data from atomic frequency standards (atomic "clocks") in many countries. (Prior to 1 January, 1988, this function was carried out by the Bureau International de l'Heure.) TAI is a continuous time scale and serves as the practical definition of TDT, being related to it by:

TDT = TAI + 32.184 seconds

The fundamental unit of TAI (and therefore TDT) is the SI second, defined as "the duration of 9192631770 periods of the radiation corresponding to the transition between two hyperfine levels of the ground state of the cesium 133 atom". The SI day is defined as 86400 seconds and the Julian Century as 36525 days.
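The constant offset between TAI and TDT can be applied directly. A minimal sketch in Python (the function name is illustrative, not a standard API):

```python
from datetime import datetime, timedelta

# TDT = TAI + 32.184 seconds (constant offset, per the relation above)
TDT_MINUS_TAI = timedelta(seconds=32.184)

def tai_to_tdt(tai_epoch: datetime) -> datetime:
    """Convert an epoch expressed on the TAI scale to TDT."""
    return tai_epoch + TDT_MINUS_TAI

# Example: a TAI epoch of 1995-01-01 00:00:00 reads
# 1995-01-01 00:00:32.184 on the TDT scale.
print(tai_to_tdt(datetime(1995, 1, 1)))
```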

Because TAI is a continuous time scale, it has one fundamental problem in practical use: the earth's rotation with respect to the sun is slowing down by a variable amount which averages, at present, about 1 second per year. Thus TAI would eventually become inconveniently out of synchronisation with the solar day. This problem has been overcome by introducing Coordinated Universal Time (UTC), which runs at the same rate as TAI, but is incremented by 1 second jumps (so-called "leap seconds") when necessary, normally at the end of June or December of each year. During the period mid-1994 to the end of 1995, one needed to add 29 seconds to UTC clock readings to obtain time expressed in the TAI scale.

The time signals broadcast by the GPS satellites are synchronised with atomic clocks at the GPS Master Control Station, in Colorado Springs, Colorado. These clocks define GPS Time (GPST), and are in turn periodically compared with UTC, as realised by the U.S. Naval Observatory in Washington D.C. GPST was set to UTC at 0hr on 6 January, 1980, and is not incremented by leap seconds. As a result there will be integer-second differences between the two time scales. For example, in December 1994 clocks running on GPST were offset from UTC by 10 seconds. There is therefore a constant offset of 19 seconds between the GPST and TAI time scales:

GPST + 19 seconds = TAI
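Given the cumulative leap-second count (TAI − UTC) in effect at an epoch, the GPST−UTC offset follows from the constant relation above. A small sketch (function name illustrative):

```python
def gpst_minus_utc(tai_minus_utc: int) -> int:
    """GPST - UTC in whole seconds, using GPST + 19 s = TAI.

    tai_minus_utc: cumulative leap-second count (TAI - UTC)
    in effect at the epoch of interest.
    """
    return tai_minus_utc - 19

# December 1994: TAI - UTC was 29 s, so clocks running on GPST
# were offset from UTC by 10 s, as quoted in the text.
print(gpst_minus_utc(29))  # 10
```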


Figure 1 shows the relationship between this and other time scales.

Solar and Sidereal Time

 

A measure of earth rotation is the angle between a particular reference meridian of longitude (preferably the Greenwich meridian) and the meridian of a celestial body. The most common form of solar time is Universal Time (UT) (not to be confused with UTC, which is an atomic time scale). UT is defined by the Greenwich hour angle (augmented by 12 hours) of a fictitious sun uniformly orbiting in the equatorial plane. However, the scale is not uniform because of oscillations of the earth's rotational axis. UT corrected for polar motion is denoted by UT1, and is otherwise known as Greenwich Mean Time (GMT). The precise definition of UT1 is complicated because of the motion both of the celestial equator and the earth's orbital plane with respect to inertial space, and the irregularity of the earth's polar motion. UT1 is derived from the analysis of observations carried out by the IERS, and can be reconstructed from the published corrections ΔUT1 (= UT1 − UTC) to UTC:

UT1 = UTC + ΔUT1
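The published correction ΔUT1 = UT1 − UTC (kept smaller than 0.9 s in magnitude by the leap-second mechanism) is simply added to a UTC reading. A minimal sketch (function name illustrative):

```python
def utc_to_ut1(utc_seconds: float, dut1: float) -> float:
    """Apply the published correction dut1 = UT1 - UTC (seconds).

    utc_seconds: an epoch expressed as seconds on the UTC scale.
    dut1: the ΔUT1 value published by the IERS for that epoch.
    """
    return utc_seconds + dut1

# Example with an assumed correction of +0.3 s:
print(utc_to_ut1(100.0, 0.3))  # 100.3
```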

A measure of sidereal time is Greenwich Apparent Sidereal Time (GAST), defined by the Greenwich hour angle of the intersection of the earth's equator and the plane of its orbit on the Celestial Sphere (the vernal equinox). Taking the mean equinox as the reference leads to Greenwich Mean Sidereal Time (GMST). The conversion between mean solar time corrected for polar motion (UT1) and GAST is through the following relation:

θg = 1.0027379093 · UT1 + θo + Δψ · cos ε

where Δψ is the nutation in longitude, ε is the obliquity of the ecliptic and θo represents the sidereal time at Greenwich midnight (0hr UT). The omission of the last term in the above equation permits the GMST to be determined. θo is represented by a time series:

θo = 24110.54841s + 8640184.812866s · To + 0.093104s · To² − 6.2×10⁻⁶s · To³

 

where To represents the time span expressed in Julian centuries (of 36525 days of 86400 SI seconds) between the reference epoch J2000.0 and the day of interest (at 0hr UT).

Figure 1 shows the relationship between this and other time scales.

Relationship Between Time Scales


The Figure below illustrates the relationship between the various time scales discussed above. The vertical axis indicates the relative offsets of the origins of the time scales, and the slope of each line indicates its drift. Note that, with the exception of UT1 (and hence GAST), all time scales nominally have zero drift with respect to TAI.


Figure 1. Time scale relationships.


© Chris Rizos, SNAP-UNSW, 1999