Coordinated Universal Time (UTC) replaced Greenwich Mean Time (GMT) as the world standard for time in 1972. UTC is based on atomic measurements, while GMT is based on Earth's rotation.
Greenwich Mean Time (GMT) is still the standard time at the Prime Meridian (zero longitude) and the civil time in the UK when Daylight Saving Time is not in use.
At the 1884 International Meridian Conference held in Washington, DC, USA, local mean solar time at the Royal Observatory, Greenwich in England was chosen to define the Universal Day. It started from 0 hours at mean midnight. This agreed with civil Greenwich Mean Time (GMT), used in Great Britain since 1847.
In contrast, astronomical GMT began at mean noon, 12 hours after mean midnight of the same date, a convention that lasted until 1 January 1925. Nautical GMT began at mean noon 12 hours before mean midnight of the same date, a practice the British Navy had abandoned by 1805 but which persisted much longer elsewhere because it was mentioned at the 1884 conference. By 1884, the Greenwich Meridian served as the Prime Meridian on two-thirds of all charts and maps.
In 1928, the term Universal Time (UT) was introduced by the International Astronomical Union (IAU) to refer to GMT with the day starting at midnight. Until the 1950s, broadcast time signals were based on UT, and hence on the rotation of the Earth.
In 1955 the caesium atomic clock was invented. This provided a form of timekeeping that was both more stable and more convenient than astronomical observations.
In 1956 the U.S. National Bureau of Standards and U.S. Naval Observatory started to develop atomic frequency time scales; by 1959 these time scales were used in generating the WWV time signals, named for the shortwave radio station that broadcasts them.
In 1960 the U.S. Naval Observatory, the Royal Greenwich Observatory, and the UK National Physical Laboratory coordinated their radio broadcasts so time steps and frequency changes were coordinated, and the resulting time scale was informally referred to as "Coordinated Universal Time".
The frequency of the signals was initially set to match the rate of UT, but then kept at the same frequency by the use of atomic clocks and deliberately allowed to drift away from UT. When the divergence grew significantly, the signal was phase shifted (stepped) by 20 ms to bring it back into agreement with UT. Twenty-nine such steps were used before 1960.
In 1958, data was published linking the newly established frequency of the caesium transition with the ephemeris second. The ephemeris second is the duration of time that, when used as the independent variable in the laws of motion governing the movement of the planets and moons in the solar system, causes those laws to accurately predict the observed positions of solar system bodies. Within the limits of observing accuracy, ephemeris seconds are of constant length, as are atomic seconds. This publication allowed a value to be chosen for the length of the atomic second that would work properly with the celestial laws of motion.
UTC was officially initiated at the start of 1961 (but the name Coordinated Universal Time was not adopted by the International Astronomical Union until 1967). The TAI instant 1 January 1961 00:00:01.422818 exactly was identified as UTC instant 1 January 1961 00:00:00.000000 exactly, and UTC ticked exactly one second for every 1.000000015 s of TAI. Time steps occurred every few months thereafter, and frequency changes at the end of each year. The jumps increased in size to 100 ms, with only one 50 ms jump having ever occurred. This UTC was intended to permit a very close approximation of UT2, within around 0.1 s.
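The 1961 relation above can be sketched numerically. Taking only the figures given here (initial offset 1.422818 s and rate factor 1.000000015), and ignoring the periodic steps and annual frequency changes, a minimal illustration (the helper name is ours, not standard) is:

```python
# Sketch of the 1961 "rubber second" relation between TAI and UTC.
# Constants are those stated in the text; this ignores the later
# step adjustments and end-of-year frequency changes.
INITIAL_OFFSET = 1.422818  # TAI - UTC at 1961-01-01 00:00:00 UTC, in seconds
RATE = 1.000000015         # TAI seconds elapsed per UTC second

def tai_minus_utc(utc_elapsed_seconds):
    """Illustrative TAI - UTC (seconds) after a given span of UTC
    since 1961-01-01, under the initial rate offset alone."""
    return INITIAL_OFFSET + utc_elapsed_seconds * (RATE - 1.0)

# Over one 365-day year (31,536,000 s of UTC), the divergence grows
# by roughly half a second:
drift_per_year = 31_536_000 * (RATE - 1.0)  # about 0.473 s
```

This makes clear why steps every few months were needed: a rate offset of 150 parts in 10^10 accumulates to nearly 0.5 s per year.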
In 1967, the SI second was redefined in terms of the frequency supplied by a caesium atomic clock. The length of the second so defined was practically equal to the second of ephemeris time, and it was the frequency that had been provisionally used in TAI since 1958. It was soon recognised that having two types of second with different lengths, namely the UTC second and the SI second used in TAI, was a bad idea. It was thought that it would be better for time signals to maintain a consistent frequency, and that that frequency should match the SI second. Thus it would be necessary to rely on time steps alone to maintain the approximation of UT. This was tried experimentally in a service known as "Stepped Atomic Time" (SAT), which ticked at the same rate as TAI and used jumps of 200 ms to stay synchronised with UT2.
There was also dissatisfaction with the frequent jumps in UTC (and SAT). In 1968, Louis Essen, the inventor of the caesium atomic clock, and G. M. R. Winkler both independently proposed that steps should be of 1 second only. This system was eventually approved, along with the idea of maintaining the UTC second equal to the TAI second. At the end of 1971, there was a final irregular jump of exactly 0.107758 TAI seconds, so that 1 January 1972 00:00:00 UTC was 1 January 1972 00:00:10 TAI exactly, making the difference between UTC and TAI an integer number of seconds. At the same time, the tick rate of UTC was changed to exactly match TAI. UTC also started to track UT1 rather than UT2. Some time signals started to broadcast the DUT1 correction (UT1 − UTC) for applications requiring a closer approximation of UT1 than UTC now provided.
The first leap second occurred on 30 June 1972. Since then, leap seconds have been added on average about once every 19 months, always on 30 June or 31 December.
There have been 27 leap seconds in total, all positive, putting UTC 37 seconds behind TAI.
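The 37-second figure follows directly from the numbers above: the 10-second offset fixed at the 1972 reform plus the 27 positive leap seconds inserted since. As a trivial sketch:

```python
# Why TAI - UTC is currently 37 seconds, using the figures in the text.
TAI_MINUS_UTC_1972 = 10   # integer offset fixed on 1 January 1972
LEAP_SECONDS_SINCE = 27   # positive leap seconds added since 1972

tai_minus_utc_now = TAI_MINUS_UTC_1972 + LEAP_SECONDS_SINCE  # 37 seconds
```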