How far is Terrace from London?
The distance between London (London Heathrow Airport) and Terrace (Northwest Regional Airport Terrace-Kitimat) is 4547 miles / 7317 kilometers / 3951 nautical miles.
London Heathrow Airport – Northwest Regional Airport Terrace-Kitimat
Distance from London to Terrace
There are several ways to calculate the distance from London to Terrace. Here are two standard methods:
Vincenty's formula (applied above)
- 4546.760 miles
- 7317.300 kilometers
- 3951.026 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
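Below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid, using the LHR and YXT coordinates from the airport tables further down. The constants, iteration tolerance, and rounding are assumptions for illustration, not the site's actual implementation.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0             # semi-major axis (m)
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# LHR (51°28′14″N, 0°27′42″W) to YXT (54°28′6″N, 128°34′33″W)
metres = vincenty_distance(51.4706, -0.4617, 54.4683, -128.5758)
print(f"{metres / 1609.344:.1f} mi, {metres / 1000:.1f} km")  # ≈ 4547 mi / 7317 km
```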
Haversine formula
- 4532.193 miles
- 7293.857 kilometers
- 3938.368 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
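A corresponding haversine sketch is shown below, assuming the commonly used mean Earth radius of 6,371 km; the radius the site actually uses is not stated.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h))

km = haversine_distance(51.4706, -0.4617, 54.4683, -128.5758)
print(f"{km:.1f} km, {km / 1.609344:.1f} mi")  # ≈ 7294 km / 4532 mi
```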
How long does it take to fly from London to Terrace?
The estimated flight time from London Heathrow Airport to Northwest Regional Airport Terrace-Kitimat is 9 hours and 6 minutes.
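The page does not state how this estimate is derived, but the figure is consistent with dividing the distance by an assumed average speed of roughly 500 mph, as in this rough check:

```python
# Rough check of the 9 h 6 min estimate: distance divided by an assumed
# average speed of ~500 mph (the page's actual assumptions are not stated).
distance_mi = 4547
avg_speed_mph = 500
hours = distance_mi / avg_speed_mph                 # ≈ 9.09 h
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} h {m} min")                             # 9 h 6 min
```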
What is the time difference between London and Terrace?
The time difference between London and Terrace is 8 hours. Terrace is 8 hours behind London.
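As a quick check, the offset can be computed with Python's standard zoneinfo module, assuming Terrace, BC observes Pacific Time (America/Vancouver); because both zones shift for daylight saving on similar dates, the difference is 8 hours for most of the year.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# Assumes Terrace, BC follows Pacific Time (America/Vancouver).
now = datetime.now(ZoneInfo("UTC"))
london_offset = now.astimezone(ZoneInfo("Europe/London")).utcoffset()
terrace_offset = now.astimezone(ZoneInfo("America/Vancouver")).utcoffset()
print(london_offset - terrace_offset)  # 8:00:00 for most of the year
```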
Flight carbon footprint between London Heathrow Airport (LHR) and Northwest Regional Airport Terrace-Kitimat (YXT)
On average, flying from London to Terrace generates about 525 kg of CO2 per passenger; 525 kilograms equals 1,158 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
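The kilogram-to-pound conversion is simple arithmetic; the underlying per-passenger CO2 estimate depends on factors such as aircraft type and load factor, which the page does not specify.

```python
co2_kg = 525
co2_lbs = co2_kg * 2.20462  # 1 kg ≈ 2.20462 lb
print(round(co2_lbs))       # 1157; the quoted 1,158 lb implies the unrounded figure is a bit over 525 kg
```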
Map of flight path from London to Terrace
See the map of the shortest flight path between London Heathrow Airport (LHR) and Northwest Regional Airport Terrace-Kitimat (YXT).
Airport information
Origin | London Heathrow Airport |
---|---|
City: | London |
Country: | United Kingdom |
IATA Code: | LHR |
ICAO Code: | EGLL |
Coordinates: | 51°28′14″N, 0°27′42″W |
Destination | Northwest Regional Airport Terrace-Kitimat |
---|---|
City: | Terrace |
Country: | Canada |
IATA Code: | YXT |
ICAO Code: | CYXT |
Coordinates: | 54°28′6″N, 128°34′33″W |