How far is London from Tulita?
The distance between Tulita (Tulita Airport) and London (London International Airport) is 2280 miles / 3669 kilometers / 1981 nautical miles.
The driving distance from Tulita (ZFN) to London (YXU) is 3159 miles / 5084 kilometers, and travel time by car is about 68 hours 41 minutes.
Tulita Airport – London International Airport
Distance from Tulita to London
There are several ways to calculate the distance from Tulita to London. Here are two standard methods:
Vincenty's formula (applied above)
- 2279.609 miles
- 3668.675 kilometers
- 1980.926 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
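For reference, the iteration behind Vincenty's inverse method can be sketched in Python on the WGS-84 ellipsoid. This is a simplified sketch: convergence handling is basic and antipodal edge cases are not treated.

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_km(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in kilometers."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(200):                  # iterate until lambda converges
        sinlam, coslam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sinlam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * coslam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * coslam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinlam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev, lam = lam, L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m +
                                     C * cos_sigma * (2 * cos_2sigma_m ** 2 - 1)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# Airport coordinates converted to decimal degrees
km = vincenty_km(64.9094, -125.5728, 43.0356, -81.1539)  # ZFN -> YXU
print(f"{km:.1f} km")
```

Run against the two airports' coordinates, this reproduces the figure above to within rounding of the coordinate conversion.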
Haversine formula
- 2274.529 miles
- 3660.499 kilometers
- 1976.511 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
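A minimal Python sketch of the haversine calculation, using the airport coordinates from the tables below and an assumed mean Earth radius of 6371 km:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere, in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# Coordinates from the airport tables, converted to decimal degrees
tulita = (64.9094, -125.5728)   # ZFN: 64°54′34″N, 125°34′22″W
london = (43.0356, -81.1539)    # YXU: 43°2′8″N, 81°9′14″W

km = haversine_km(*tulita, *london)
print(f"{km:.0f} km, {km / 1.609344:.0f} mi, {km / 1.852:.0f} nm")
```

The result matches the figures above; it is a few kilometers shorter than the Vincenty result because a sphere slightly underestimates distances at these latitudes.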
How long does it take to fly from Tulita to London?
The estimated flight time from Tulita Airport to London International Airport is 4 hours and 48 minutes.
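The article does not state how this estimate is derived; a common rule of thumb is cruise time at a typical jet speed plus a fixed allowance for taxi, climb, and descent. A sketch with assumed parameters (500 mph cruise, 30 minutes overhead), which lands near, though not exactly on, the figure above:

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent
    overhead. Both parameters are assumptions, not the article's formula."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    return divmod(round(total_min), 60)  # -> (hours, minutes)

hours, minutes = estimate_flight_time(2280)
print(f"about {hours} h {minutes} m")
```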
What is the time difference between Tulita and London?
The time difference between Tulita and London is 2 hours: London is 2 hours ahead of Tulita.
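This can be checked with Python's zoneinfo module. The tz database identifiers below are assumptions: America/Edmonton for Tulita (Mountain Time in the Northwest Territories) and America/Toronto for London, Ontario (Eastern Time). Both zones switch to daylight saving on the same dates, so the offset holds year-round.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

tulita_tz = ZoneInfo("America/Edmonton")  # assumed zone for Tulita, NT
london_tz = ZoneInfo("America/Toronto")   # assumed zone for London, ON

# Compare UTC offsets at the same wall-clock instant
t = datetime(2024, 6, 1, 12, 0)
diff = (t.replace(tzinfo=london_tz).utcoffset()
        - t.replace(tzinfo=tulita_tz).utcoffset())
print(diff)  # 2:00:00 — London is two hours ahead
```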
Flight carbon footprint between Tulita Airport (ZFN) and London International Airport (YXU)
On average, flying from Tulita to London generates about 250 kg (roughly 550 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
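The kilogram-to-pound conversion behind that figure, using the exact definition of the pound (the text's 550 lb is a rounded value):

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 250
co2_lb = co2_kg / KG_PER_LB
print(f"{co2_kg} kg = {co2_lb:.0f} lb")  # 250 kg = 551 lb, rounded to 550 above
```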
Map of flight path and driving directions from Tulita to London
See the map of the shortest flight path between Tulita Airport (ZFN) and London International Airport (YXU).
Airport information
| Origin | Tulita Airport |
| --- | --- |
| City: | Tulita |
| Country: | Canada |
| IATA Code: | ZFN |
| ICAO Code: | CZFN |
| Coordinates: | 64°54′34″N, 125°34′22″W |
| Destination | London International Airport |
| --- | --- |
| City: | London |
| Country: | Canada |
| IATA Code: | YXU |
| ICAO Code: | CYXU |
| Coordinates: | 43°2′8″N, 81°9′14″W |