
How far is London from Tatitlek, AK?

The distance between Tatitlek (Tatitlek Airport) and London (London International Airport) is 2908 miles / 4680 kilometers / 2527 nautical miles.

The driving distance from Tatitlek (TEK) to London (YXU) is 3880 miles / 6245 kilometers, and travel time by car is about 77 hours 34 minutes.

Tatitlek Airport – London International Airport

2908 miles / 4680 kilometers / 2527 nautical miles


Distance from Tatitlek to London

There are several ways to calculate the distance from Tatitlek to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 2907.753 miles
  • 4679.575 kilometers
  • 2526.768 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
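The calculator's own implementation is not published; the sketch below is a minimal Python version of Vincenty's inverse formula on the WGS-84 ellipsoid (the choice of WGS-84 is an assumption), using the airport coordinates listed under Airport information. It should land very close to the Vincenty figure above.

```python
# Minimal sketch of Vincenty's inverse formula (WGS-84 ellipsoid assumed).
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    a = 6378137.0                  # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344      # metres -> statute miles

# TEK (60°52′17″N, 146°41′25″W) to YXU (43°2′8″N, 81°9′14″W)
print(vincenty_miles(60.871389, -146.690278, 43.035556, -81.153889))
```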

Haversine formula
  • 2899.923 miles
  • 4666.974 kilometers
  • 2519.964 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
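A corresponding haversine sketch is below. The 6371 km mean Earth radius is an assumption (the page does not state which radius it uses), so the result agrees with the figure above only to within a few kilometres.

```python
# Minimal haversine (great-circle) sketch; 6371 km mean Earth radius assumed.
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h))

# TEK to YXU, using the airport coordinates listed under Airport information
print(haversine_km(60.871389, -146.690278, 43.035556, -81.153889))
```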

How long does it take to fly from Tatitlek to London?

The estimated flight time from Tatitlek Airport to London International Airport is 6 hours and 0 minutes.
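The page does not explain how this estimate is derived. As a rough illustration only (not the calculator's documented method), dividing the straight-line distance by an assumed average gate-to-gate speed of about 485 mph lands at roughly the same figure:

```python
# Rough flight-time sketch. The 485 mph average gate-to-gate speed is an
# assumption chosen for illustration, not the calculator's documented method.
distance_miles = 2908
avg_speed_mph = 485
total_minutes = round(distance_miles / avg_speed_mph * 60)
hours, minutes = divmod(total_minutes, 60)
print(f"{hours} h {minutes} min")   # about 6 h 0 min under this assumption
```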

Flight carbon footprint between Tatitlek Airport (TEK) and London International Airport (YXU)

On average, flying from Tatitlek to London generates about 323 kg of CO2 per passenger, which is equivalent to about 713 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
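For reference, the pound figure is a direct unit conversion of the kilogram estimate (1 kg ≈ 2.20462 lb); converting the rounded 323 kg gives about 712 lb, so the 713 lb shown presumably comes from the unrounded estimate.

```python
# Unit-conversion check for the CO2 figure above.
KG_TO_LB = 2.20462          # pounds per kilogram
co2_kg = 323                # rounded per-passenger estimate from the page
print(round(co2_kg * KG_TO_LB))   # 712 lb from the rounded figure; the page shows 713 lb
```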

Map of flight path and driving directions from Tatitlek to London

See the map of the shortest flight path between Tatitlek Airport (TEK) and London International Airport (YXU).

Airport information

Origin: Tatitlek Airport
City: Tatitlek, AK
Country: United States
IATA Code: TEK
ICAO Code: PAKA
Coordinates: 60°52′17″N, 146°41′25″W

Destination: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W