
How far is London from Antananarivo?

The distance between Antananarivo (Ivato International Airport) and London, Ontario (London International Airport) is 9030 miles / 14533 kilometers / 7847 nautical miles.

Ivato International Airport – London International Airport

Distance: 9030 miles / 14533 kilometers / 7847 nautical miles
Flight time: 17 h 35 min
CO2 emission: 1 152 kg


Distance from Antananarivo to London

There are several ways to calculate the distance from Antananarivo to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 9030.329 miles
  • 14532.906 kilometers
  • 7847.141 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
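The standard Vincenty inverse method can be sketched in Python. This is a minimal self-contained version on the WGS-84 ellipsoid; the decimal coordinates are converted from the DMS values listed under Airport information, and the tolerance and iteration limit are illustrative choices, not this site's exact implementation.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Distance in metres between two points on the WGS-84 ellipsoid
    (Vincenty's inverse method)."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis

    L = math.radians(lon2 - lon1)
    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.sqrt((cosU2 * sinLam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        if sinSigma == 0:
            return 0.0          # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        # cos2Alpha == 0 only for equatorial lines
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha != 0 else 0.0)
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# Decimal degrees converted from the DMS coordinates in the airport table
TNR = (-18.796667, 47.478611)   # Ivato International Airport
YXU = (43.035556, -81.153889)   # London International Airport, Ontario

dist_km = vincenty_inverse(*TNR, *YXU) / 1000.0
```

Run on the two airports, this reproduces the roughly 14533 km quoted above.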

Haversine formula
  • 9029.454 miles
  • 14531.498 kilometers
  • 7846.381 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
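A minimal haversine sketch in Python, assuming a mean Earth radius of 6371 km (the radius this site uses is not stated); the decimal coordinates are converted from the DMS values in the airport table:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Antananarivo (TNR) to London, Ontario (YXU)
gc_km = haversine_km(-18.796667, 47.478611, 43.035556, -81.153889)
```

The result lands within a few kilometres of the 14531 km quoted above; the small gap versus the Vincenty figure is the spherical-versus-ellipsoidal model difference.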

How long does it take to fly from Antananarivo to London?

The estimated flight time from Ivato International Airport to London International Airport is 17 hours and 35 minutes.
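The site does not publish the formula behind this estimate. A common approach is a fixed overhead for taxi, climb, and descent plus great-circle distance flown at an assumed cruise speed; the 850 km/h and 30 min parameters below are assumptions that happen to land close to the quoted 17 h 35 min, not the site's published values.

```python
def flight_time_estimate(distance_km, cruise_kmh=850.0, overhead_h=0.5):
    """Rough flight time: assumed fixed overhead (taxi/climb/descent)
    plus cruise at an assumed average speed. Returns (hours, minutes)."""
    total_h = overhead_h + distance_km / cruise_kmh
    h = int(total_h)
    m = round((total_h - h) * 60)
    if m == 60:                 # handle rounding up to a full hour
        h, m = h + 1, 0
    return h, m

h, m = flight_time_estimate(14533)   # the distance quoted above
```

Actual scheduled times vary with winds, routing, and aircraft type.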

Flight carbon footprint between Ivato International Airport (TNR) and London International Airport (YXU)

On average, flying from Antananarivo to London generates about 1 152 kg of CO2 per passenger; 1 152 kilograms equals 2 539 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
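The kilograms-to-pounds conversion uses the exact definition 1 lb = 0.45359237 kg; a one-line helper reproduces the quoted figure:

```python
KG_PER_LB = 0.45359237  # exact by international definition

def kg_to_lb(kg):
    """Convert kilograms to avoirdupois pounds."""
    return kg / KG_PER_LB

# 1152 kg of CO2 is about 2539.7 lb, which the page truncates to 2 539 lbs
co2_lb = kg_to_lb(1152)
```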

Map of flight path from Antananarivo to London

See the map of the shortest flight path between Ivato International Airport (TNR) and London International Airport (YXU).

Airport information

Origin: Ivato International Airport
City: Antananarivo
Country: Madagascar
IATA Code: TNR
ICAO Code: FMMI
Coordinates: 18°47′48″S, 47°28′43″E
Destination: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W
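The coordinates above are given in degrees/minutes/seconds; converting them to the signed decimal degrees that distance formulas expect is a small helper (the function name here is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter
    (N/S/E/W) to signed decimal degrees; S and W are negative."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# TNR: 18°47′48″S, 47°28′43″E
tnr = (dms_to_decimal(18, 47, 48, "S"), dms_to_decimal(47, 28, 43, "E"))
# YXU: 43°2′8″N, 81°9′14″W
yxu = (dms_to_decimal(43, 2, 8, "N"), dms_to_decimal(81, 9, 14, "W"))
```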