
How far is London from Manado?

The distance between Manado (Sam Ratulangi International Airport) and London (London International Airport) is 8963 miles / 14424 kilometers / 7788 nautical miles.

Sam Ratulangi International Airport – London International Airport

Distance: 8963 miles / 14424 kilometers / 7788 nautical miles
Flight time: 17 h 28 min
CO2 emission: 1 142 kg


Distance from Manado to London

There are several ways to calculate the distance from Manado to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 8962.785 miles
  • 14424.205 kilometers
  • 7788.448 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
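As a sketch of how such a calculation works, here is a minimal Python implementation of the standard iterative Vincenty inverse formula on the WGS-84 ellipsoid. This is the textbook formulation, not this site's own code; the airport coordinates are converted from the airport information given below.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles.

    Coincident and near-antipodal point pairs are not specially handled.
    """
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (meters)

    # Longitude difference normalized to [-180, 180) degrees
    L = math.radians((lon2 - lon1 + 540) % 360 - 180)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m
            * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344   # meters per statute mile

# MDC and YXU in decimal degrees (from the airport information below)
mdc = (1.549167, 124.925833)
yxu = (43.035556, -81.153889)
print(round(vincenty_miles(*mdc, *yxu), 3))
```

Run as written, this reproduces the Vincenty distance quoted above to within rounding.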

Haversine formula
  • 8956.526 miles
  • 14414.131 kilometers
  • 7783.008 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance – the shortest path between two points along the surface.
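The haversine calculation is short enough to show in full. This sketch assumes a mean earth radius of 6371 km, a common convention (the site does not state which radius it uses):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r (km by default)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# MDC and YXU in decimal degrees (from the airport information below)
print(round(haversine_km(1.549167, 124.925833, 43.035556, -81.153889), 1))  # ≈ 14414 km
```

The result is slightly shorter than the Vincenty figure because the spherical model ignores the earth's flattening.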

How long does it take to fly from Manado to London?

The estimated flight time from Sam Ratulangi International Airport to London International Airport is 17 hours and 28 minutes.
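The site does not publish its flight-time model. A common rough approach is to divide the route distance by an assumed average block speed; the 513 mph figure in this sketch is a hypothetical value chosen only to illustrate the arithmetic:

```python
# Rough flight-time estimate: distance / assumed average block speed.
# The 513 mph value is hypothetical, not stated by the source.
distance_miles = 8963
avg_speed_mph = 513

total_hours = distance_miles / avg_speed_mph
hours = int(total_hours)
minutes = round((total_hours - hours) * 60)
print(f"{hours} h {minutes} min")  # → 17 h 28 min
```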

Flight carbon footprint between Sam Ratulangi International Airport (MDC) and London International Airport (YXU)

On average, flying from Manado to London generates about 1 142 kg of CO2 per passenger; 1 142 kilograms is equal to 2 517 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
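The kilogram-to-pound conversion uses the exact definition of the pound (0.45359237 kg); the published value appears to be truncated rather than rounded:

```python
KG_PER_LB = 0.45359237   # exact definition of the avoirdupois pound
co2_kg = 1142            # per-passenger estimate from above
co2_lbs = co2_kg / KG_PER_LB
print(int(co2_lbs))      # → 2517 (truncated, matching the figure above)
```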

Map of flight path from Manado to London

See the map of the shortest flight path between Sam Ratulangi International Airport (MDC) and London International Airport (YXU).

Airport information

Origin: Sam Ratulangi International Airport
City: Manado
Country: Indonesia
IATA Code: MDC
ICAO Code: WAMM
Coordinates: 1°32′57″N, 124°55′33″E

Destination: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W
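The coordinates above are given in degrees, minutes, and seconds, while the distance formulas expect decimal degrees. A small sketch of the conversion (the `dms_to_decimal` helper is illustrative, not part of the site):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 43°2′8″N to signed decimal degrees."""
    deg, minute, sec, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    # South and West hemispheres are negative by convention
    return -value if hemi in "SW" else value

print(round(dms_to_decimal("43°2′8″N"), 6))   # → 43.035556
print(round(dms_to_decimal("81°9′14″W"), 6))  # → -81.153889
```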