
How far is London from Ujung Pandang?

The distance between Ujung Pandang (Makassar Sultan Hasanuddin International Airport) and London (London International Airport) is 9528 miles / 15333 kilometers / 8279 nautical miles.

Makassar Sultan Hasanuddin International Airport – London International Airport

Distance: 9528 miles / 15333 kilometers / 8279 nautical miles
Flight time: 18 h 32 min
CO2 emission: 1 228 kg

Distance from Ujung Pandang to London

There are several ways to calculate the distance from Ujung Pandang to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 9527.590 miles
  • 15333.171 kilometers
  • 8279.250 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
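As a rough illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid (not the calculator's actual code); the coordinates are the decimal-degree equivalents of the DMS values listed under Airport information below:

    import math

    # WGS-84 ellipsoid constants
    A = 6378137.0            # semi-major axis in meters
    F = 1 / 298.257223563    # flattening
    B = A * (1 - F)          # semi-minor axis in meters

    def vincenty_distance_m(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Vincenty inverse formula: geodesic distance in meters between two
        points in decimal degrees. May not converge for nearly antipodal points."""
        U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L  # first approximation of the longitude difference on the auxiliary sphere
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0.0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1.0 - sin_alpha ** 2
            # cos(2*sigma_m); zero for equatorial geodesics (cos2_alpha == 0)
            cos_2sm = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha
                       if cos2_alpha else 0.0)
            C = F / 16.0 * cos2_alpha * (4.0 + F * (4.0 - 3.0 * cos2_alpha))
            lam_prev = lam
            lam = L + (1.0 - C) * F * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
        A_ = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B_ = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B_ * sin_sigma * (cos_2sm + B_ / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B_ / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                                * (-3 + 4 * cos_2sm ** 2)))
        return B * A_ * (sigma - d_sigma)

    # UPG (5°3′41″S, 119°33′14″E) to YXU (43°2′8″N, 81°9′14″W)
    m = vincenty_distance_m(-5.061389, 119.553889, 43.035556, -81.153889)
    print(f"{m / 1609.344:.0f} mi / {m / 1000:.0f} km")  # roughly 9528 mi / 15333 km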

Haversine formula
  • 9523.717 miles
  • 15326.937 kilometers
  • 8275.884 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
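A minimal Python sketch of the haversine computation, assuming the conventional mean Earth radius of 6371 km:

    import math

    EARTH_RADIUS_KM = 6371.0  # conventional mean Earth radius

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance in kilometers on a spherical Earth."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

    # UPG to YXU; should land near the 15 327 km figure above
    print(f"{haversine_km(-5.061389, 119.553889, 43.035556, -81.153889):.0f} km")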

How long does it take to fly from Ujung Pandang to London?

The estimated flight time from Makassar Sultan Hasanuddin International Airport to London International Airport is 18 hours and 32 minutes.
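The site does not state the assumptions behind this estimate. A common rule of thumb, and one that reproduces the figure above, is an average cruise speed of about 850 km/h plus a fixed half-hour allowance for takeoff and landing; a hypothetical sketch:

    def estimated_flight_time_h(distance_km, cruise_kmh=850, buffer_h=0.5):
        """Rough airliner flight time: cruise segment plus a fixed taxi/climb buffer."""
        return distance_km / cruise_kmh + buffer_h

    hours = estimated_flight_time_h(15333)
    print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # -> 18 h 32 min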

Flight carbon footprint between Makassar Sultan Hasanuddin International Airport (UPG) and London International Airport (YXU)

On average, flying from Ujung Pandang to London generates about 1 228 kg of CO2 per passenger; 1 228 kilograms equals 2 707 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
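The pound figure follows from the standard conversion factor: 1 228 kg × 2.20462 lb/kg ≈ 2 707 lb.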

Map of flight path from Ujung Pandang to London

See the map of the shortest flight path between Makassar Sultan Hasanuddin International Airport (UPG) and London International Airport (YXU).

Airport information

Origin: Makassar Sultan Hasanuddin International Airport
City: Ujung Pandang
Country: Indonesia
IATA Code: UPG
ICAO Code: WAAA
Coordinates: 5°3′41″S, 119°33′14″E
Destination: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W
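The distance formulas above take signed decimal degrees, so the DMS coordinates listed here must be converted first; a small hypothetical helper:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    upg = (dms_to_decimal(5, 3, 41, "S"), dms_to_decimal(119, 33, 14, "E"))
    yxu = (dms_to_decimal(43, 2, 8, "N"), dms_to_decimal(81, 9, 14, "W"))
    print(upg)  # ≈ (-5.061389, 119.553889)
    print(yxu)  # ≈ (43.035556, -81.153889)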