
How far is Beijing from Ujung Pandang?

The distance between Ujung Pandang (Makassar Sultan Hasanuddin International Airport) and Beijing (Beijing Nanyuan Airport) is 3092 miles / 4976 kilometers / 2687 nautical miles.

Makassar Sultan Hasanuddin International Airport – Beijing Nanyuan Airport

3092 miles / 4976 kilometers / 2687 nautical miles


Distance from Ujung Pandang to Beijing

There are several ways to calculate the distance from Ujung Pandang to Beijing. Here are two standard methods:

Vincenty's formula (applied above)
  • 3091.780 miles
  • 4975.737 kilometers
  • 2686.683 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
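
As a rough cross-check, the ellipsoidal figure can be reproduced in Python with pyproj's Geod class. Note that pyproj uses Karney's geodesic algorithm rather than Vincenty's iteration, but both work on the WGS-84 ellipsoid and agree to well under a metre over a route of this length; the coordinates are the decimal-degree equivalents of the DMS values listed under "Airport information" below.

```python
# Sketch of an ellipsoidal distance check with pyproj (Karney's geodesic
# algorithm on the WGS-84 ellipsoid, not Vincenty's iteration itself).
from pyproj import Geod

# Decimal-degree equivalents of the coordinates in "Airport information".
UPG_LAT, UPG_LON = -5.061389, 119.553889   # Makassar Sultan Hasanuddin Intl
NAY_LAT, NAY_LON = 39.782778, 116.387778   # Beijing Nanyuan Airport

geod = Geod(ellps="WGS84")
_, _, dist_m = geod.inv(UPG_LON, UPG_LAT, NAY_LON, NAY_LAT)  # lon/lat order

print(f"{dist_m / 1000:.3f} km = {dist_m / 1609.344:.3f} mi "
      f"= {dist_m / 1852:.3f} NM")
# Expect values very close to the 4975.737 km / 3091.780 mi / 2686.683 NM
# figures quoted above.
```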

Haversine formula
  • 3105.008 miles
  • 4997.026 kilometers
  • 2698.178 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
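
The haversine figure is easy to reproduce with a few lines of Python. The sketch below assumes a mean Earth radius of 6371 km, a common convention; the exact radius used by the calculator is not stated, so small differences are possible.

```python
# Minimal haversine sketch: great-circle distance on a sphere of radius
# 6371 km (an assumed mean Earth radius; the page does not state its own).
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance in kilometres between two lat/lon points."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * r_km * asin(sqrt(a))

km = haversine_km(-5.061389, 119.553889, 39.782778, 116.387778)  # UPG -> NAY
print(f"{km:.3f} km = {km / 1.609344:.3f} mi = {km / 1.852:.3f} NM")
# Expect values near the 4997 km / 3105 mi / 2698 NM figures quoted above.
```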

How long does it take to fly from Ujung Pandang to Beijing?

The estimated flight time from Makassar Sultan Hasanuddin International Airport to Beijing Nanyuan Airport is 6 hours and 21 minutes.
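
The page does not explain how this figure is derived. A common rule of thumb is to divide the distance by an average block speed and add a fixed allowance for taxi, climb and descent; the sketch below uses assumed constants (500 mph and 30 minutes) and therefore lands near, but not exactly on, the 6 hours 21 minutes quoted above.

```python
# Rough flight-time sketch: distance / average speed plus a fixed overhead.
# The 500 mph and 30 min constants are assumptions, not the site's own model.
def estimate_flight_time(distance_miles, avg_speed_mph=500.0, overhead_min=30.0):
    total_min = round(overhead_min + distance_miles / avg_speed_mph * 60.0)
    return divmod(total_min, 60)  # -> (hours, minutes)

hours, minutes = estimate_flight_time(3091.780)
print(f"Estimated flight time: about {hours} hours and {minutes} minutes")
```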

What is the time difference between Ujung Pandang and Beijing?

There is no time difference between Ujung Pandang and Beijing.

Flight carbon footprint between Makassar Sultan Hasanuddin International Airport (UPG) and Beijing Nanyuan Airport (NAY)

On average, flying from Ujung Pandang to Beijing generates about 345 kg (761 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
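
For reference, the pound figure and the implied per-kilometre emission factor follow directly from the numbers above, using the standard conversion of 1 kg ≈ 2.20462 lb:

```python
# Plain arithmetic on the figures quoted above.
co2_kg = 345.0        # estimated CO2 per passenger
distance_km = 4976.0  # route distance

print(f"{co2_kg * 2.20462:.0f} lb of CO2 per passenger")               # ~761 lb
print(f"{co2_kg / distance_km * 1000:.0f} g of CO2 per passenger-km")  # ~69 g
```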

Map of flight path from Ujung Pandang to Beijing

See the map of the shortest flight path between Makassar Sultan Hasanuddin International Airport (UPG) and Beijing Nanyuan Airport (NAY).

Airport information

Origin: Makassar Sultan Hasanuddin International Airport
City: Ujung Pandang
Country: Indonesia
IATA Code: UPG
ICAO Code: WAAA
Coordinates: 5°3′41″S, 119°33′14″E
Destination: Beijing Nanyuan Airport
City: Beijing
Country: China
IATA Code: NAY
ICAO Code: ZBNY
Coordinates: 39°46′58″N, 116°23′16″E
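
The DMS coordinates above convert to the decimal degrees used in the distance sketches earlier. A minimal conversion helper (south latitudes and west longitudes become negative):

```python
# Convert degrees/minutes/seconds to signed decimal degrees.
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (degrees + minutes / 60.0 + seconds / 3600.0)

upg = (dms_to_decimal(5, 3, 41, "S"), dms_to_decimal(119, 33, 14, "E"))
nay = (dms_to_decimal(39, 46, 58, "N"), dms_to_decimal(116, 23, 16, "E"))

print(upg)  # approximately (-5.0614, 119.5539)
print(nay)  # approximately (39.7828, 116.3878)
```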