
How far is Padang from Nanjing?

The distance between Nanjing (Nanjing Lukou International Airport) and Padang (Minangkabau International Airport) is 2547 miles / 4099 kilometers / 2213 nautical miles.

Distance from Nanjing to Padang

There are several ways to calculate the distance from Nanjing to Padang. Here are two standard methods:

Vincenty's formula (applied above)
  • 2547.055 miles
  • 4099.088 kilometers
  • 2213.331 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
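
For readers who want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates are converted from the DMS values in the airport information section below; the function name, tolerance, and iteration cap are illustrative choices, not the calculator's actual code.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in meters on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):   # may fail to converge for near-antipodal points
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m +
                C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# NKG → PDG, decimal degrees from the airport information section
meters = vincenty_inverse(31.7419, 118.8619, -0.7867, 100.2808)
print(f"{meters / 1000:.1f} km, {meters / 1609.344:.1f} miles")  # ≈ 4099 km / 2547 miles
```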

Haversine formula
  • 2555.406 miles
  • 4112.528 kilometers
  • 2220.587 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
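
A compact haversine implementation is sketched below. It assumes a mean Earth radius of 3958.8 miles, which is why its result differs slightly from Vincenty's ellipsoidal figure.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in statute miles, assuming a spherical Earth."""
    r_miles = 3958.8                      # mean Earth radius in miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r_miles * math.asin(math.sqrt(a))

# NKG → PDG, decimal degrees from the airport information section
print(f"{haversine_miles(31.7419, 118.8619, -0.7867, 100.2808):.1f} miles")  # ≈ 2555 miles
```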

How long does it take to fly from Nanjing to Padang?

The estimated flight time from Nanjing Lukou International Airport to Minangkabau International Airport is 5 hours and 19 minutes.
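
The calculator does not publish its timing model, but estimates like this are commonly derived from distance alone: cruise time at an assumed average speed plus a fixed taxi/climb allowance. The sketch below uses illustrative constants (500 mph, 30 minutes), so its output will not exactly match the 5 hours 19 minutes quoted above, which implies slightly different constants.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb allowance.

    cruise_mph and overhead_min are illustrative assumptions, not the
    calculator's published model.
    """
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours and {minutes} minutes"

print(estimated_flight_time(2547))  # "5 hours and 36 minutes" with these assumptions
```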

Flight carbon footprint between Nanjing Lukou International Airport (NKG) and Minangkabau International Airport (PDG)

On average, flying from Nanjing to Padang generates about 281 kg (619 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
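
The pound figure follows directly from the unit definition, and dividing by the route distance gives an implied per-mile intensity. A quick check in Python (the per-mile figure is derived here, not published by the site):

```python
KG_PER_LB = 0.45359237                  # exact definition of the avoirdupois pound

co2_kg = 281                            # per-passenger estimate from this page
print(f"{co2_kg / KG_PER_LB:.0f} lbs")  # 619 lbs

# Implied intensity for this route (derived, not a published emission factor)
print(f"{co2_kg / 2547:.3f} kg CO2 per passenger-mile")  # 0.110
```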

Map of flight path from Nanjing to Padang

See the map of the shortest flight path between Nanjing Lukou International Airport (NKG) and Minangkabau International Airport (PDG).

Airport information

Origin: Nanjing Lukou International Airport
City: Nanjing
Country: China
IATA Code: NKG
ICAO Code: ZSNJ
Coordinates: 31°44′31″N, 118°51′43″E

Destination: Minangkabau International Airport
City: Padang
Country: Indonesia
IATA Code: PDG
ICAO Code: WIPT
Coordinates: 0°47′12″S, 100°16′51″E
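
The earlier code snippets use these coordinates in decimal degrees. A small helper for the conversion (the function name is hypothetical):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds + hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Coordinates from the airport listings above
nkg = (dms_to_decimal(31, 44, 31, "N"), dms_to_decimal(118, 51, 43, "E"))
pdg = (dms_to_decimal(0, 47, 12, "S"), dms_to_decimal(100, 16, 51, "E"))
print(nkg)  # ≈ (31.7419, 118.8619)
print(pdg)  # ≈ (-0.7867, 100.2808)
```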