How far is Jambi from Nanjing?
The distance between Nanjing (Nanjing Lukou International Airport) and Jambi (Sultan Thaha Syaifuddin Airport) is 2503 miles / 4029 kilometers / 2175 nautical miles.
Nanjing Lukou International Airport – Sultan Thaha Syaifuddin Airport
Distance from Nanjing to Jambi
There are several ways to calculate the distance from Nanjing to Jambi. Here are two standard methods:
Vincenty's formula (applied above)
- 2503.433 miles
- 4028.886 kilometers
- 2175.424 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
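For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates from the tables at the end of this page converted to decimal degrees. The function name is our own and the exact implementation behind the figure above is not published, so expect agreement only to within coordinate rounding.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0                    # equatorial radius (m)
    f = 1 / 298.257223563            # flattening
    b = (1 - f) * a                  # polar radius (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):             # usually converges in a handful of iterations
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0               # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero on equatorial lines, where cos2_alpha == 0
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
          * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# NKG and DJB coordinates from the airport tables below (decimal degrees)
print(vincenty_distance(31.7419, 118.8619, -1.6378, 103.6439) / 1000)  # ~4028.9 km
```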
Haversine formula
- 2512.750 miles
- 4043.879 kilometers
- 2183.520 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
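The haversine formula is short enough to show in full. A minimal sketch, assuming the commonly used mean Earth radius of 6,371 km (the radius behind the figure above is not stated, so small differences are possible):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Same airport coordinates as before, in decimal degrees
print(haversine_distance(31.7419, 118.8619, -1.6378, 103.6439))  # ~4043.9 km
```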
How long does it take to fly from Nanjing to Jambi?
The estimated flight time from Nanjing Lukou International Airport to Sultan Thaha Syaifuddin Airport is 5 hours and 14 minutes.
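The page does not state how this estimate is derived, but 5 hours and 14 minutes is consistent with dividing the Vincenty distance by an average block speed of about 478 mph. A hypothetical back-of-the-envelope check:

```python
distance_mi = 2503.433   # Vincenty distance from above
avg_speed_mph = 478      # assumed average block speed; not stated by the page
hours = distance_mi / avg_speed_mph
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # 5 h 14 min
```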
What is the time difference between Nanjing and Jambi?
The time difference between Nanjing and Jambi is 1 hour. Jambi is 1 hour behind Nanjing.
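Nanjing observes China Standard Time (UTC+8) and Jambi observes Western Indonesia Time (UTC+7), which is where the one-hour offset comes from. A quick check with Python's standard zoneinfo module:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

now = datetime.now(timezone.utc)
nanjing = now.astimezone(ZoneInfo("Asia/Shanghai"))  # China Standard Time, UTC+8
jambi = now.astimezone(ZoneInfo("Asia/Jakarta"))     # Western Indonesia Time, UTC+7
print(nanjing.utcoffset() - jambi.utcoffset())       # 1:00:00
```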
Flight carbon footprint between Nanjing Lukou International Airport (NKG) and Sultan Thaha Syaifuddin Airport (DJB)
On average, flying from Nanjing to Jambi generates about 276 kg (608 lb) of CO2 per passenger. The figure is an estimate and includes only the CO2 generated by burning jet fuel.
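The unit conversion, plus the per-mile intensity the estimate implies, takes only a couple of lines (the 276 kg figure comes from this page; the methodology behind it is not given):

```python
co2_kg = 276
print(round(co2_kg * 2.20462))      # 608 lb
print(round(co2_kg / 2503.433, 3))  # ~0.11 kg CO2 per passenger-mile (implied)
```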
Map of flight path from Nanjing to Jambi
See the map of the shortest flight path between Nanjing Lukou International Airport (NKG) and Sultan Thaha Syaifuddin Airport (DJB).
Airport information
| Origin | Nanjing Lukou International Airport |
| --- | --- |
| City | Nanjing |
| Country | China |
| IATA Code | NKG |
| ICAO Code | ZSNJ |
| Coordinates | 31°44′31″N, 118°51′43″E |
| Destination | Sultan Thaha Syaifuddin Airport |
| --- | --- |
| City | Jambi |
| Country | Indonesia |
| IATA Code | DJB |
| ICAO Code | WIPA |
| Coordinates | 1°38′16″S, 103°38′38″E |
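To use these coordinates with the distance functions sketched earlier, they first need converting from degrees/minutes/seconds to decimal degrees. A small hypothetical helper (hemisphere handling is the only subtlety):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert D°M′S″ plus a hemisphere letter (N/S/E/W) to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# NKG: 31°44′31″N, 118°51′43″E   DJB: 1°38′16″S, 103°38′38″E
nkg = (dms_to_decimal(31, 44, 31, "N"), dms_to_decimal(118, 51, 43, "E"))
djb = (dms_to_decimal(1, 38, 16, "S"), dms_to_decimal(103, 38, 38, "E"))
print(nkg)  # (31.7419..., 118.8619...)
print(djb)  # (-1.6377..., 103.6438...)
```

These are the decimal values used in the distance snippets above.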