How far is Jambi from Beijing?
The distance between Beijing (Beijing Daxing International Airport) and Jambi (Sultan Thaha Syaifuddin Airport) is 2945 miles / 4739 kilometers / 2559 nautical miles.
Beijing Daxing International Airport – Sultan Thaha Syaifuddin Airport
Distance from Beijing to Jambi
There are several ways to calculate the distance from Beijing to Jambi. Here are two standard methods:
Vincenty's formula (applied above)
- 2944.745 miles
- 4739.107 kilometers
- 2558.913 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
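The page doesn't show its implementation, so the sketch below is a standard inverse-Vincenty routine on the WGS-84 ellipsoid, fed the airport coordinates from the tables at the bottom of this page (converted to decimal degrees). The function name and structure are illustrative, not the site's own code:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty: distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0           # semi-major axis (m)
    f = 1 / 298.257223563   # flattening
    b = (1 - f) * a         # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until it converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        # cos2Alpha == 0 only for two points on the equator
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    uSq = cos2Alpha * (a * a - b * b) / (b * b)
    A = 1 + uSq / 16384 * (4096 + uSq * (-768 + uSq * (320 - 175 * uSq)))
    B = uSq / 1024 * (256 + uSq * (-128 + uSq * (74 - 47 * uSq)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# PKX and DJB coordinates from the airport tables below, in decimal degrees.
pkx = (39.509167, 116.410556)
djb = (-1.637778, 103.643889)
print(f"{vincenty_distance(*pkx, *djb) / 1000:.3f} km")  # should land near 4739 km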
Haversine formula
- 2955.879 miles
- 4757.025 kilometers
- 2568.588 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
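For comparison, here is a minimal haversine sketch using the same coordinates. It assumes the common 6371 km mean Earth radius; the exact result shifts slightly with the radius chosen:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in kilometres, assuming a sphere of radius r."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(h))

print(f"{haversine_distance(39.509167, 116.410556, -1.637778, 103.643889):.3f} km")
# ≈ 4757 km, slightly longer than the ellipsoidal result above
```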
How long does it take to fly from Beijing to Jambi?
The estimated flight time from Beijing Daxing International Airport to Sultan Thaha Syaifuddin Airport is 6 hours and 4 minutes.
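The page doesn't state how it derives this estimate. A simple distance-over-average-speed calculation reproduces it; note that the 485 mph average block speed below is reverse-engineered from the quoted time, not a published assumption:

```python
# Hypothetical average block speed; the page does not state the value it uses.
AVG_SPEED_MPH = 485

distance_miles = 2944.745  # Vincenty distance from above

hours = distance_miles / AVG_SPEED_MPH
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"Estimated flight time: {h} h {m} min")  # 6 h 4 min
```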
What is the time difference between Beijing and Jambi?
The time difference between Beijing and Jambi is 1 hour: Jambi is 1 hour behind Beijing.
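This matches the standard IANA time zones (Beijing is on China Standard Time, UTC+8; Jambi is on Western Indonesia Time, UTC+7), which you can verify with Python's built-in zoneinfo module:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

now = datetime.now(tz=ZoneInfo("UTC"))
beijing = now.astimezone(ZoneInfo("Asia/Shanghai"))  # China Standard Time, UTC+8
jambi = now.astimezone(ZoneInfo("Asia/Jakarta"))     # Western Indonesia Time, UTC+7

print(beijing.utcoffset() - jambi.utcoffset())  # 1:00:00 -> Jambi is 1 hour behind
```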
Flight carbon footprint between Beijing Daxing International Airport (PKX) and Sultan Thaha Syaifuddin Airport (DJB)
On average, flying from Beijing to Jambi generates about 328 kg (722 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
Map of flight path from Beijing to Jambi
See the map of the shortest flight path between Beijing Daxing International Airport (PKX) and Sultan Thaha Syaifuddin Airport (DJB).
Airport information
| Origin | Beijing Daxing International Airport |
| --- | --- |
| City: | Beijing |
| Country: | China |
| IATA Code: | PKX |
| ICAO Code: | ZBAD |
| Coordinates: | 39°30′33″N, 116°24′38″E |
| Destination | Sultan Thaha Syaifuddin Airport |
| --- | --- |
| City: | Jambi |
| Country: | Indonesia |
| IATA Code: | DJB |
| ICAO Code: | WIPA |
| Coordinates: | 1°38′16″S, 103°38′38″E |