How far is Baicheng from Jiujiang?
The distance between Jiujiang (Jiujiang Lushan Airport) and Baicheng (Baicheng Chang'an Airport) is 1153 miles / 1856 kilometers / 1002 nautical miles.
The driving distance from Jiujiang (JIU) to Baicheng (DBC) is 1429 miles / 2300 kilometers, and travel time by car is about 25 hours 48 minutes.
Jiujiang Lushan Airport – Baicheng Chang'an Airport
Distance from Jiujiang to Baicheng
There are several ways to calculate the distance from Jiujiang to Baicheng. Here are two standard methods:
Vincenty's formula (applied above)
- 1153.054 miles
- 1855.660 kilometers
- 1001.976 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
Haversine formula
- 1154.642 miles
- 1858.216 kilometers
- 1003.356 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
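The haversine calculation above can be sketched in a few lines of Python. This is a minimal illustration, not the exact code used by this site; it assumes a mean earth radius of 6371 km and the airport coordinates from the tables below, converted to decimal degrees.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two points
    given in decimal degrees, assuming a spherical earth."""
    EARTH_RADIUS_KM = 6371.0  # mean earth radius
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

# Airport coordinates in decimal degrees:
# JIU 29°43′58″N 115°58′58″E, DBC 45°30′19″N 123°01′10″E
km = haversine_km(29.7328, 115.9828, 45.5053, 123.0194)
print(f"{km:.0f} km / {km * 0.621371:.0f} miles")  # ≈ 1858 km / 1155 miles
```

The result agrees with the haversine figure listed above (1858.216 km); the small gap to the Vincenty figure reflects the spherical-earth simplification.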
How long does it take to fly from Jiujiang to Baicheng?
The estimated flight time from Jiujiang Lushan Airport to Baicheng Chang'an Airport is 2 hours and 40 minutes.
What is the time difference between Jiujiang and Baicheng?
There is no time difference between Jiujiang and Baicheng: both cities observe China Standard Time (UTC+8), which is used throughout China.
Flight carbon footprint between Jiujiang Lushan Airport (JIU) and Baicheng Chang'an Airport (DBC)
On average, flying from Jiujiang to Baicheng generates about 159 kg of CO2 per passenger (about 351 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
Map of flight path and driving directions from Jiujiang to Baicheng
See the map of the shortest flight path between Jiujiang Lushan Airport (JIU) and Baicheng Chang'an Airport (DBC).
Airport information
| Origin | Jiujiang Lushan Airport |
| --- | --- |
| City: | Jiujiang |
| Country: | China |
| IATA Code: | JIU |
| ICAO Code: | ZSJJ |
| Coordinates: | 29°43′58″N, 115°58′58″E |
| Destination | Baicheng Chang'an Airport |
| --- | --- |
| City: | Baicheng |
| Country: | China |
| IATA Code: | DBC |
| ICAO Code: | ZYBA |
| Coordinates: | 45°30′19″N, 123°1′10″E |
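The coordinates in the tables above are given in degrees, minutes, and seconds. A small helper (hypothetical, shown only for illustration) converts them to the signed decimal degrees that distance formulas expect:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter
    (N/S/E/W) to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Jiujiang Lushan Airport (JIU): 29°43′58″N, 115°58′58″E
print(dms_to_decimal(29, 43, 58, "N"))   # ≈ 29.7328
print(dms_to_decimal(115, 58, 58, "E"))  # ≈ 115.9828
```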