How far is Baishan from Nanning?
The distance between Nanning (Nanning Wuxu International Airport) and Baishan (Changbaishan Airport) is 1749 miles / 2815 kilometers / 1520 nautical miles.
The driving distance from Nanning (NNG) to Baishan (NBS) is 2142 miles / 3448 kilometers, and travel time by car is about 38 hours 52 minutes.
Nanning Wuxu International Airport – Changbaishan Airport
Distance from Nanning to Baishan
There are several ways to calculate the distance from Nanning to Baishan. Here are two standard methods:
Vincenty's formula (applied above)
- 1748.929 miles
- 2814.628 kilometers
- 1519.778 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
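As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates from the tables below converted to decimal degrees. It omits Vincenty's special-case handling for coincident and near-antipodal points, and the function name is just for this example:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty's inverse formula: distance in km on the WGS-84 ellipsoid."""
    a, f = 6378137.0, 1 / 298.257223563  # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a                      # semi-minor axis
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the auxiliary longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev, lam = lam, L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000  # metres -> kilometres

# NNG: 22°36′29″N, 108°10′19″E   NBS: 42°4′0″N, 127°36′7″E
print(vincenty_distance(22.60806, 108.17194, 42.06667, 127.60194))  # ≈ 2814.6
```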
Haversine formula
- 1750.201 miles
- 2816.676 kilometers
- 1520.883 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
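For comparison, a minimal haversine sketch, assuming the conventional mean Earth radius of 6,371 km and the same decimal coordinates as above:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km between two points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

print(haversine_distance(22.60806, 108.17194, 42.06667, 127.60194))  # ≈ 2816.7
```

The roughly 2 km gap between the two results reflects the spherical versus ellipsoidal Earth models, not an error in either calculation.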
How long does it take to fly from Nanning to Baishan?
The estimated flight time from Nanning Wuxu International Airport to Changbaishan Airport is 3 hours and 48 minutes.
What is the time difference between Nanning and Baishan?
There is no time difference between Nanning and Baishan. Both cities observe China Standard Time (UTC+8).
Flight carbon footprint between Nanning Wuxu International Airport (NNG) and Changbaishan Airport (NBS)
On average, flying from Nanning to Baishan generates about 196 kg (roughly 433 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
Map of flight path and driving directions from Nanning to Baishan
See the map of the shortest flight path between Nanning Wuxu International Airport (NNG) and Changbaishan Airport (NBS).
Airport information
| Origin | Nanning Wuxu International Airport |
| --- | --- |
| City: | Nanning |
| Country: | China |
| IATA Code: | NNG |
| ICAO Code: | ZGNN |
| Coordinates: | 22°36′29″N, 108°10′19″E |

| Destination | Changbaishan Airport |
| --- | --- |
| City: | Baishan |
| Country: | China |
| IATA Code: | NBS |
| ICAO Code: | ZYBS |
| Coordinates: | 42°4′0″N, 127°36′7″E |