How far is Xuzhou from Beijing?
The flight distance between Beijing (Beijing Nanyuan Airport) and Xuzhou (Xuzhou Guanyin International Airport) is 381 miles / 614 kilometers / 331 nautical miles.
The driving distance from Beijing (NAY) to Xuzhou (XUZ) is 426 miles / 685 kilometers, and travel time by car is about 7 hours 51 minutes.
Beijing Nanyuan Airport – Xuzhou Guanyin International Airport
Distance from Beijing to Xuzhou
There are several ways to calculate the distance from Beijing to Xuzhou. Here are two standard methods:
Vincenty's formula (applied above)
- 381.368 miles
- 613.752 kilometers
- 331.399 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
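As a rough illustration (not this site's own code), an equivalent ellipsoidal calculation can be reproduced with the pyproj library, whose Geod class solves the inverse geodesic problem on the WGS-84 ellipsoid. The decimal coordinates are converted from the airport tables at the end of this page.

```python
# A minimal sketch (not the site's own code): ellipsoidal distance between
# NAY and XUZ using pyproj's Geod, which solves the inverse geodesic problem
# on the WGS-84 ellipsoid (in practice equivalent to Vincenty's method).
from pyproj import Geod

# Airport coordinates from the tables below, converted to decimal degrees.
nay_lat, nay_lon = 39.7828, 116.3878   # Beijing Nanyuan Airport (NAY)
xuz_lat, xuz_lon = 34.2881, 117.1708   # Xuzhou Guanyin Intl Airport (XUZ)

geod = Geod(ellps="WGS84")
_, _, dist_m = geod.inv(nay_lon, nay_lat, xuz_lon, xuz_lat)  # note lon, lat order

print(f"{dist_m / 1000:.1f} km")         # ~614 km
print(f"{dist_m / 1609.344:.1f} miles")  # ~381 miles
print(f"{dist_m / 1852:.1f} nm")         # ~331 nautical miles
```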
Haversine formula
- 382.091 miles
- 614.915 kilometers
- 332.028 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
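For comparison, here is a minimal sketch of the haversine (great-circle) calculation in plain Python, using the same airport coordinates and a mean Earth radius of 6,371 km; it reproduces the figure above to within rounding.

```python
# A minimal sketch of the haversine (great-circle) formula in plain Python,
# assuming a spherical earth with mean radius 6,371 km.
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points, in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# NAY -> XUZ with the coordinates listed in the airport tables below.
d_km = haversine_km(39.7828, 116.3878, 34.2881, 117.1708)
print(f"{d_km:.1f} km, {d_km / 1.609344:.1f} miles, {d_km / 1.852:.1f} nm")
# ~614.9 km / ~382.1 miles / ~332.0 nautical miles
```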
How long does it take to fly from Beijing to Xuzhou?
The estimated flight time from Beijing Nanyuan Airport to Xuzhou Guanyin International Airport is 1 hour and 13 minutes.
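Estimates like this are typically derived from the flight distance and an assumed average speed. The sketch below uses a hypothetical 30-minute allowance for takeoff and landing plus a 500 mph average cruise speed (the site's exact assumptions are not stated), which lands within a few minutes of the figure quoted above.

```python
# A rough sketch of how such an estimate can be derived (the site's exact
# assumptions are not stated): flight distance divided by an assumed average
# cruise speed, plus a fixed allowance for takeoff and landing.
DISTANCE_MI = 381.368   # great-circle distance from the figures above
CRUISE_MPH = 500        # assumed average cruise speed (assumption)
OVERHEAD_MIN = 30       # assumed takeoff/landing allowance (assumption)

total_min = OVERHEAD_MIN + DISTANCE_MI / CRUISE_MPH * 60
hours, minutes = divmod(round(total_min), 60)
print(f"~{hours} h {minutes} min")  # ~1 h 16 min, close to the quoted 1 h 13 min
```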
What is the time difference between Beijing and Xuzhou?
There is no time difference: both Beijing and Xuzhou are in China and observe China Standard Time (UTC+8).
Flight carbon footprint between Beijing Nanyuan Airport (NAY) and Xuzhou Guanyin International Airport (XUZ)
On average, flying from Beijing to Xuzhou generates about 81 kg of CO2 per passenger (81 kilograms is equal to 179 pounds / lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
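As a quick check on the unit conversion and the per-kilometer intensity implied by these figures (using only the numbers quoted on this page):

```python
# Quick check of the quoted figures (document numbers only, no new data).
CO2_KG = 81        # quoted per-passenger CO2 for this flight
FLIGHT_KM = 614    # quoted flight distance

print(f"{CO2_KG * 2.20462:.0f} lbs")              # ~179 lbs, matching the text
print(f"{CO2_KG / FLIGHT_KM:.2f} kg CO2 per km")  # ~0.13 kg per passenger-km
```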
Map of flight path and driving directions from Beijing to Xuzhou
See the map of the shortest flight path between Beijing Nanyuan Airport (NAY) and Xuzhou Guanyin International Airport (XUZ).
Airport information
| Origin | Beijing Nanyuan Airport |
| --- | --- |
| City: | Beijing |
| Country: | China |
| IATA Code: | NAY |
| ICAO Code: | ZBNY |
| Coordinates: | 39°46′58″N, 116°23′16″E |
| Destination | Xuzhou Guanyin International Airport |
| --- | --- |
| City: | Xuzhou |
| Country: | China |
| IATA Code: | XUZ |
| ICAO Code: | ZSXZ |
| Coordinates: | 34°17′17″N, 117°10′15″E |