How far is Xuzhou from Nan?

The distance between Nan (Nan Nakhon Airport) and Xuzhou (Xuzhou Guanyin International Airport) is 1467 miles / 2361 kilometers / 1275 nautical miles.

The driving distance from Nan (NNT) to Xuzhou (XUZ) is 1948 miles / 3135 kilometers, and travel time by car is about 37 hours 45 minutes.

Distance from Nan to Xuzhou

There are several ways to calculate the distance from Nan to Xuzhou. Here are two standard methods:

Vincenty's formula (applied above)
  • 1467.363 miles
  • 2361.492 kilometers
  • 1275.104 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
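
For readers who want to reproduce the figure above, here is a minimal Python sketch of the inverse Vincenty method on the WGS-84 ellipsoid. The function name, tolerance, and iteration cap are illustrative choices, not the calculator's actual code; the coordinates come from the airport information section below.

```python
import math

# WGS-84 ellipsoid parameters
A = 6378137.0            # semi-major axis in metres
F = 1 / 298.257223563    # flattening
B = A * (1 - F)          # semi-minor axis

def vincenty(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty: geodesic distance in metres between two points.

    Iterative; may fail to converge for nearly antipodal points.
    """
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0                      # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)  # equatorial line
        C = F / 16.0 * cos2_alpha * (4.0 + F * (4.0 - 3.0 * cos2_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    big_a = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    big_b = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4.0 * (
        cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)
        - big_b / 6.0 * cos_2sm * (-3.0 + 4.0 * sin_sigma ** 2)
                                 * (-3.0 + 4.0 * cos_2sm ** 2)))
    return B * big_a * (sigma - d_sigma)

# Coordinates from the airport information section below (DMS -> decimal)
nnt = (18 + 48/60 + 28/3600, 100 + 46/60 + 58/3600)   # 18°48′28″N, 100°46′58″E
xuz = (34 + 17/60 + 17/3600, 117 + 10/60 + 15/3600)   # 34°17′17″N, 117°10′15″E

metres = vincenty(*nnt, *xuz)
print(f"{metres / 1609.344:.3f} miles")    # ≈ 1467.4
print(f"{metres / 1000:.3f} kilometers")   # ≈ 2361.5
print(f"{metres / 1852:.3f} nautical mi")  # ≈ 1275.1
```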

Haversine formula
  • 1468.884 miles
  • 2363.940 kilometers
  • 1276.425 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface).
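
For comparison, here is a haversine sketch in Python. The 6371 km mean Earth radius is an assumed value (the page does not state which radius it uses), but it reproduces the quoted figure to within rounding of the DMS coordinates:

```python
import math

EARTH_RADIUS_KM = 6371.0  # assumed mean Earth radius

def haversine(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres on a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

nnt = (18 + 48/60 + 28/3600, 100 + 46/60 + 58/3600)   # NNT
xuz = (34 + 17/60 + 17/3600, 117 + 10/60 + 15/3600)   # XUZ
print(f"{haversine(*nnt, *xuz):.3f} km")  # ≈ 2363.9
```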

How long does it take to fly from Nan to Xuzhou?

The estimated flight time from Nan Nakhon Airport to Xuzhou Guanyin International Airport is 3 hours and 16 minutes.
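
The page does not state how this figure is derived. A common heuristic (purely an assumption here, not the calculator's documented method) is a fixed taxi/climb/descent allowance plus cruise time at an assumed average speed:

```python
def estimate_block_time(distance_miles: float,
                        cruise_mph: float = 500.0,
                        overhead_min: float = 30.0) -> str:
    """Rough block-time estimate: fixed overhead plus cruise at an assumed
    average speed. Both defaults are assumptions; actual schedules depend on
    aircraft type, routing, and winds."""
    total_hours = distance_miles / cruise_mph + overhead_min / 60.0
    hours = int(total_hours)
    minutes = round((total_hours - hours) * 60)
    return f"{hours} hours {minutes} minutes"

print(estimate_block_time(1467))  # -> "3 hours 26 minutes" with these defaults
```

With these particular defaults the estimate comes out about ten minutes longer than the quoted 3 hours 16 minutes, so the calculator evidently assumes a somewhat faster average speed or a smaller fixed overhead.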

Flight carbon footprint between Nan Nakhon Airport (NNT) and Xuzhou Guanyin International Airport (XUZ)

On average, flying from Nan to Xuzhou generates about 177 kg of CO2 per passenger; 177 kilograms equals 390 pounds (lbs). These figures are estimates that include only the CO2 generated by burning jet fuel.
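
The pounds figure is a direct unit conversion of the page's 177 kg estimate; only the standard conversion factor is added here:

```python
KG_TO_LB = 2.2046226218   # pounds per kilogram
co2_kg = 177              # per-passenger estimate quoted above
print(f"{co2_kg * KG_TO_LB:.0f} lbs")  # -> 390
```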

Map of flight path and driving directions from Nan to Xuzhou

See the map of the shortest flight path between Nan Nakhon Airport (NNT) and Xuzhou Guanyin International Airport (XUZ).

Airport information

Origin Nan Nakhon Airport
City: Nan
Country: Thailand
IATA Code: NNT
ICAO Code: VTCN
Coordinates: 18°48′28″N, 100°46′58″E
Destination Xuzhou Guanyin International Airport
City: Xuzhou
Country: China
IATA Code: XUZ
ICAO Code: ZSXZ
Coordinates: 34°17′17″N, 117°10′15″E