How far is Xuzhou from Naha?
The distance between Naha (Naha Airport) and Xuzhou (Xuzhou Guanyin International Airport) is 838 miles / 1348 kilometers / 728 nautical miles.
The driving distance from Naha (OKA) to Xuzhou (XUZ) is 2206 miles / 3550 kilometers, and travel time by car is about 169 hours 6 minutes.
Naha Airport – Xuzhou Guanyin International Airport
Distance from Naha to Xuzhou
There are several ways to calculate the distance from Naha to Xuzhou. Here are two standard methods:
Vincenty's formula (applied above)
- 837.624 miles
- 1348.025 kilometers
- 727.875 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
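As a check on the figure above, here is a textbook Python implementation of Vincenty's inverse method on the WGS-84 ellipsoid (the iteration cap and convergence tolerance are implementation choices; the airport coordinates are the decimal form of those listed in the tables below):

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sU1, cU1, sU2, cU2 = math.sin(U1), math.cos(U1), math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sl, cl = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cU2 * sl, cU1 * sU2 - sU1 * cU2 * cl)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sU1 * sU2 + cU1 * cU2 * cl
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cU1 * cU2 * sl / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sU1 * sU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma *
              (cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (cos_sigma * (2 * cos_2sm ** 2 - 1)
              - B / 6 * cos_2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
    return b * A * (sigma - d_sigma)

# OKA -> XUZ, decimal degrees from the airport tables below
m = vincenty_distance_m(26.195556, 127.645833, 34.288056, 117.170833)
print(f"{m / 1609.344:.3f} mi / {m / 1000:.3f} km / {m / 1852:.3f} nmi")
# ~837.6 mi / 1348.0 km / 727.9 nmi
```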
Haversine formula
- 837.839 miles
- 1348.371 kilometers
- 728.062 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
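The spherical version is far shorter because no iteration is needed. A minimal sketch, assuming the conventional mean earth radius of 6,371 km (the exact radius this page's calculator uses is not stated):

```python
import math

def haversine_distance_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere (km). radius_km = 6371 is the
    conventional mean earth radius, an assumption here."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# OKA -> XUZ, decimal degrees from the airport tables below
print(haversine_distance_km(26.195556, 127.645833, 34.288056, 117.170833))
# ~1348.4 km, matching the haversine figure above
```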
How long does it take to fly from Naha to Xuzhou?
The estimated flight time from Naha Airport to Xuzhou Guanyin International Airport is 2 hours and 5 minutes.
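The page does not publish its timing formula, but estimates like this are commonly built as a fixed taxi/climb allowance plus cruise time over the great-circle distance. A hypothetical sketch, assuming a 30-minute overhead and a 500 mph cruise speed (both assumptions, not the site's documented parameters):

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: fixed taxi/climb overhead plus cruise
    time. Both default parameters are assumptions, not this site's method."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes:02d} min"

print(estimate_flight_time(838))  # "2 h 11 min", close to the 2 h 5 min quoted above
```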
What is the time difference between Naha and Xuzhou?
The time difference between Naha and Xuzhou is 1 hour: Xuzhou is 1 hour behind Naha.
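This can be verified with Python's standard-library zoneinfo module: Naha observes Japan Standard Time (UTC+9) and Xuzhou, like all of mainland China, observes China Standard Time (UTC+8); neither uses daylight saving time.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library since Python 3.9

now = datetime.now(ZoneInfo("UTC"))
offset_naha = now.astimezone(ZoneInfo("Asia/Tokyo")).utcoffset()       # UTC+9
offset_xuzhou = now.astimezone(ZoneInfo("Asia/Shanghai")).utcoffset()  # UTC+8
print(offset_naha - offset_xuzhou)  # 1:00:00 -> Xuzhou is 1 hour behind Naha
```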
Flight carbon footprint between Naha Airport (OKA) and Xuzhou Guanyin International Airport (XUZ)
On average, flying from Naha to Xuzhou generates about 138 kg of CO2 per passenger, which is equivalent to roughly 304 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
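Per-passenger CO2 on estimates like this scales roughly linearly with distance, so the quoted figure implies a rate of about 0.165 kg per mile (138 kg ÷ 838 mi). A sketch using that implied rate (derived from the numbers above, not an official emission factor):

```python
def co2_estimate_kg(distance_miles, kg_per_mile=138 / 838):
    """Per-passenger CO2 estimate; the default rate (~0.165 kg/mile) is
    implied by this route's figures, not an official emission factor."""
    return distance_miles * kg_per_mile

kg = co2_estimate_kg(838)
print(f"{kg:.0f} kg = {kg * 2.20462:.0f} lbs")  # 138 kg = 304 lbs
```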
Map of flight path and driving directions from Naha to Xuzhou
See the map of the shortest flight path between Naha Airport (OKA) and Xuzhou Guanyin International Airport (XUZ).
Airport information
| Origin | Naha Airport |
| --- | --- |
| City | Naha |
| Country | Japan |
| IATA Code | OKA |
| ICAO Code | ROAH |
| Coordinates | 26°11′44″N, 127°38′45″E |
| Destination | Xuzhou Guanyin International Airport |
| --- | --- |
| City | Xuzhou |
| Country | China |
| IATA Code | XUZ |
| ICAO Code | ZSXZ |
| Coordinates | 34°17′17″N, 117°10′15″E |
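The coordinates above are given in degrees-minutes-seconds, while the distance functions sketched earlier take decimal degrees. A small converter (a generic formula, nothing site-specific):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Naha Airport (OKA) and Xuzhou Guanyin International Airport (XUZ), per the tables above
oka = (dms_to_decimal(26, 11, 44, "N"), dms_to_decimal(127, 38, 45, "E"))
xuz = (dms_to_decimal(34, 17, 17, "N"), dms_to_decimal(117, 10, 15, "E"))
print(oka)  # (26.195555..., 127.645833...)
print(xuz)  # (34.288055..., 117.170833...)
```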