
How far is Xuzhou from Yanbu?

The distance between Yanbu (Yanbu Airport) and Xuzhou (Xuzhou Guanyin International Airport) is 4714 miles / 7587 kilometers / 4096 nautical miles.

The driving distance from Yanbu (YNB) to Xuzhou (XUZ) is 5991 miles / 9641 kilometers, and travel time by car is about 115 hours 21 minutes.

Yanbu Airport – Xuzhou Guanyin International Airport

Distance: 4714 miles / 7587 kilometers / 4096 nautical miles


Distance from Yanbu to Xuzhou

There are several ways to calculate the distance from Yanbu to Xuzhou. Here are two standard methods:

Vincenty's formula (applied above)
  • 4714.051 miles
  • 7586.530 kilometers
  • 4096.398 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
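As a rough illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed further down converted to decimal degrees. The function name and structure are illustrative, not the site's actual implementation.

```python
import math

def vincenty_miles_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid.

    Returns the geodesic distance as (miles, kilometers)."""
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (meters)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L                    # first approximation of the longitude difference
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344, meters / 1000.0

# YNB (24°8′39″N, 38°3′48″E) and XUZ (34°17′17″N, 117°10′15″E)
miles, km = vincenty_miles_km(24.14417, 38.06333, 34.28806, 117.17083)
```

With these inputs the result lands within a few kilometers of the 7586.530 km figure quoted above; small differences come from rounding the coordinates.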

Haversine formula
  • 4705.639 miles
  • 7572.992 kilometers
  • 4089.089 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
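The haversine computation is much shorter. A self-contained sketch, assuming the common mean Earth radius of 6371 km (the exact radius the site uses is not stated):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere via the haversine formula."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# YNB and XUZ coordinates in decimal degrees
km = haversine_km(24.14417, 38.06333, 34.28806, 117.17083)
miles = km / 1.609344
```

The spherical result comes out roughly 13 km shorter than the ellipsoidal Vincenty figure, consistent with the two sets of numbers above.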

How long does it take to fly from Yanbu to Xuzhou?

The estimated flight time from Yanbu Airport to Xuzhou Guanyin International Airport is 9 hours and 25 minutes.
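The quoted duration is consistent with dividing the great-circle distance by an average speed of about 500 mph. A sketch under that assumption (the site's exact model is not published):

```python
# Rough flight-time estimate: distance divided by an assumed
# average speed of ~500 mph (an assumption, not the site's exact model).
distance_miles = 4714
avg_speed_mph = 500

total_hours = distance_miles / avg_speed_mph
hours = int(total_hours)
minutes = int((total_hours - hours) * 60)
print(f"{hours} h {minutes} min")  # → 9 h 25 min
```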

Flight carbon footprint between Yanbu Airport (YNB) and Xuzhou Guanyin International Airport (XUZ)

On average, flying from Yanbu to Xuzhou generates about 547 kg of CO2 per passenger; 547 kilograms equals 1,206 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
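The quoted figures imply an emission factor of roughly 0.116 kg of CO2 per passenger-mile (547 kg ÷ 4714 miles). A sketch using that derived factor, which is an assumption rather than the site's published methodology:

```python
distance_miles = 4714
kg_per_mile = 0.116          # implied by 547 kg / 4714 miles (assumption)

co2_kg = distance_miles * kg_per_mile
co2_lbs = co2_kg * 2.20462   # kilograms to pounds
print(round(co2_kg), round(co2_lbs))  # → 547 1206
```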

Map of flight path and driving directions from Yanbu to Xuzhou

See the map of the shortest flight path between Yanbu Airport (YNB) and Xuzhou Guanyin International Airport (XUZ).

Airport information

Origin Yanbu Airport
City: Yanbu
Country: Saudi Arabia
IATA Code: YNB
ICAO Code: OEYN
Coordinates: 24°8′39″N, 38°3′48″E
Destination Xuzhou Guanyin International Airport
City: Xuzhou
Country: China
IATA Code: XUZ
ICAO Code: ZSXZ
Coordinates: 34°17′17″N, 117°10′15″E
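The coordinates above are listed in degrees/minutes/seconds, while distance formulas such as Vincenty's and the haversine take decimal degrees. A small helper for the conversion (the function name is illustrative):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Airport coordinates from the listings above
ynb = (dms_to_decimal(24, 8, 39, "N"), dms_to_decimal(38, 3, 48, "E"))
xuz = (dms_to_decimal(34, 17, 17, "N"), dms_to_decimal(117, 10, 15, "E"))
# ynb ≈ (24.1442, 38.0633), xuz ≈ (34.2881, 117.1708)
```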