
How far is Shijiazhuang from Yonaguni Jima?

The distance between Yonaguni Jima (Yonaguni Airport) and Shijiazhuang (Shijiazhuang Zhengding International Airport) is 1069 miles / 1720 kilometers / 929 nautical miles.

Yonaguni Airport – Shijiazhuang Zhengding International Airport

1069 miles / 1720 kilometers / 929 nautical miles


Distance from Yonaguni Jima to Shijiazhuang

There are several ways to calculate the distance from Yonaguni Jima to Shijiazhuang. Here are two standard methods:

Vincenty's formula (applied above)
  • 1069.023 miles
  • 1720.425 kilometers
  • 928.955 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
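As an illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The ellipsoid choice and the convergence tolerance are assumptions; the calculator does not state which parameters or implementation it uses.

```python
# A sketch of Vincenty's inverse formula on the WGS-84 ellipsoid (assumed).
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    lam = L
    for _ in range(max_iter):
        sin_sigma = sqrt((cos(U2) * sin(lam)) ** 2 +
                         (cos(U1) * sin(U2) - sin(U1) * cos(U2) * cos(lam)) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sin(U1) * sin(U2) + cos(U1) * cos(U2) * cos(lam)
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cos(U1) * cos(U2) * sin(lam) / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sin(U1) * sin(U2) / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000  # meters -> kilometers

# OGN to SJW; should land close to the 1720.425 km figure above,
# within rounding of the airport coordinates.
print(vincenty_km(24.4667, 122.9778, 38.2806, 114.6969))  # ≈ 1720.4 km
```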

Haversine formula
  • 1070.996 miles
  • 1723.601 kilometers
  • 930.670 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
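For comparison, a compact Python version of the haversine computation. The 6371 km mean Earth radius is the conventional choice and an assumption here; it is also why the spherical results differ slightly from Vincenty's ellipsoidal figures.

```python
# Minimal haversine sketch: great-circle distance on a sphere.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    R = 6371.0  # mean Earth radius in km (spherical assumption)
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * R * asin(sqrt(a))

# Yonaguni Airport (OGN) to Shijiazhuang Zhengding (SJW)
print(haversine_km(24.4667, 122.9778, 38.2806, 114.6969))  # ≈ 1723.6 km
```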

How long does it take to fly from Yonaguni Jima to Shijiazhuang?

The estimated flight time from Yonaguni Airport to Shijiazhuang Zhengding International Airport is 2 hours and 31 minutes.
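The calculator does not publish its timing model. A common rule of thumb is cruise time at a typical jet speed plus a fixed allowance for taxi, climb, and descent; the constants below are assumptions, not the site's published values, so the result differs slightly from the 2 h 31 min shown.

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight time: cruise time plus a fixed taxi/climb/descent allowance.
    Both constants are assumptions, not the calculator's published values."""
    return overhead_min + distance_miles / cruise_mph * 60

m = estimate_flight_minutes(1069)
print(f"{int(m // 60)} h {int(m % 60)} min")  # ≈ 2 h 38 min with these assumed constants
```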

Flight carbon footprint between Yonaguni Airport (OGN) and Shijiazhuang Zhengding International Airport (SJW)

On average, flying from Yonaguni Jima to Shijiazhuang generates about 155 kg of CO2 per passenger, which is equivalent to 342 pounds (lbs). These figures are estimates that include only the CO2 generated by burning jet fuel.
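One way to reproduce a figure like this is to multiply the distance by an average per-passenger emission factor. The factor below is an assumption chosen to roughly match the stated estimate, not the site's published methodology.

```python
def co2_kg(distance_km, factor_kg_per_pkm=0.09):
    """CO2 per passenger from an assumed average emission factor
    (~0.09 kg CO2 per passenger-km; the calculator's factor is not published)."""
    return distance_km * factor_kg_per_pkm

kg = co2_kg(1720)
print(f"{kg:.0f} kg ≈ {kg * 2.20462:.0f} lbs")  # ≈ 155 kg ≈ 341 lbs
```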

Map of flight path from Yonaguni Jima to Shijiazhuang

See the map of the shortest flight path between Yonaguni Airport (OGN) and Shijiazhuang Zhengding International Airport (SJW).

Airport information

Origin: Yonaguni Airport
City: Yonaguni Jima
Country: Japan
IATA Code: OGN
ICAO Code: ROYN
Coordinates: 24°28′0″N, 122°58′40″E
Destination: Shijiazhuang Zhengding International Airport
City: Shijiazhuang
Country: China
IATA Code: SJW
ICAO Code: ZBSJ
Coordinates: 38°16′50″N, 114°41′49″E
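The distance formulas above expect decimal degrees, so a small helper for converting the DMS coordinates listed here may be useful. The hemisphere convention (N/E positive, S/W negative) is the standard one and is assumed.

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

ogn = (dms_to_decimal(24, 28, 0, "N"), dms_to_decimal(122, 58, 40, "E"))
sjw = (dms_to_decimal(38, 16, 50, "N"), dms_to_decimal(114, 41, 49, "E"))
print(ogn)  # (24.4667, 122.9778)
print(sjw)  # (38.2806, 114.6969)
```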