
How far is Fuyuan from Okinoerabujima?

The distance between Okinoerabujima (Okierabu Airport) and Fuyuan (Fuyuan Dongji Airport) is 1465 miles / 2358 kilometers / 1273 nautical miles.

The driving distance from Okinoerabujima (OKE) to Fuyuan (FYJ) is 2142 miles / 3447 kilometers, and travel time by car is about 142 hours 49 minutes.

Okierabu Airport – Fuyuan Dongji Airport

1465 miles / 2358 kilometers / 1273 nautical miles


Distance from Okinoerabujima to Fuyuan

There are several ways to calculate the distance from Okinoerabujima to Fuyuan. Here are two standard methods:

Vincenty's formula (applied above)
  • 1464.899 miles
  • 2357.527 kilometers
  • 1272.963 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
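
For reference, the sketch below is a compact implementation of Vincenty's inverse method. It assumes the WGS-84 ellipsoid, the usual choice for this calculation, though the exact parameters behind the figure above are not stated on this page.

    # Vincenty's inverse formula on the WGS-84 ellipsoid (an assumption;
    # the page does not state which ellipsoid it uses). Returns kilometers.
    from math import radians, sin, cos, tan, atan, atan2, sqrt

    def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12):
        a, f = 6378137.0, 1 / 298.257223563        # WGS-84 axis (m), flattening
        b = (1 - f) * a
        U1 = atan((1 - f) * tan(radians(lat1)))    # reduced latitudes
        U2 = atan((1 - f) * tan(radians(lat2)))
        L = lam = radians(lon2 - lon1)
        for _ in range(200):                       # iterate lambda to convergence
            sin_s = sqrt((cos(U2) * sin(lam)) ** 2 +
                         (cos(U1) * sin(U2) - sin(U1) * cos(U2) * cos(lam)) ** 2)
            cos_s = sin(U1) * sin(U2) + cos(U1) * cos(U2) * cos(lam)
            sigma = atan2(sin_s, cos_s)
            sin_a = cos(U1) * cos(U2) * sin(lam) / sin_s
            cos2_a = 1 - sin_a ** 2
            cos_2sm = cos_s - 2 * sin(U1) * sin(U2) / cos2_a
            C = f / 16 * cos2_a * (4 + f * (4 - 3 * cos2_a))
            lam_old = lam
            lam = L + (1 - C) * f * sin_a * (sigma + C * sin_s * (
                cos_2sm + C * cos_s * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_old) < tol:
                break
        u2 = cos2_a * (a * a - b * b) / (b * b)
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_s * (cos_2sm + B / 4 * (
            cos_s * (-1 + 2 * cos_2sm ** 2) -
            B / 6 * cos_2sm * (-3 + 4 * sin_s ** 2) * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma) / 1000.0  # meters -> kilometers

    # OKE (27°25′31″N, 128°42′3″E) to FYJ (48°11′58″N, 134°21′59″E), decimal degrees
    print(f"{vincenty_km(27.4253, 128.7008, 48.1994, 134.3664):.1f} km")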

Haversine formula
  • 1467.263 miles
  • 2361.331 kilometers
  • 1275.017 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
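
As a concrete check, the haversine version below uses a mean Earth radius of 6371 km (a common convention; the exact radius this page uses is an assumption) and the airport coordinates listed under "Airport information", converted to decimal degrees. It reproduces the figures above to within rounding.

    # Haversine great-circle distance, assuming a 6371 km mean Earth radius.
    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2):
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlam = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
        return 2 * 6371.0 * asin(sqrt(a))

    km = haversine_km(27.4253, 128.7008, 48.1994, 134.3664)   # OKE -> FYJ
    print(f"{km * 0.621371:.0f} mi / {km:.0f} km / {km / 1.852:.0f} nmi")
    # -> 1467 mi / 2361 km / 1275 nmi, matching the haversine figures above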

How long does it take to fly from Okinoerabujima to Fuyuan?

The estimated flight time from Okierabu Airport to Fuyuan Dongji Airport is 3 hours and 16 minutes.
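
One common rule of thumb behind such estimates is cruise time at a typical jet speed plus a fixed allowance for take-off and landing. The sketch below uses generic values (500 mph and 30 minutes, both assumptions, not the calculator's published model), so it lands near, though not exactly on, the quoted 3 hours 16 minutes.

    # Rough flight-time estimate: cruise time plus a fixed taxi/climb allowance.
    # The 500 mph cruise speed and 30-minute overhead are rule-of-thumb values.
    def flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        minutes = distance_miles / cruise_mph * 60 + overhead_min
        return divmod(round(minutes), 60)

    h, m = flight_time(1465)
    print(f"about {h} h {m} min")   # about 3 h 26 min with these assumptions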

Flight carbon footprint between Okierabu Airport (OKE) and Fuyuan Dongji Airport (FYJ)

On average, flying from Okinoerabujima to Fuyuan generates about 177 kg of CO2 per passenger, or roughly 390 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
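
Dividing the 177 kg figure by the 1465-mile distance implies an emissions factor of roughly 0.12 kg of CO2 per passenger-mile. The sketch below linearizes the estimate with that factor; it is derived from this page's numbers rather than an official methodology, and real factors vary with aircraft, load factor, and flight length.

    # Back-of-the-envelope CO2 estimate using the factor implied by this route
    # (177 kg / 1465 mi ≈ 0.121 kg CO2 per passenger-mile) -- an assumption,
    # not an official emissions methodology.
    KG_PER_PASSENGER_MILE = 177 / 1465
    KG_TO_LBS = 2.20462

    kg = 1465 * KG_PER_PASSENGER_MILE
    print(f"{kg:.0f} kg CO2 ≈ {kg * KG_TO_LBS:.0f} lbs")   # 177 kg ≈ 390 lbs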

Map of flight path and driving directions from Okinoerabujima to Fuyuan

See the map of the shortest flight path between Okierabu Airport (OKE) and Fuyuan Dongji Airport (FYJ).

Airport information

Origin: Okierabu Airport
City: Okinoerabujima
Country: Japan
IATA Code: OKE
ICAO Code: RJKB
Coordinates: 27°25′31″N, 128°42′3″E
Destination: Fuyuan Dongji Airport
City: Fuyuan
Country: China
IATA Code: FYJ
ICAO Code: ZYFY
Coordinates: 48°11′58″N, 134°21′59″E
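
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on the page take decimal degrees. A small conversion sketch (the helper name is illustrative):

    # Convert degrees/minutes/seconds to the decimal degrees used by the
    # distance formulas above (helper name is illustrative).
    def dms_to_decimal(deg, minutes, seconds):
        return deg + minutes / 60 + seconds / 3600

    print(round(dms_to_decimal(27, 25, 31), 4))    # OKE latitude  -> 27.4253
    print(round(dms_to_decimal(128, 42, 3), 4))    # OKE longitude -> 128.7008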