
How far is Jixi from Iwakuni?

The distance between Iwakuni (Marine Corps Air Station Iwakuni) and Jixi (Jixi Xingkaihu Airport) is 771 miles / 1241 kilometers / 670 nautical miles.

The driving distance from Iwakuni (IWK) to Jixi (JXA) is 1412 miles / 2273 kilometers, and travel time by car is about 30 hours 15 minutes.

Marine Corps Air Station Iwakuni – Jixi Xingkaihu Airport

771 miles / 1241 kilometers / 670 nautical miles


Distance from Iwakuni to Jixi

There are several ways to calculate the distance from Iwakuni to Jixi. Here are two standard methods:

Vincenty's formula (applied above)
  • 771.170 miles
  • 1241.078 kilometers
  • 670.129 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
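
For the curious, here is a minimal, self-contained Python sketch of the inverse Vincenty algorithm on the WGS-84 ellipsoid. This is the textbook method, not this site's actual code; the function name, iteration cap, and convergence tolerance are our own choices.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty: ellipsoidal distance in metres between two points."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                     # iterate lambda to convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0.0:
            return 0.0                       # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1.0 - sinAlpha ** 2
        if cos2Alpha != 0.0:
            cos2SigmaM = cosSigma - 2.0 * sinU1 * sinU2 / cos2Alpha
        else:
            cos2SigmaM = 0.0                 # geodesic along the equator
        C = f / 16.0 * cos2Alpha * (4.0 + f * (4.0 - 3.0 * cos2Alpha))
        lamPrev = lam
        lam = L + (1.0 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1.0 + 2.0 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
                            * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma)

# IWK -> JXA, using the coordinates listed under "Airport information" below
s = vincenty_distance(34.1439, 132.2358, 45.2928, 131.1928)
print(f"{s / 1609.344:.1f} mi / {s / 1000:.1f} km / {s / 1852:.1f} NM")
# -> approximately 771.2 mi / 1241.1 km / 670.1 NM
```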

Haversine formula
  • 772.302 miles
  • 1242.899 kilometers
  • 671.112 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface of the sphere).
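
A haversine implementation is much shorter. The sketch below, again our own illustration with an assumed mean Earth radius of 6371 km, reproduces the figure above:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

print(haversine_distance(34.1439, 132.2358, 45.2928, 131.1928))
# -> approximately 1242.9 km
```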

How long does it take to fly from Iwakuni to Jixi?

The estimated flight time from Marine Corps Air Station Iwakuni to Jixi Xingkaihu Airport is 1 hour and 57 minutes.
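
The assumptions behind this estimate are not published here. A common rule of thumb, sketched below with assumed values for average cruise speed and a fixed taxi/climb/descent allowance, lands within a few minutes of the quoted figure:

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    # cruise_mph and overhead_min are assumed values, not this site's formula
    total_min = overhead_min + distance_miles / cruise_mph * 60
    return divmod(round(total_min), 60)

hours, minutes = estimated_flight_time(771.170)
print(f"{hours} h {minutes} min")  # -> 2 h 3 min under these assumptions
```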

Flight carbon footprint between Marine Corps Air Station Iwakuni (IWK) and Jixi Xingkaihu Airport (JXA)

On average, flying from Iwakuni to Jixi generates about 132 kg of CO2 per passenger, which is about 291 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
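
As a back-of-the-envelope check (our own sketch, not this site's methodology), the figures above imply a per-passenger factor of roughly 0.17 kg of CO2 per mile flown:

```python
KG_CO2_PER_MILE = 132 / 771.170   # factor implied by the figures above (~0.171)
KG_TO_LBS = 2.20462

def co2_estimate_kg(distance_miles, factor=KG_CO2_PER_MILE):
    # Linear per-mile estimate; real calculators also weigh aircraft type,
    # seating density, load factor, and routing.
    return distance_miles * factor

kg = co2_estimate_kg(771.170)
print(f"{kg:.0f} kg = {kg * KG_TO_LBS:.0f} lbs")  # -> 132 kg = 291 lbs
```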

Map of flight path and driving directions from Iwakuni to Jixi

See the map of the shortest flight path between Marine Corps Air Station Iwakuni (IWK) and Jixi Xingkaihu Airport (JXA).

Airport information

Origin: Marine Corps Air Station Iwakuni
City: Iwakuni
Country: Japan
IATA Code: IWK
ICAO Code: RJOI
Coordinates: 34°8′38″N, 132°14′9″E

Destination: Jixi Xingkaihu Airport
City: Jixi
Country: China
IATA Code: JXA
ICAO Code: ZYJX
Coordinates: 45°17′34″N, 131°11′34″E
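
To feed the coordinates above into the distance functions sketched earlier, they first need converting from degrees-minutes-seconds to decimal degrees. A small hypothetical helper:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 34°8′38″N to decimal degrees."""
    deg, minutes, seconds, hemi = re.fullmatch(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

print(dms_to_decimal("34°8′38″N"))    # -> ~34.1439 (IWK latitude)
print(dms_to_decimal("131°11′34″E"))  # -> ~131.1928 (JXA longitude)
```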