How far is Masuda from Nanjing?

The distance between Nanjing (Nanjing Lukou International Airport) and Masuda (Iwami Airport) is 775 miles / 1247 kilometers / 674 nautical miles.

The driving distance from Nanjing (NKG) to Masuda (IWJ) is 1867 miles / 3004 kilometers, and travel time by car is about 38 hours 12 minutes.

Distance from Nanjing to Masuda

There are several ways to calculate the distance from Nanjing to Masuda. Here are two standard methods:

Vincenty's formula (applied above)
  • 775.120 miles
  • 1247.434 kilometers
  • 673.561 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
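
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates are converted from the DMS values in the Airport information section below; this site's own implementation may differ in minor details.

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        """Inverse Vincenty: geodesic distance in metres on the WGS-84 ellipsoid."""
        a = 6378137.0              # WGS-84 semi-major axis (m)
        f = 1 / 298.257223563      # WGS-84 flattening
        b = (1 - f) * a            # semi-minor axis (m)

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0                     # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:      # converged
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma)

    # NKG and IWJ in decimal degrees (converted from the DMS coordinates below)
    metres = vincenty_distance(31.74194, 118.86194, 34.67639, 131.78972)
    print(metres / 1609.344)  # ≈ 775.1 miles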

Haversine formula
  • 773.725 miles
  • 1245.189 kilometers
  • 672.348 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
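
A matching haversine sketch in Python; assuming the commonly used mean Earth radius of 6,371 km, it reproduces the figure above to within rounding.

    import math

    def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance assuming a sphere of mean Earth radius."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(a)) / 1.609344

    print(haversine_miles(31.74194, 118.86194, 34.67639, 131.78972))  # ≈ 773.7 miles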

How long does it take to fly from Nanjing to Masuda?

The estimated flight time from Nanjing Lukou International Airport to Iwami Airport is 1 hour and 58 minutes.
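
The site does not state how this estimate is derived. A common rule of thumb adds a fixed allowance for taxi, takeoff, and landing to cruise time at a typical jet speed; the sketch below uses assumed constants (500 mph cruise, 30 minutes overhead) and lands near, but not exactly on, the quoted time.

    def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
        # Rule-of-thumb estimate: a fixed taxi/takeoff/landing allowance plus
        # cruise time at an assumed average speed. Both constants are
        # assumptions, not this site's published values.
        return overhead_min + distance_miles / cruise_mph * 60

    print(flight_time_minutes(775.12))  # ≈ 123 min, in the ballpark of the 118 min above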

Flight carbon footprint between Nanjing Lukou International Airport (NKG) and Iwami Airport (IWJ)

On average, flying from Nanjing to Masuda generates about 132 kg (292 lb) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.
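
For illustration, the figure implies roughly 0.106 kg of CO2 per passenger-kilometre on this route. The sketch below uses that back-calculated factor; it is an assumption derived from the numbers above, not a published emission factor.

    def co2_per_passenger_kg(distance_km, kg_per_pax_km=0.106):
        # The factor is back-calculated from this page's 132 kg over 1247 km;
        # real emission factors vary by aircraft type and load factor.
        return distance_km * kg_per_pax_km

    kg = co2_per_passenger_kg(1247.434)
    print(round(kg), round(kg * 2.20462))  # ≈ 132 kg, ≈ 292 lb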

Map of flight path and driving directions from Nanjing to Masuda

See the map of the shortest flight path between Nanjing Lukou International Airport (NKG) and Iwami Airport (IWJ).

Airport information

Origin: Nanjing Lukou International Airport
City: Nanjing
Country: China
IATA Code: NKG
ICAO Code: ZSNJ
Coordinates: 31°44′31″N, 118°51′43″E

Destination: Iwami Airport
City: Masuda
Country: Japan
IATA Code: IWJ
ICAO Code: RJOW
Coordinates: 34°40′35″N, 131°47′23″E
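
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on the page expect decimal degrees. A small converter (the function name and argument layout are illustrative only):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        # Southern and western hemispheres take a negative sign.
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    print(dms_to_decimal(31, 44, 31, "N"))   # 31.74194... (NKG latitude)
    print(dms_to_decimal(118, 51, 43, "E"))  # 118.86194... (NKG longitude)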