How far is Aomori from Nanjing?

The distance between Nanjing (Nanjing Lukou International Airport) and Aomori (Aomori Airport) is 1363 miles / 2193 kilometers / 1184 nautical miles.

The driving distance from Nanjing (NKG) to Aomori (AOJ) is 2701 miles / 4347 kilometers, and travel time by car is about 54 hours 29 minutes.

Nanjing Lukou International Airport – Aomori Airport

1363 miles / 2193 kilometers / 1184 nautical miles


Distance from Nanjing to Aomori

There are several ways to calculate the distance from Nanjing to Aomori. Here are two standard methods:

Vincenty's formula (applied above)
  • 1362.634 miles
  • 2192.947 kilometers
  • 1184.097 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
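
For illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The site does not publish its code, so details such as the convergence tolerance and the iteration cap below are assumptions:

    import math

    def vincenty_distance_km(lat1, lon1, lat2, lon2):
        """Vincenty inverse solution on the WGS-84 ellipsoid; returns km."""
        a = 6378137.0              # semi-major axis in metres
        f = 1 / 298.257223563      # flattening
        b = (1 - f) * a            # semi-minor axis in metres
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitude
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):       # iterate the longitude on the auxiliary sphere
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0         # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            # cos2_alpha is 0 only for equatorial paths; guard the division
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (2 * cos_2sm ** 2 - 1)
            - B / 6 * cos_2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
        return b * A * (sigma - delta_sigma) / 1000  # metres -> kilometres

With the NKG and AOJ coordinates from the airport information below, this should return roughly 2192.9 km, in line with the Vincenty figure above.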

Haversine formula
  • 1360.756 miles
  • 2189.924 kilometers
  • 1182.464 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
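
As a concrete check, here is a short Python sketch of the haversine formula, using the airport coordinates from the table below and a mean Earth radius of 6371 km (the radius choice is an assumption; other conventions shift the result slightly):

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two points on a sphere, in km."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    # NKG and AOJ coordinates (from the airport information below), in decimal degrees
    nkg_lat, nkg_lon = 31 + 44/60 + 31/3600, 118 + 51/60 + 43/3600
    aoj_lat, aoj_lon = 40 + 44/60 + 4/3600, 140 + 41/60 + 27/3600

    print(haversine_km(nkg_lat, nkg_lon, aoj_lat, aoj_lon))  # ~2189.9 km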

How long does it take to fly from Nanjing to Aomori?

The estimated flight time from Nanjing Lukou International Airport to Aomori Airport is 3 hours and 4 minutes.
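
The calculator does not publish its flight-time model. A common rule of thumb divides the distance by an assumed average speed and adds a fixed allowance for taxi, climb, and descent; the sketch below uses hypothetical parameters and lands close to, but not exactly on, the figure quoted above:

    def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        """Rule-of-thumb block time; both parameters are assumptions."""
        total_min = round(distance_miles / cruise_mph * 60 + overhead_min)
        hours, minutes = divmod(total_min, 60)
        return f"{hours} hours {minutes} minutes"

    print(estimate_flight_time(1363))  # "3 hours 14 minutes" with these assumptions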

Flight carbon footprint between Nanjing Lukou International Airport (NKG) and Aomori Airport (AOJ)

On average, flying from Nanjing to Aomori generates about 171 kg of CO2 per passenger, which is equivalent to 377 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
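
A simple way to reproduce this kind of estimate is a linear per-kilometer emissions factor. The factor below is back-calculated from the figures quoted above (171 kg over 2193 km) and is an assumption, not a published value:

    def estimate_co2_kg(distance_km, kg_co2_per_km=171 / 2193):
        """Linear CO2 estimate; the default factor is back-calculated, not official."""
        return distance_km * kg_co2_per_km

    co2_kg = estimate_co2_kg(2193)
    print(f"{co2_kg:.0f} kg = {co2_kg * 2.20462:.0f} lbs")  # 171 kg = 377 lbs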

Map of flight path and driving directions from Nanjing to Aomori

See the map of the shortest flight path between Nanjing Lukou International Airport (NKG) and Aomori Airport (AOJ).

Airport information

Origin: Nanjing Lukou International Airport
City: Nanjing
Country: China
IATA Code: NKG
ICAO Code: ZSNJ
Coordinates: 31°44′31″N, 118°51′43″E
Destination: Aomori Airport
City: Aomori
Country: Japan
IATA Code: AOJ
ICAO Code: RJSA
Coordinates: 40°44′4″N, 140°41′27″E