
How far is Nanning from Nagoya?

The distance between Nagoya (Nagoya Airfield) and Nanning (Nanning Wuxu International Airport) is 1938 miles / 3118 kilometers / 1684 nautical miles.

The driving distance from Nagoya (NKM) to Nanning (NNG) is 3099 miles / 4988 kilometers, and travel time by car is about 60 hours 41 minutes.

Nagoya Airfield – Nanning Wuxu International Airport

1938 miles / 3118 kilometers / 1684 nautical miles


Distance from Nagoya to Nanning

There are several ways to calculate the distance from Nagoya to Nanning. Here are two standard methods:

Vincenty's formula (applied above)
  • 1937.549 miles
  • 3118.183 kilometers
  • 1683.684 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
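As a rough cross-check of the ellipsoidal figure, the sketch below uses pyproj's Geod class on the WGS-84 ellipsoid. Note this is not necessarily the calculation used above: pyproj applies Karney's geodesic algorithm rather than Vincenty's iteration, though the two agree to well under a metre over a route of this length. The decimal coordinates are converted from the DMS values listed under "Airport information" below.

```python
# Ellipsoidal (WGS-84) distance with pyproj's Geod class.
# Note: pyproj uses Karney's geodesic algorithm rather than Vincenty's
# iteration; results agree to well under a metre at this range.
from pyproj import Geod

# Airport coordinates (decimal degrees), converted from the DMS values
# in the "Airport information" section.
NKM = (35.2550, 136.9239)   # Nagoya Airfield: 35°15′18″N, 136°55′26″E
NNG = (22.6081, 108.1719)   # Nanning Wuxu Intl: 22°36′29″N, 108°10′19″E

geod = Geod(ellps="WGS84")
# Geod.inv takes longitudes first, then latitudes.
_, _, meters = geod.inv(NKM[1], NKM[0], NNG[1], NNG[0])

print(f"{meters / 1000:.1f} km")        # ~3118 km
print(f"{meters / 1609.344:.1f} mi")    # ~1938 mi
print(f"{meters / 1852:.1f} NM")        # ~1684 NM
```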

Haversine formula
  • 1935.879 miles
  • 3115.496 kilometers
  • 1682.233 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
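A minimal sketch of the haversine computation, assuming a spherical Earth with mean radius 6,371 km and the same decimal coordinates as in the ellipsoidal example above:

```python
# Great-circle (haversine) distance, assuming a spherical Earth with
# mean radius 6,371 km.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Return the great-circle distance between two points in kilometres."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(35.2550, 136.9239, 22.6081, 108.1719)
print(f"{km:.1f} km")             # ~3115 km
print(f"{km / 1.609344:.1f} mi")  # ~1936 mi
print(f"{km / 1.852:.1f} NM")     # ~1682 NM
```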

How long does it take to fly from Nagoya to Nanning?

The estimated flight time from Nagoya Airfield to Nanning Wuxu International Airport is 4 hours and 10 minutes.

Flight carbon footprint between Nagoya Airfield (NKM) and Nanning Wuxu International Airport (NNG)

On average, flying from Nagoya to Nanning generates about 212 kg (467 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Nagoya to Nanning

See the map of the shortest flight path between Nagoya Airfield (NKM) and Nanning Wuxu International Airport (NNG).

Airport information

Origin: Nagoya Airfield
City: Nagoya
Country: Japan
IATA Code: NKM
ICAO Code: RJNA
Coordinates: 35°15′18″N, 136°55′26″E
Destination: Nanning Wuxu International Airport
City: Nanning
Country: China
IATA Code: NNG
ICAO Code: ZGNN
Coordinates: 22°36′29″N, 108°10′19″E