
How far is Jingdezhen from Nagasaki?

The distance between Nagasaki (Nagasaki Airport) and Jingdezhen (Jingdezhen Luojia Airport) is 793 miles / 1277 kilometers / 690 nautical miles.

The driving distance from Nagasaki (NGS) to Jingdezhen (JDZ) is 2012 miles / 3238 kilometers, and travel time by car is about 40 hours 51 minutes.

Nagasaki Airport – Jingdezhen Luojia Airport

793 miles / 1277 kilometers / 690 nautical miles


Distance from Nagasaki to Jingdezhen

There are several ways to calculate the distance from Nagasaki to Jingdezhen. Here are two standard methods:

Vincenty's formula (applied above)
  • 793.499 miles
  • 1277.014 kilometers
  • 689.532 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
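
For readers who want to reproduce the figure above, here is a compact Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. This is an illustrative implementation, not the calculator's own code; the decimal coordinates are converted from the DMS values in the airport information section below.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Vincenty's inverse solution on the WGS-84 ellipsoid; returns statute miles."""
        a = 6378137.0              # WGS-84 semi-major axis (meters)
        f = 1 / 298.257223563      # WGS-84 flattening
        b = (1 - f) * a            # semi-minor axis (meters)

        u1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        u2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        big_l = math.radians(lon2 - lon1)
        sin_u1, cos_u1 = math.sin(u1), math.cos(u1)
        sin_u2, cos_u2 = math.sin(u2), math.cos(u2)

        lam = big_l
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cos_u2 * sin_lam,
                                   cos_u1 * sin_u2 - sin_u1 * cos_u2 * cos_lam)
            if sin_sigma == 0:
                return 0.0         # coincident points
            cos_sigma = sin_u1 * sin_u2 + cos_u1 * cos_u2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cos_u1 * cos_u2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sin_u1 * sin_u2 / cos2_alpha
            c = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = big_l + (1 - c) * f * sin_alpha * (
                sigma + c * sin_sigma * (cos_2sm + c * cos_sigma * (2 * cos_2sm ** 2 - 1)))
            if abs(lam - lam_prev) < tol:
                break              # converged

        u_sq = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        big_a = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
        big_b = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
        d_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4 * (
            cos_sigma * (2 * cos_2sm ** 2 - 1)
            - big_b / 6 * cos_2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
        return b * big_a * (sigma - d_sigma) / 1609.344   # meters -> statute miles

    # NGS and JDZ in decimal degrees (see airport information below)
    print(vincenty_miles(32.9167, 129.9139, 29.3383, 117.1758))  # ~793.5 miles

Note that Vincenty's iteration can fail to converge for nearly antipodal points, which is why production geodesic libraries use more robust algorithms.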

Haversine formula
  • 792.282 miles
  • 1275.054 kilometers
  • 688.474 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
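
A minimal Python sketch of the haversine formula, using the same decimal coordinates; the 3,958.8-mile Earth radius is the mean radius assumed by the spherical model:

    import math

    def haversine_miles(lat1, lon1, lat2, lon2):
        """Great-circle distance in statute miles on a spherical Earth."""
        r = 3958.8  # mean Earth radius in miles (spherical-model assumption)
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * r * math.asin(math.sqrt(a))

    print(haversine_miles(32.9167, 129.9139, 29.3383, 117.1758))  # ~792.3 miles

The two results differ by about a mile on this route, which is typical of the spherical-versus-ellipsoidal discrepancy at these distances.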

How long does it take to fly from Nagasaki to Jingdezhen?

The estimated flight time from Nagasaki Airport to Jingdezhen Luojia Airport is 2 hours and 0 minutes.
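
The calculator does not publish its flight-time model, but estimates like this are commonly derived from the great-circle distance, an assumed average cruise speed, and a fixed allowance for takeoff and landing. A rough sketch under those assumptions (the 500 mph cruise speed and 30-minute overhead are illustrative guesses, not published values):

    def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
        # Both defaults are assumptions: ~500 mph average cruise speed
        # and ~30 minutes for taxi, climb, and descent.
        return overhead_min + distance_miles / cruise_mph * 60

    minutes = flight_time_minutes(793)
    print(f"{int(minutes // 60)} h {int(minutes % 60)} min")  # 2 h 5 min, near the quoted 2 h 0 min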

Flight carbon footprint between Nagasaki Airport (NGS) and Jingdezhen Luojia Airport (JDZ)

On average, flying from Nagasaki to Jingdezhen generates about 134 kg of CO2 per passenger, which is roughly 296 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
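
The emission model is likewise unpublished; dividing the quoted figure by the distance implies roughly 0.169 kg of CO2 per passenger-mile on this route. A sketch using that implied factor (real factors vary with aircraft type, load factor, and routing, and the page's 296 lb presumably reflects an unrounded kilogram value):

    def co2_kg_per_passenger(distance_miles, kg_per_mile=0.169):
        # 0.169 kg CO2 per passenger-mile is simply the factor implied by
        # this route (134 kg / 793 miles); real factors vary by aircraft
        # type and load factor.
        return distance_miles * kg_per_mile

    kg = co2_kg_per_passenger(793)
    print(round(kg), "kg /", round(kg * 2.20462), "lb")  # ~134 kg / ~295 lb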

Map of flight path and driving directions from Nagasaki to Jingdezhen

See the map of the shortest flight path between Nagasaki Airport (NGS) and Jingdezhen Luojia Airport (JDZ).

Airport information

Origin: Nagasaki Airport
City: Nagasaki
Country: Japan
IATA Code: NGS
ICAO Code: RJFU
Coordinates: 32°55′0″N, 129°54′50″E
Destination: Jingdezhen Luojia Airport
City: Jingdezhen
Country: China
IATA Code: JDZ
ICAO Code: ZSJD
Coordinates: 29°20′18″N, 117°10′33″E
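
The coordinates above are given in degrees, minutes, and seconds; a small helper to convert them to the decimal degrees used by the distance formulas earlier on this page:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    ngs = (dms_to_decimal(32, 55, 0, "N"), dms_to_decimal(129, 54, 50, "E"))
    jdz = (dms_to_decimal(29, 20, 18, "N"), dms_to_decimal(117, 10, 33, "E"))
    print(ngs)  # approximately (32.9167, 129.9139)
    print(jdz)  # approximately (29.3383, 117.1758)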