
How far is Jingdezhen from Phoenix, AZ?

The distance between Phoenix (Phoenix Sky Harbor International Airport) and Jingdezhen (Jingdezhen Luojia Airport) is 7051 miles / 11347 kilometers / 6127 nautical miles.

Distance from Phoenix to Jingdezhen

There are several ways to calculate the distance from Phoenix to Jingdezhen. Here are two standard methods:

Vincenty's formula (applied above)
  • 7050.850 miles
  • 11347.243 kilometers
  • 6127.021 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
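The calculator's own implementation isn't published, but as a minimal sketch, the standard Vincenty inverse method on the WGS-84 ellipsoid looks like this in Python (the decimal coordinates are converted from the DMS values listed under airport information below):

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2):
        # WGS-84 ellipsoid constants
        a = 6378137.0              # semi-major axis in metres
        f = 1 / 298.257223563      # flattening
        b = (1 - f) * a            # semi-minor axis

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):  # iterate until the longitude difference converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                            if cos2_alpha else 0.0)  # cos2_alpha == 0 on the equator
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
        return b * A * (sigma - delta_sigma)  # geodesic length in metres

    phx = (33.4342, -112.0119)   # PHX, from the DMS coordinates below
    jdz = (29.3383, 117.1758)    # JDZ
    print(f"{vincenty_distance(*phx, *jdz) / 1000:.0f} km")  # ≈ 11347 km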

Haversine formula
  • 7037.785 miles
  • 11326.217 kilometers
  • 6115.668 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
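As a sketch, the haversine computation fits in a few lines of Python; a mean earth radius of 6371 km is assumed here, which reproduces the figure above:

    import math

    def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
        # great-circle distance on a sphere of the given radius
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    print(f"{haversine_distance(33.4342, -112.0119, 29.3383, 117.1758):.0f} km")
    # ≈ 11326 km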

How long does it take to fly from Phoenix to Jingdezhen?

The estimated flight time from Phoenix Sky Harbor International Airport to Jingdezhen Luojia Airport is 13 hours and 50 minutes.
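The site's timing model isn't published; as a rough sketch, an assumed average block speed of about 530 mph plus a fixed half-hour allowance for taxi, takeoff, and landing lands within a couple of minutes of the figure above:

    distance_miles = 7050.850    # Vincenty distance from above
    avg_speed_mph = 530          # assumed average speed, not from the source
    overhead_hours = 0.5         # assumed taxi/takeoff/landing allowance

    total_hours = overhead_hours + distance_miles / avg_speed_mph
    h, m = int(total_hours), round(total_hours % 1 * 60)
    print(f"{h} h {m} min")      # 13 h 48 min, close to the quoted 13 h 50 min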

Flight carbon footprint between Phoenix Sky Harbor International Airport (PHX) and Jingdezhen Luojia Airport (JDZ)

On average, flying from Phoenix to Jingdezhen generates about 862 kg of CO2 per passenger, which is roughly 1,900 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
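The pound figure is a direct unit conversion; dividing the total by the route distance gives an illustrative per-mile rate (both lines below assume the numbers quoted above):

    co2_kg = 862                               # estimated CO2 per passenger, from above
    print(f"{co2_kg * 2.20462:.0f} lb")        # ≈ 1900 lb (1 kg = 2.20462 lb)
    print(f"{co2_kg / 7050.850:.3f} kg/mile")  # ≈ 0.122 kg of CO2 per mile flown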

Map of flight path from Phoenix to Jingdezhen

See the map of the shortest flight path between Phoenix Sky Harbor International Airport (PHX) and Jingdezhen Luojia Airport (JDZ).

Airport information

Origin: Phoenix Sky Harbor International Airport
City: Phoenix, AZ
Country: United States
IATA Code: PHX
ICAO Code: KPHX
Coordinates: 33°26′3″N, 112°0′43″W
Destination: Jingdezhen Luojia Airport
City: Jingdezhen
Country: China
IATA Code: JDZ
ICAO Code: ZSJD
Coordinates: 29°20′18″N, 117°10′33″E
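The coordinates above are given in degrees, minutes, and seconds; a minimal helper converts them to the signed decimal degrees used in the sketches earlier:

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        # south and west hemispheres take a negative sign
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (deg + minutes / 60 + seconds / 3600)

    print(f"{dms_to_decimal(33, 26, 3, 'N'):.4f}, {dms_to_decimal(112, 0, 43, 'W'):.4f}")
    # 33.4342, -112.0119 (PHX)
    print(f"{dms_to_decimal(29, 20, 18, 'N'):.4f}, {dms_to_decimal(117, 10, 33, 'E'):.4f}")
    # 29.3383, 117.1758 (JDZ)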