
How far is Jiujiang from Las Vegas, NV?

The distance between Las Vegas (Las Vegas Harry Reid International Airport) and Jiujiang (Jiujiang Lushan Airport) is 6821 miles / 10977 kilometers / 5927 nautical miles.


Distance from Las Vegas to Jiujiang

There are several ways to calculate the distance from Las Vegas to Jiujiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 6820.836 miles
  • 10977.072 kilometers
  • 5927.145 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
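Vincenty's inverse method is iterative, repeatedly refining the longitude difference on an auxiliary sphere until it converges. A minimal Python sketch over the WGS-84 ellipsoid, using the airport coordinates listed below converted to decimal degrees, might look like this (the function name is illustrative, not from any particular library):

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in km on the WGS-84 ellipsoid (Vincenty inverse formula)."""
    a = 6378137.0             # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a           # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial geodesic
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# LAS (36°4′48″N, 115°9′7″W) to JIU (29°43′58″N, 115°58′58″E)
d = vincenty_km(36.08, -115.151944, 29.732778, 115.982778)
```

With these inputs the result should land very close to the ~10977 km quoted above.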

Haversine formula
  • 6807.618 miles
  • 10955.799 kilometers
  • 5915.658 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
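The haversine calculation is much simpler than Vincenty's, since it treats the earth as a sphere. A short sketch, assuming a mean earth radius of 6371 km and using the airport coordinates below (the function name is illustrative):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km, assuming a sphere of mean radius 6371 km."""
    R = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# LAS (36°4′48″N, 115°9′7″W) to JIU (29°43′58″N, 115°58′58″E)
d = haversine_km(36.08, -115.151944, 29.732778, 115.982778)
```

The result should agree with the ~10956 km figure above to within a few kilometers; the small gap versus the Vincenty value reflects the spherical-earth assumption.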

How long does it take to fly from Las Vegas to Jiujiang?

The estimated flight time from Las Vegas Harry Reid International Airport to Jiujiang Lushan Airport is 13 hours and 24 minutes.
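The page does not state how this estimate is derived, but as a rough sanity check, the quoted duration implies a plausible jet cruise speed:

```python
# Hypothetical check: implied average ground speed for the quoted estimate.
distance_miles = 6820.836      # Vincenty distance from above
flight_hours = 13 + 24 / 60    # 13 hours 24 minutes
avg_speed_mph = distance_miles / flight_hours  # ≈ 509 mph
```

An implied average of roughly 509 mph is consistent with typical long-haul cruise speeds, so the estimate is internally plausible.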

Flight carbon footprint between Las Vegas Harry Reid International Airport (LAS) and Jiujiang Lushan Airport (JIU)

On average, flying from Las Vegas to Jiujiang generates about 830 kg of CO2 per passenger, which is about 1,829 pounds (lb). These figures are estimates and include only the CO2 generated by burning jet fuel.
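The kilogram-to-pound conversion can be checked directly (the constant is the exact definition of the international pound):

```python
KG_PER_LB = 0.45359237   # exact: 1 lb = 0.45359237 kg
co2_kg = 830
co2_lb = co2_kg / KG_PER_LB  # ≈ 1830 lb; the page rounds down to 1,829
```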

Map of flight path from Las Vegas to Jiujiang

See the map of the shortest flight path between Las Vegas Harry Reid International Airport (LAS) and Jiujiang Lushan Airport (JIU).

Airport information

Origin: Las Vegas Harry Reid International Airport
City: Las Vegas, NV
Country: United States
IATA Code: LAS
ICAO Code: KLAS
Coordinates: 36°4′48″N, 115°9′7″W
Destination: Jiujiang Lushan Airport
City: Jiujiang
Country: China
IATA Code: JIU
ICAO Code: ZSJJ
Coordinates: 29°43′58″N, 115°58′58″E