
How far is Liupanshui from Niigata?

The distance between Niigata (Niigata Airport) and Liupanshui (Liupanshui Yuezhao Airport) is 2131 miles / 3430 kilometers / 1852 nautical miles.

The driving distance from Niigata (KIJ) to Liupanshui (LPF) is 2946 miles / 4741 kilometers, and travel time by car is about 67 hours 53 minutes.

Niigata Airport – Liupanshui Yuezhao Airport

2131 miles / 3430 kilometers / 1852 nautical miles


Distance from Niigata to Liupanshui

There are several ways to calculate the distance from Niigata to Liupanshui. Here are two standard methods:

Vincenty's formula (applied above)
  • 2131.161 miles
  • 3429.771 kilometers
  • 1851.928 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
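The ellipsoidal calculation above can be sketched with a standard implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. This is a generic textbook implementation, not the site's own code; the coordinates used in the usage note are the decimal equivalents of the DMS values listed in the airport information section.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial special case
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)  # metres
```

With KIJ at 37.955833°N, 139.120833°E and LPF at 26.609167°N, 104.978889°E, this returns roughly 3,430 km, matching the Vincenty figure quoted above.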

Haversine formula
  • 2128.127 miles
  • 3424.889 kilometers
  • 1849.292 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
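The spherical calculation can be sketched in a few lines. This assumes the commonly used mean Earth radius of 6,371 km; the exact radius constant behind the figures above is not stated.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # haversine of the central angle between the two points
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h))
```

For the KIJ–LPF coordinates this gives roughly 3,425 km, slightly shorter than the ellipsoidal Vincenty result, as expected for this route.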

How long does it take to fly from Niigata to Liupanshui?

The estimated flight time from Niigata Airport to Liupanshui Yuezhao Airport is 4 hours and 32 minutes.
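The site does not disclose its exact flight-time model. A common rule of thumb is an average cruise speed of about 500 mph plus roughly 30 minutes of overhead for take-off, climb, and landing; the sketch below uses those assumed constants and yields about 4 hours 46 minutes for this route, close to but not identical to the figure quoted above.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed overhead plus time at cruise speed.

    cruise_mph and overhead_min are assumed rule-of-thumb constants,
    not the site's actual model parameters.
    """
    minutes = overhead_min + distance_miles / cruise_mph * 60
    hours, mins = divmod(round(minutes), 60)
    return hours, mins
```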

Flight carbon footprint between Niigata Airport (KIJ) and Liupanshui Yuezhao Airport (LPF)

On average, flying from Niigata to Liupanshui generates about 232 kg (513 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
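A per-passenger estimate like this can be sketched from a per-mile emission factor. The factor below (about 0.109 kg CO2 per passenger-mile) is an assumption back-calculated from the figures above (232 kg over 2131 miles), not a published constant from the site.

```python
def co2_estimate(distance_miles, kg_per_mile=0.109):
    """Rough per-passenger CO2 estimate for a flight of the given length.

    kg_per_mile is an assumed emission factor; real values vary with
    aircraft type, load factor, and route.
    """
    kg = distance_miles * kg_per_mile
    lbs = kg * 2.20462  # kilograms to pounds
    return kg, lbs
```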


Airport information

Origin Niigata Airport
City: Niigata
Country: Japan
IATA Code: KIJ
ICAO Code: RJSN
Coordinates: 37°57′21″N, 139°7′15″E
Destination Liupanshui Yuezhao Airport
City: Liupanshui
Country: China
IATA Code: LPF
ICAO Code: ZUPS
Coordinates: 26°36′33″N, 104°58′44″E
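The coordinates above are given in degrees, minutes, and seconds. Converting them to the decimal degrees used by the distance formulas is straightforward; the helper name below is hypothetical.

```python
import re

def dms_to_decimal(dms):
    """Convert a DMS string like '37°57′21″N' to signed decimal degrees."""
    m = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    if m is None:
        raise ValueError(f"unrecognised DMS string: {dms!r}")
    deg, minutes, seconds, hemi = m.groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value  # south/west are negative
```

For example, KIJ's latitude 37°57′21″N converts to about 37.9558° and LPF's longitude 104°58′44″E to about 104.9789°.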