
How far is Wakkanai from Liupanshui?

The distance between Liupanshui (Liupanshui Yuezhao Airport) and Wakkanai (Wakkanai Airport) is 2405 miles / 3870 kilometers / 2090 nautical miles.

The driving distance from Liupanshui (LPF) to Wakkanai (WKJ) is 4000 miles / 6438 kilometers, and travel time by car is about 79 hours 34 minutes.

Liupanshui Yuezhao Airport – Wakkanai Airport

  • 2405 miles
  • 3870 kilometers
  • 2090 nautical miles


Distance from Liupanshui to Wakkanai

There are several ways to calculate the distance from Liupanshui to Wakkanai. Here are two standard methods:

Vincenty's formula (applied above)
  • 2404.777 miles
  • 3870.114 kilometers
  • 2089.694 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
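As a rough illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates are converted from the DMS values listed under Airport information (see the helper at the end of this page); small rounding differences from the figures above are expected.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                       if cos_sq_alpha else 0.0)
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos2sigma_m + C * cos_sigma * (-1 + 2 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (
        cos2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos2sigma_m ** 2)
            - B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344      # meters -> statute miles

# LPF and WKJ in decimal degrees (from the airport data below)
print(round(vincenty_miles(26.6092, 104.9789, 45.4042, 141.8008), 1))  # ~2405 miles
```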

Haversine formula
  • 2402.389 miles
  • 3866.270 kilometers
  • 2087.619 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
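For comparison, a minimal haversine sketch in Python, assuming a mean Earth radius of 6371 km and the same LPF/WKJ decimal coordinates as above:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere (haversine formula), in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(26.6092, 104.9789, 45.4042, 141.8008)
print(round(km, 1), "km /", round(km / 1.852, 1), "NM")   # ~3866 km / ~2088 NM
```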

How long does it take to fly from Liupanshui to Wakkanai?

The estimated flight time from Liupanshui Yuezhao Airport to Wakkanai Airport is 5 hours and 3 minutes.
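A very rough way to reproduce this kind of estimate is to divide the great-circle distance by an assumed average block speed. The speed used below (about 476 mph) is a back-calculated assumption chosen so that the quoted 5 hours 3 minutes falls out; the site's own model may differ.

```python
# Hypothetical flight-time estimate: distance / assumed average block speed.
distance_miles = 2404.777          # Vincenty distance from above
avg_block_speed_mph = 476          # assumed value, back-calculated from 5 h 3 min

hours = distance_miles / avg_block_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"Estimated flight time: {h} h {m} min")   # -> 5 h 3 min
```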

Flight carbon footprint between Liupanshui Yuezhao Airport (LPF) and Wakkanai Airport (WKJ)

On average, flying from Liupanshui to Wakkanai generates about 264 kg of CO2 per passenger, which is equivalent to 582 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
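As a sanity check on the unit conversion, and to show how a simple per-distance estimate works, here is a small sketch. The emission factor of roughly 0.068 kg CO2 per passenger-kilometer is a back-calculated assumption (264 kg over 3870 km); real calculators use more detailed fuel-burn models.

```python
distance_km = 3870.114           # Vincenty distance in kilometers
kg_per_pax_km = 264 / 3870.114   # assumed factor, back-calculated (~0.068 kg CO2/km)

co2_kg = distance_km * kg_per_pax_km
co2_lbs = co2_kg * 2.20462       # kilograms -> pounds
print(round(co2_kg), "kg CO2 ~", round(co2_lbs), "lbs")   # -> 264 kg ~ 582 lbs
```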

Map of flight path and driving directions from Liupanshui to Wakkanai

See the map of the shortest flight path between Liupanshui Yuezhao Airport (LPF) and Wakkanai Airport (WKJ).

Airport information

Origin Liupanshui Yuezhao Airport
City: Liupanshui
Country: China
IATA Code: LPF
ICAO Code: ZUPS
Coordinates: 26°36′33″N, 104°58′44″E
Destination Wakkanai Airport
City: Wakkanai
Country: Japan
IATA Code: WKJ
ICAO Code: RJCW
Coordinates: 45°24′15″N, 141°48′3″E
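
The decimal coordinates used in the distance sketches above come from these DMS values. A small helper, assuming north/east positive and south/west negative:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Liupanshui Yuezhao Airport (LPF): 26°36′33″N, 104°58′44″E
print(dms_to_decimal(26, 36, 33, "N"), dms_to_decimal(104, 58, 44, "E"))  # 26.6092, 104.9789
# Wakkanai Airport (WKJ): 45°24′15″N, 141°48′3″E
print(dms_to_decimal(45, 24, 15, "N"), dms_to_decimal(141, 48, 3, "E"))   # 45.4042, 141.8008
```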