
How far is Liupanshui from Omitama?

The distance between Omitama (Ibaraki Airport) and Liupanshui (Liupanshui Yuezhao Airport) is 2182 miles / 3511 kilometers / 1896 nautical miles.

The driving distance from Omitama (IBR) to Liupanshui (LPF) is 3282 miles / 5282 kilometers, and travel time by car is about 64 hours 46 minutes.

Ibaraki Airport – Liupanshui Yuezhao Airport
  • 2182 miles
  • 3511 kilometers
  • 1896 nautical miles


Distance from Omitama to Liupanshui

There are several ways to calculate the distance from Omitama to Liupanshui. Here are two standard methods:

Vincenty's formula (applied above)
  • 2181.698 miles
  • 3511.102 kilometers
  • 1895.843 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
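As a sketch of how such an ellipsoidal calculation works, the following is a standard implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The airport coordinates below are converted from the DMS values listed further down the page; the iteration count and convergence threshold are conventional choices, not values from this site's calculator.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in meters between two points (Vincenty inverse)."""
    # WGS-84 ellipsoid: semi-major axis, flattening, semi-minor axis
    a = 6378137.0
    f = 1 / 298.257223563
    b = (1 - f) * a

    # Reduced latitudes and longitude difference
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude on the auxiliary sphere converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha != 0 else 0.0)  # equatorial lines
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    # Final ellipsoidal correction terms
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
            B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# IBR (36°10′51″N, 140°24′53″E) to LPF (26°36′33″N, 104°58′44″E)
meters = vincenty_distance(36.180833, 140.414722, 26.609167, 104.978889)
```

Run with these coordinates, the result lands near the 3511 km quoted above.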

Haversine formula
  • 2178.259 miles
  • 3505.567 kilometers
  • 1892.855 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
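The haversine calculation is much shorter. A minimal sketch, assuming a mean Earth radius of 3958.8 miles (6371 km) and the airport coordinates from the table below:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, r_miles=3958.8):
    """Great-circle distance in miles, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r_miles * math.asin(math.sqrt(h))

# IBR (36°10′51″N, 140°24′53″E) to LPF (26°36′33″N, 104°58′44″E)
miles = haversine_miles(36.180833, 140.414722, 26.609167, 104.978889)
```

Because it treats the Earth as a sphere, this comes out a few miles shorter than the Vincenty figure, consistent with the ~2178 vs. ~2182 mile difference shown above.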

How long does it take to fly from Omitama to Liupanshui?

The estimated flight time from Ibaraki Airport to Liupanshui Yuezhao Airport is 4 hours and 37 minutes.
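The site does not publish its flight-time model, but such estimates are typically a cruise-speed calculation plus a fixed allowance for taxi, climb, and descent. The sketch below uses a hypothetical 500 mph cruise speed and 30-minute overhead; these parameters are assumptions and will not exactly reproduce the 4 h 37 min figure above.

```python
def estimated_flight_time(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
    """Rough flight time as (hours, minutes).

    cruise_mph and overhead_hours are illustrative assumptions, not the
    calculator's actual model parameters.
    """
    hours = distance_miles / cruise_mph + overhead_hours
    h = int(hours)
    m = round((hours - h) * 60)
    if m == 60:  # guard against rounding up to a full hour
        h, m = h + 1, 0
    return h, m

# 2182-mile route
h, m = estimated_flight_time(2182)
```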

Flight carbon footprint between Ibaraki Airport (IBR) and Liupanshui Yuezhao Airport (LPF)

On average, flying from Omitama to Liupanshui generates about 238 kg of CO2 per passenger, which is equivalent to 525 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
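The unit conversion behind that figure is straightforward:

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 238
co2_lb = co2_kg * KG_TO_LB  # ≈ 524.7, reported as 525 lbs
```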

Map of flight path and driving directions from Omitama to Liupanshui

See the map of the shortest flight path between Ibaraki Airport (IBR) and Liupanshui Yuezhao Airport (LPF).

Airport information

Origin: Ibaraki Airport
City: Omitama
Country: Japan
IATA Code: IBR
ICAO Code: RJAH
Coordinates: 36°10′51″N, 140°24′53″E
Destination: Liupanshui Yuezhao Airport
City: Liupanshui
Country: China
IATA Code: LPF
ICAO Code: ZUPS
Coordinates: 26°36′33″N, 104°58′44″E
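The coordinates above are in degrees-minutes-seconds; the decimal degrees used in distance formulas can be recovered with a small helper:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert DMS coordinates to signed decimal degrees.

    hemisphere is one of "N", "S", "E", "W"; south and west are negative.
    """
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Ibaraki Airport: 36°10′51″N, 140°24′53″E
ibr_lat = dms_to_decimal(36, 10, 51, "N")   # ≈ 36.1808
ibr_lon = dms_to_decimal(140, 24, 53, "E")  # ≈ 140.4147
```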