
How far is Surin from Liupanshui?

The distance between Liupanshui (Liupanshui Yuezhao Airport) and Surin (Surin Airport) is 813 miles / 1309 kilometers / 707 nautical miles.

The driving distance from Liupanshui (LPF) to Surin (PXR) is 1215 miles / 1956 kilometers, and travel time by car is about 24 hours 34 minutes.

Liupanshui Yuezhao Airport – Surin Airport

Distance: 813 miles / 1309 kilometers / 707 nautical miles


Distance from Liupanshui to Surin

There are several ways to calculate the distance from Liupanshui to Surin. Here are two standard methods:

Vincenty's formula (applied above)
  • 813.381 miles
  • 1309.010 kilometers
  • 706.809 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
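For reference, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid, using the decimal-degree equivalents of the airport coordinates listed below. Treat it as an illustration of the technique, not the exact implementation used for the figures above.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a, f = 6378137.0, 1 / 298.257223563            # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = math.sin(U1), math.cos(U1), math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                              # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# LPF and PXR coordinates from the airport information below, in decimal degrees.
print(round(vincenty_km(26.609167, 104.978889, 14.868056, 103.497778), 1))  # ~1309.0 km
```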

Haversine formula
  • 816.831 miles
  • 1314.562 kilometers
  • 709.806 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
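As a quick illustration, a minimal Python sketch of the haversine calculation, assuming the commonly used mean Earth radius of 6,371 km and the same airport coordinates in decimal degrees:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Airport coordinates from the airport information below (decimal degrees).
lpf = (26.609167, 104.978889)   # Liupanshui Yuezhao Airport (LPF)
pxr = (14.868056, 103.497778)   # Surin Airport (PXR)

km = haversine_km(*lpf, *pxr)
print(f"{km:.1f} km / {km * 0.621371:.1f} mi / {km * 0.539957:.1f} nm")
# ~1314.6 km / ~816.8 mi / ~709.8 nm, matching the haversine figures above
```

The spherical assumption is what accounts for the small difference (a few kilometers) between the haversine and Vincenty results.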

How long does it take to fly from Liupanshui to Surin?

The estimated flight time from Liupanshui Yuezhao Airport to Surin Airport is 2 hours and 2 minutes.
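A simple way to approximate this is distance divided by an assumed average block speed. With the Vincenty distance above and an assumed average speed of roughly 400 mph covering climb, cruise and descent (an assumption for illustration, not necessarily the model used for the figure above), the estimate lands at about two hours:

```python
distance_mi = 813.381      # Vincenty distance from above
avg_speed_mph = 400        # assumed average block speed (illustrative assumption)

minutes = distance_mi / avg_speed_mph * 60
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")   # -> 2 h 2 min
```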

Flight carbon footprint between Liupanshui Yuezhao Airport (LPF) and Surin Airport (PXR)

On average, flying from Liupanshui to Surin generates about 136 kg of CO2 per passenger (roughly 300 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
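For the unit conversion, 136 kg works out to about 300 lbs:

```python
co2_kg = 136
print(round(co2_kg * 2.20462))   # 1 kg ≈ 2.20462 lb -> 300 lbs
```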

Map of flight path and driving directions from Liupanshui to Surin

See the map of the shortest flight path between Liupanshui Yuezhao Airport (LPF) and Surin Airport (PXR).

Airport information

Origin: Liupanshui Yuezhao Airport
City: Liupanshui
Country: China
IATA Code: LPF
ICAO Code: ZUPS
Coordinates: 26°36′33″N, 104°58′44″E
Destination: Surin Airport
City: Surin
Country: Thailand
IATA Code: PXR
ICAO Code: VTUJ
Coordinates: 14°52′5″N, 103°29′52″E