
How far is Surin from Xinyuan County?

The distance between Xinyuan County (Xinyuan Nalati Airport) and Surin (Surin Airport) is 2299 miles / 3700 kilometers / 1998 nautical miles.

The driving distance from Xinyuan County (NLT) to Surin (PXR) is 3440 miles / 5536 kilometers, and travel time by car is about 65 hours 5 minutes.

Distance from Xinyuan County to Surin

There are several ways to calculate the distance from Xinyuan County to Surin. Here are two standard methods:

Vincenty's formula (applied above)
  • 2299.269 miles
  • 3700.315 kilometers
  • 1998.010 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
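
For readers who want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name, convergence tolerance, and iteration cap are our own illustrative choices, not something published by this site; production code would normally rely on a tested library such as geopy.

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Ellipsoidal distance in metres between two points (Vincenty inverse)."""
        a = 6378137.0              # WGS-84 semi-major axis (m)
        f = 1 / 298.257223563     # WGS-84 flattening
        b = (1 - f) * a            # semi-minor axis (m)

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            # cos(2 * sigma_m); zero when both points lie on the equator
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                       if cos2_alpha else 0.0)
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a * a - b * b) / (b * b)
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - delta_sigma)  # metres

    # NLT and PXR coordinates, converted from the DMS values listed below.
    nlt = (43 + 25 / 60 + 54 / 3600, 83 + 22 / 60 + 42 / 3600)
    pxr = (14 + 52 / 60 + 5 / 3600, 103 + 29 / 60 + 52 / 3600)
    print(vincenty_distance(nlt[0], nlt[1], pxr[0], pxr[1]) / 1000)  # ~3700 km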

Haversine formula
  • 2303.332 miles
  • 3706.853 kilometers
  • 2001.541 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
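
A matching sketch of the haversine formula, assuming the conventional mean Earth radius of 6371.0088 km (the site does not state which radius it uses, so the last decimal places will differ slightly):

    import math

    def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0088):
        """Great-circle distance in kilometres on a spherical Earth."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    print(haversine_distance(43.431667, 83.378333,
                             14.868056, 103.497778))  # ~3707 km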

How long does it take to fly from Xinyuan County to Surin?

The estimated flight time from Xinyuan Nalati Airport to Surin Airport is 4 hours and 51 minutes.
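
The calculator does not document how it derives this estimate. A common rule of thumb, sketched below with purely hypothetical parameters, adds a fixed taxi/climb/descent allowance to the great-circle distance divided by an assumed average speed; note that these particular assumptions give a result somewhat longer than the 4 hours 51 minutes quoted above.

    def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
        """Rough heuristic: fixed overhead plus distance / average speed.
        Both parameters are assumptions, not this calculator's documented values."""
        hours = overhead_hours + distance_miles / cruise_mph
        h = int(hours)
        m = round((hours - h) * 60)
        return f"{h} hours {m} minutes"

    print(estimate_flight_time(2299.269))  # "5 hours 6 minutes" with these assumptions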

Flight carbon footprint between Xinyuan Nalati Airport (NLT) and Surin Airport (PXR)

On average, flying from Xinyuan County to Surin generates about 252 kg (roughly 555 pounds) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
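
As a quick sanity check, the unit conversion and the per-mile rate implied by these numbers (the emissions factor below is derived from the figures above, not published by the site) work out as follows:

    co2_kg = 252
    print(co2_kg * 2.20462)   # ≈ 555 lb, matching the quoted figure
    print(co2_kg / 2299.269)  # ≈ 0.11 kg CO2 per passenger-mile (implied)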

Map of flight path and driving directions from Xinyuan County to Surin

See the map of the shortest flight path between Xinyuan Nalati Airport (NLT) and Surin Airport (PXR).

Airport information

Origin: Xinyuan Nalati Airport
City: Xinyuan County
Country: China
IATA Code: NLT
ICAO Code: ZWNL
Coordinates: 43°25′54″N, 83°22′42″E
Destination: Surin Airport
City: Surin
Country: Thailand
IATA Code: PXR
ICAO Code: VTUJ
Coordinates: 14°52′5″N, 103°29′52″E
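
The coordinates above are listed in degrees, minutes, and seconds. A small helper like the following (our own, hypothetical, not part of the site) converts them to the decimal degrees used by the distance formulas earlier on this page:

    def dms_to_decimal(deg, minutes, seconds, hemisphere="N"):
        """Convert degrees/minutes/seconds to signed decimal degrees."""
        value = deg + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    print(dms_to_decimal(43, 25, 54, "N"))   # 43.4317  (NLT latitude)
    print(dms_to_decimal(103, 29, 52, "E"))  # 103.4978 (PXR longitude)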