How far is Jiayuguan from Lashio?

The distance between Lashio (Lashio Airport) and Jiayuguan (Jiayuguan Airport) is 1163 miles / 1872 kilometers / 1011 nautical miles.

The driving distance from Lashio (LSH) to Jiayuguan (JGN) is 1896 miles / 3051 kilometers, and travel time by car is about 34 hours 41 minutes.

Lashio Airport – Jiayuguan Airport

Distance: 1163 miles / 1872 kilometers / 1011 nautical miles
Flight time: 2 h 42 min
Time difference: 1 h 30 min
CO2 emission: 160 kg

Distance from Lashio to Jiayuguan

There are several ways to calculate the distance from Lashio to Jiayuguan. Here are two standard methods:

Vincenty's formula (applied above)
  • 1163.440 miles
  • 1872.375 kilometers
  • 1011.002 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
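For reference, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal-degree conversions of the airport coordinates are approximate, and the constants and iteration limit are standard textbook choices rather than this site's exact implementation, so the result should land close to, but not necessarily exactly on, the figure quoted above.

import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse formula on the WGS-84 ellipsoid; result in statute miles."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = math.sin(U1), math.cos(U1), math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam, cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1609.344   # metres -> statute miles

# LSH 22°58′40″N 97°45′7″E ≈ (22.9778, 97.7519); JGN 39°51′24″N 98°20′29″E ≈ (39.8567, 98.3414)
print(round(vincenty_miles(22.9778, 97.7519, 39.8567, 98.3414), 1))   # ≈ 1163 miles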

Haversine formula
  • 1166.737 miles
  • 1877.681 kilometers
  • 1013.867 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
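A corresponding haversine sketch, again in Python and with the same approximate decimal coordinates, looks like the following; the 6371 km mean Earth radius is a common convention and may differ slightly from the radius this site uses.

import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere; result in statute miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h)) / 1.609344   # km -> statute miles

print(round(haversine_miles(22.9778, 97.7519, 39.8567, 98.3414), 1))   # ≈ 1167 miles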

How long does it take to fly from Lashio to Jiayuguan?

The estimated flight time from Lashio Airport to Jiayuguan Airport is 2 hours and 42 minutes.
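The site does not publish its exact timing assumptions. A common rule of thumb adds a fixed allowance for taxi, climb, and descent to the time spent at cruise speed; the sketch below uses an assumed 500 mph cruise speed and 30-minute overhead, so it lands near, rather than exactly on, the 2 hours 42 minutes quoted above.

def flight_time_estimate(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rough block-time estimate: fixed overhead plus time at assumed cruise speed."""
    total_min = overhead_min + distance_miles / cruise_mph * 60.0
    return int(total_min // 60), int(round(total_min % 60))

h, m = flight_time_estimate(1163.44)
print(f"{h} h {m} min")   # ≈ 2 h 50 min with these assumed constants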

Flight carbon footprint between Lashio Airport (LSH) and Jiayuguan Airport (JGN)

On average, flying from Lashio to Jiayuguan generates about 160 kg of CO2 per passenger; 160 kilograms equals 352 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
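Working backwards from the quoted numbers, 160 kg over the 1872 km Vincenty distance corresponds to roughly 0.085 kg of CO2 per passenger-kilometre. The short sketch below only reproduces that arithmetic and the kilogram-to-pound conversion; the implied per-kilometre factor is an observation, not the site's published methodology.

KG_PER_LB = 0.45359237          # exact definition of the pound

distance_km = 1872.375          # Vincenty distance from above
co2_kg = 160.0                  # quoted per-passenger estimate

factor = co2_kg / distance_km   # implied emission factor, kg CO2 per passenger-km
co2_lb = co2_kg / KG_PER_LB     # kilograms -> pounds

print(f"{factor:.3f} kg/km, {co2_lb:.1f} lb")   # ≈ 0.085 kg/km, ≈ 352.7 lb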

Map of flight path and driving directions from Lashio to Jiayuguan

See the map of the shortest flight path between Lashio Airport (LSH) and Jiayuguan Airport (JGN).

Airport information

Origin: Lashio Airport
City: Lashio
Country: Burma
IATA Code: LSH
ICAO Code: VYLS
Coordinates: 22°58′40″N, 97°45′7″E
Destination: Jiayuguan Airport
City: Jiayuguan
Country: China
IATA Code: JGN
ICAO Code: ZLJQ
Coordinates: 39°51′24″N, 98°20′29″E