How far is Nyaung U from Lijiang?
The distance between Lijiang (Lijiang Sanyi International Airport) and Nyaung U (Nyaung U Airport) is 506 miles / 815 kilometers / 440 nautical miles.
The driving distance from Lijiang (LJG) to Nyaung U (NYU) is 713 miles / 1147 kilometers, and travel time by car is about 14 hours 15 minutes.
Lijiang Sanyi International Airport – Nyaung U Airport
Distance from Lijiang to Nyaung U
There are several ways to calculate the distance from Lijiang to Nyaung U. Here are two standard methods:
Vincenty's formula (applied above):
- 506.183 miles
- 814.622 kilometers
- 439.861 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
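The iterative inverse form of Vincenty's formula can be sketched as below. This is a minimal implementation on the WGS-84 ellipsoid, assuming the airport coordinates from the table below converted to decimal degrees; the site's exact parameters are not stated.

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

# WGS-84 ellipsoid parameters (an assumption; the page does not name its datum).
A = 6378137.0            # semi-major axis (m)
F = 1 / 298.257223563    # flattening
B = (1 - F) * A          # semi-minor axis (m)

def vincenty_km(p1, p2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula: geodesic distance on the ellipsoid, in km."""
    lat1, lon1 = map(radians, p1)
    lat2, lon2 = map(radians, p2)
    U1, U2 = atan((1 - F) * tan(lat1)), atan((1 - F) * tan(lat2))
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    L = lon2 - lon1
    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma *
            (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    big_a = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    big_b = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        big_b / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sm ** 2)))
    return B * big_a * (sigma - delta_sigma) / 1000.0

LJG = (26.679167, 100.245556)  # Lijiang Sanyi International Airport
NYU = (21.178611, 94.930000)   # Nyaung U Airport
print(round(vincenty_km(LJG, NYU), 1))  # ≈ 814.6 km, close to the figure above
```

The iteration converges quickly for a route like this one; degenerate cases (nearly antipodal points) would need extra handling that is omitted here.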
Haversine formula:
- 506.930 miles
- 815.825 kilometers
- 440.510 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
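The haversine calculation is compact enough to sketch directly. This assumes a mean Earth radius of 6,371 km and the airport coordinates from the table below converted to decimal degrees:

```python
from math import radians, sin, cos, asin, sqrt

# Decimal-degree coordinates, converted from the DMS values in the airport table.
LJG = (26.679167, 100.245556)  # Lijiang Sanyi International Airport
NYU = (21.178611, 94.930000)   # Nyaung U Airport

def haversine_km(p1, p2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean Earth radius."""
    lat1, lon1 = map(radians, p1)
    lat2, lon2 = map(radians, p2)
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

print(round(haversine_km(LJG, NYU), 1))  # ≈ 815.8 km, matching the figure above
```

The small gap between this result and the Vincenty figure reflects the spherical versus ellipsoidal Earth models.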
How long does it take to fly from Lijiang to Nyaung U?
The estimated flight time from Lijiang Sanyi International Airport to Nyaung U Airport is 1 hour and 27 minutes.
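The site's exact model is not stated; a common rule of thumb (a hypothetical 500 mph cruise speed plus 30 minutes for takeoff and landing) gives a figure in the same range:

```python
# Hypothetical estimation model: the page does not say how its figure is derived.
CRUISE_MPH = 500     # assumed average cruise speed
OVERHEAD_MIN = 30    # assumed allowance for takeoff and landing

def estimate_flight_minutes(distance_miles):
    """Rough flight-time estimate from straight-line distance."""
    return round(distance_miles / CRUISE_MPH * 60) + OVERHEAD_MIN

minutes = estimate_flight_minutes(506)
print(f"{minutes // 60} h {minutes % 60} min")  # 1 h 31 min with these assumptions
```

With these assumed parameters the estimate lands within a few minutes of the 1 hour 27 minutes quoted above.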
What is the time difference between Lijiang and Nyaung U?
The time difference between Lijiang (China Standard Time, UTC+8) and Nyaung U (Myanmar Time, UTC+6:30) is 1 hour 30 minutes; Nyaung U is behind Lijiang.
Flight carbon footprint between Lijiang Sanyi International Airport (LJG) and Nyaung U Airport (NYU)
On average, flying from Lijiang to Nyaung U generates about 100 kg of CO2 per passenger; 100 kilograms equals about 220 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
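A per-passenger estimate like this is typically distance times an emission factor. The factor below (0.12 kg CO2 per passenger-km) is a hypothetical short-haul value chosen for illustration; the factor the page actually uses is not stated.

```python
# Hypothetical emission factor (kg CO2 per passenger-km); not from the page.
KG_CO2_PER_PKM = 0.12

def co2_kg(distance_km, factor=KG_CO2_PER_PKM):
    """Rough per-passenger CO2 estimate for a flight of the given length."""
    return distance_km * factor

kg = co2_kg(815)
print(f"~{kg:.0f} kg CO2 (~{kg * 2.20462:.0f} lbs)")
```

With this assumed factor the route comes out near 100 kg per passenger, consistent with the figure above.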
Airport information
Origin | Lijiang Sanyi International Airport
---|---
City: | Lijiang
Country: | China
IATA Code: | LJG
ICAO Code: | ZPLJ
Coordinates: | 26°40′45″N, 100°14′44″E
Destination | Nyaung U Airport
---|---
City: | Nyaung U
Country: | Burma
IATA Code: | NYU
ICAO Code: | VYBG
Coordinates: | 21°10′43″N, 94°55′48″E
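The coordinates above are in degrees-minutes-seconds form; the distance formulas earlier need decimal degrees. A small conversion sketch (the parsing pattern assumes the exact `°`/`′`/`″` notation used in the tables):

```python
import re

def dms_to_decimal(dms):
    """Convert a coordinate like 26°40′45″N to signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    # South and West hemispheres are negative by convention.
    return -value if hemi in "SW" else value

print(round(dms_to_decimal("26°40′45″N"), 6))   # 26.679167
print(round(dms_to_decimal("100°14′44″E"), 6))  # 100.245556
```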