
How far is Yichun from Novy Urengoy?

The distance between Novy Urengoy (Novy Urengoy Airport) and Yichun (Yichun Lindu Airport) is 2257 miles / 3633 kilometers / 1962 nautical miles.

The driving distance from Novy Urengoy (NUX) to Yichun (LDS) is 4189 miles / 6742 kilometers, and travel time by car is about 97 hours 57 minutes.

Novy Urengoy Airport – Yichun Lindu Airport
  • 2257 miles
  • 3633 kilometers
  • 1962 nautical miles


Distance from Novy Urengoy to Yichun

There are several ways to calculate the distance from Novy Urengoy to Yichun. Here are two standard methods:

Vincenty's formula (applied above)
  • 2257.392 miles
  • 3632.920 kilometers
  • 1961.620 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
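For readers who want to reproduce the ellipsoidal figure, here is a minimal sketch of Vincenty's inverse formula in Python, using the WGS-84 ellipsoid and the airport coordinates below converted to decimal degrees. The exact output may differ from the figure above by a small amount depending on how the coordinates are rounded.

import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m
                                     + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)   # geodesic length in meters

# NUX and LDS coordinates converted from the DMS values listed below
nux = (66.069167, 76.520278)   # 66°4′9″N, 76°31′13″E
lds = (47.751944, 129.018889)  # 47°45′7″N, 129°1′8″E
print(round(vincenty_distance(*nux, *lds) / 1000, 3), "km")  # ≈ 3633 km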

Haversine formula
  • 2251.098 miles
  • 3622.791 kilometers
  • 1956.151 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
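For comparison, here is a minimal sketch of the haversine calculation in Python. The mean Earth radius of 6371 km used here is an assumption; a slightly different radius or coordinate rounding explains small discrepancies in the last digits versus the figures above.

import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius; returns km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

print(round(haversine_distance(66.069167, 76.520278, 47.751944, 129.018889), 3), "km")  # ≈ 3623 km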

How long does it take to fly from Novy Urengoy to Yichun?

The estimated flight time from Novy Urengoy Airport to Yichun Lindu Airport is 4 hours and 46 minutes.

Flight carbon footprint between Novy Urengoy Airport (NUX) and Yichun Lindu Airport (LDS)

On average, flying from Novy Urengoy to Yichun generates about 247 kg (545 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Novy Urengoy to Yichun

See the map of the shortest flight path between Novy Urengoy Airport (NUX) and Yichun Lindu Airport (LDS).

Airport information

Origin: Novy Urengoy Airport
City: Novy Urengoy
Country: Russia
IATA Code: NUX
ICAO Code: USMU
Coordinates: 66°4′9″N, 76°31′13″E
Destination: Yichun Lindu Airport
City: Yichun
Country: China
IATA Code: LDS
ICAO Code: ZYLD
Coordinates: 47°45′7″N, 129°1′8″E