
How far is Badanjilin from Longyearbyen?

The distance between Longyearbyen (Svalbard Airport, Longyear) and Badanjilin (Alxa Right Banner Badanjilin Airport) is 3529 miles / 5679 kilometers / 3067 nautical miles.

Svalbard Airport, Longyear – Alxa Right Banner Badanjilin Airport

3529 miles / 5679 kilometers / 3067 nautical miles

Distance from Longyearbyen to Badanjilin

There are several ways to calculate the distance from Longyearbyen to Badanjilin. Here are two standard methods:

Vincenty's formula (applied above)
  • 3528.929 miles
  • 5679.260 kilometers
  • 3066.555 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
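
As a rough illustration of the method, here is a self-contained Python sketch of the standard Vincenty inverse iteration on the WGS-84 ellipsoid. The function name, convergence tolerance and ellipsoid constants are assumptions; the calculator's own implementation is not published.

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2):
        """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
        a = 6378137.0            # semi-major axis (m)
        f = 1 / 298.257223563    # flattening
        b = (1 - f) * a          # semi-minor axis (m)

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):                      # iterate until lambda converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                                  (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
            if sin_sigma == 0:
                return 0.0                        # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos_sq_alpha = 1 - sin_alpha ** 2
            cos2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha) if cos_sq_alpha else 0.0
            C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
        B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
        delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos2sm ** 2) -
            B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
        return b * A * (sigma - delta_sigma)      # ellipsoidal distance in metres

    # Airport coordinates from the "Airport information" section, in decimal degrees
    lyr = (78.2458, 15.4656)    # Svalbard Airport, Longyear (LYR)
    rht = (39.2250, 101.5458)   # Alxa Right Banner Badanjilin Airport (RHT)
    print(round(vincenty_distance(*lyr, *rht) / 1000, 1))  # ≈ 5679 km, close to the figure above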

Haversine formula
  • 3520.838 miles
  • 5666.239 kilometers
  • 3059.524 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance, i.e. the shortest path between the two points along the surface of the sphere.
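
For comparison, a minimal haversine sketch in Python, assuming the commonly used mean Earth radius of 6371 km (the calculator's exact radius is not stated):

    import math

    def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a sphere of mean radius radius_km."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        return 2 * radius_km * math.asin(math.sqrt(h))

    print(round(haversine_distance(78.2458, 15.4656, 39.2250, 101.5458), 1))  # ≈ 5666 km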

How long does it take to fly from Longyearbyen to Badanjilin?

The estimated flight time from Svalbard Airport, Longyear to Alxa Right Banner Badanjilin Airport is 7 hours and 10 minutes.
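
The page does not say how the flight time is derived. A common rule of thumb for calculators like this is cruise time at a typical airliner speed plus a fixed allowance for take-off and landing; the sketch below uses assumed values (500 mph and 30 minutes), so it gives only a ballpark figure rather than the exact 7 hours 10 minutes quoted above.

    def rough_flight_time(distance_miles, cruise_mph=500, extra_minutes=30):
        """Rule-of-thumb estimate: cruise time plus a fixed take-off/landing allowance.
        Both parameters are assumptions, not the calculator's published inputs."""
        total_minutes = distance_miles / cruise_mph * 60 + extra_minutes
        hours, minutes = divmod(round(total_minutes), 60)
        return f"{hours} h {minutes} min"

    print(rough_flight_time(3529))  # ballpark only; the page quotes 7 h 10 min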

Flight carbon footprint between Svalbard Airport, Longyear (LYR) and Alxa Right Banner Badanjilin Airport (RHT)

On average, flying from Longyearbyen to Badanjilin generates about 398 kg of CO2 per passenger, which is equivalent to 878 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
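
The pound figure is simply a unit conversion of the kilogram estimate (1 kg ≈ 2.20462 lb):

    co2_kg = 398                # per-passenger estimate quoted above
    co2_lb = co2_kg * 2.20462   # standard kg-to-lb conversion factor
    print(round(co2_lb))        # 877; the page's 878 likely reflects rounding from an unrounded kg estimate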

Map of flight path from Longyearbyen to Badanjilin

See the map of the shortest flight path between Svalbard Airport, Longyear (LYR) and Alxa Right Banner Badanjilin Airport (RHT).

Airport information

Origin: Svalbard Airport, Longyear
City: Longyearbyen
Country: Norway
IATA Code: LYR
ICAO Code: ENSB
Coordinates: 78°14′45″N, 15°27′56″E

Destination: Alxa Right Banner Badanjilin Airport
City: Badanjilin
Country: China
IATA Code: RHT
ICAO Code: ZBAR
Coordinates: 39°13′30″N, 101°32′45″E