How far is Badanjilin from Saipan?

The distance between Saipan (Saipan International Airport) and Badanjilin (Alxa Right Banner Badanjilin Airport) is 3143 miles / 5058 kilometers / 2731 nautical miles.

Saipan International Airport – Alxa Right Banner Badanjilin Airport
3143 miles / 5058 kilometers / 2731 nautical miles

Distance from Saipan to Badanjilin

There are several ways to calculate the distance from Saipan to Badanjilin. Here are two standard methods:

Vincenty's formula (applied above)
  • 3143.051 miles
  • 5058.250 kilometers
  • 2731.236 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
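
The calculator's own implementation isn't published. Below is a minimal Python sketch of the standard inverse Vincenty iteration on the WGS-84 ellipsoid; the coordinates, in decimal degrees, come from the airport information at the bottom of this page.

```python
import math

# WGS-84 ellipsoid parameters
WGS84_A = 6378137.0                 # semi-major axis in meters
WGS84_F = 1 / 298.257223563         # flattening
WGS84_B = (1 - WGS84_F) * WGS84_A   # semi-minor axis in meters

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty distance in statute miles between two points
    given in decimal degrees, on the WGS-84 ellipsoid."""
    f = WGS84_F
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        # cos(2 * sigma_m); zero on the equatorial line where cos2Alpha == 0
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break

    u2 = cos2Alpha * (WGS84_A ** 2 - WGS84_B ** 2) / WGS84_B ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    meters = WGS84_B * A * (sigma - deltaSigma)
    return meters / 1609.344  # meters to statute miles

# SPN (15°7′8″N, 145°43′44″E) and RHT (39°13′30″N, 101°32′45″E) in decimal degrees
print(vincenty_miles(15.118889, 145.728889, 39.225, 101.545833))  # ≈ 3143 miles
```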

Haversine formula
  • 3141.934 miles
  • 5056.453 kilometers
  • 2730.266 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
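
A minimal sketch of the haversine formula follows. The Earth radius is an assumption (6371 km is a common mean value); the calculator appears to use a slightly different radius, hence the small mismatch with the figure above.

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius; the calculator's exact value is not stated

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two points given in
    decimal degrees, assuming a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

print(haversine_km(15.118889, 145.728889, 39.225, 101.545833))  # ≈ 5057 km
```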

How long does it take to fly from Saipan to Badanjilin?

The estimated flight time from Saipan International Airport to Alxa Right Banner Badanjilin Airport is 6 hours and 27 minutes.
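
The site doesn't state how it derives this estimate. A common rule of thumb is great-circle distance at an assumed average speed plus a fixed buffer for taxi, climb, and descent; both parameters below are assumptions and don't exactly reproduce the 6 hours 27 minutes above.

```python
def estimate_flight_time(distance_miles, avg_speed_mph=500.0, buffer_hours=0.5):
    """Rule-of-thumb flight time: distance at an assumed average speed plus a
    fixed taxi/climb/descent buffer. Both parameters are assumptions, not the
    calculator's actual values."""
    hours = distance_miles / avg_speed_mph + buffer_hours
    h, m = divmod(round(hours * 60), 60)
    return f"{h} hours and {m} minutes"

print(estimate_flight_time(3143.051))  # "6 hours and 47 minutes" with these assumptions
```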

Flight carbon footprint between Saipan International Airport (SPN) and Alxa Right Banner Badanjilin Airport (RHT)

On average, flying from Saipan to Badanjilin generates about 351 kg of CO2 per passenger; 351 kilograms equals about 774 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
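
A per-passenger figure like this is typically distance multiplied by an emission factor. The factor below is simply back-calculated from this route's numbers, not the calculator's published methodology.

```python
KG_CO2_PER_PASSENGER_MILE = 351 / 3143.051  # ≈ 0.112, back-calculated from this route
LBS_PER_KG = 2.20462

def co2_per_passenger(distance_miles):
    """Estimated CO2 from burning jet fuel, per passenger, assuming a linear
    per-mile emission factor (an assumption back-calculated from this page)."""
    kg = distance_miles * KG_CO2_PER_PASSENGER_MILE
    return kg, kg * LBS_PER_KG

kg, lbs = co2_per_passenger(3143.051)
print(f"{kg:.0f} kg CO2 ≈ {lbs:.0f} lbs")  # 351 kg CO2 ≈ 774 lbs
```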

Map of flight path from Saipan to Badanjilin

See the map of the shortest flight path between Saipan International Airport (SPN) and Alxa Right Banner Badanjilin Airport (RHT).

Airport information

Origin: Saipan International Airport
City: Saipan
Country: Northern Mariana Islands
IATA Code: SPN
ICAO Code: PGSN
Coordinates: 15°7′8″N, 145°43′44″E

Destination: Alxa Right Banner Badanjilin Airport
City: Badanjilin
Country: China
IATA Code: RHT
ICAO Code: ZBAR
Coordinates: 39°13′30″N, 101°32′45″E
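
The distance formulas above take decimal degrees, while the coordinates here are listed as degrees, minutes, and seconds. A minimal conversion sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (degrees + minutes / 60 + seconds / 3600)

# SPN: 15°7′8″N, 145°43′44″E -> (15.1189, 145.7289)
print(dms_to_decimal(15, 7, 8, "N"), dms_to_decimal(145, 43, 44, "E"))
# RHT: 39°13′30″N, 101°32′45″E -> (39.2250, 101.5458)
print(dms_to_decimal(39, 13, 30, "N"), dms_to_decimal(101, 32, 45, "E"))
```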