
How far is Badanjilin from Magong?

The distance between Magong (Penghu Airport) and Badanjilin (Alxa Right Banner Badanjilin Airport) is 1513 miles / 2434 kilometers / 1314 nautical miles.


Distance from Magong to Badanjilin

There are several ways to calculate the distance from Magong to Badanjilin. Here are two standard methods:

Vincenty's formula (applied above)
  • 1512.512 miles
  • 2434.152 kilometers
  • 1314.337 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
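
As a rough cross-check (not this site's code), an ellipsoidal distance very close to the Vincenty figure can be reproduced with a library such as geopy, whose geodesic() routine uses Karney's algorithm on the WGS-84 ellipsoid and agrees with Vincenty's formula to within a metre or so on a route like this. The decimal coordinates below are converted from the airport information at the bottom of this page:

    # Rough cross-check of the ellipsoidal distance (not this site's code).
    # geopy's geodesic() uses Karney's algorithm on the WGS-84 ellipsoid,
    # which agrees with Vincenty's formula to well under a metre here.
    from geopy.distance import geodesic

    mzg = (23.5686, 119.6278)   # Penghu Airport (MZG), decimal degrees
    rht = (39.2250, 101.5458)   # Alxa Right Banner Badanjilin Airport (RHT)

    d = geodesic(mzg, rht)
    print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} nm")
    # Output should be close to the Vincenty figures above (~1512.5 mi / ~2434.2 km).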

Haversine formula
  • 1513.202 miles
  • 2435.262 kilometers
  • 1314.936 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
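
As a minimal sketch (again, not this site's code), the haversine figures above can be reproduced in a few lines of Python, assuming a mean Earth radius of 6371 km and using the airport coordinates listed in the airport information section:

    # Minimal haversine sketch using a mean Earth radius of 6371 km and the
    # airport coordinates listed under "Airport information".
    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
        """Great-circle distance in kilometres between two lat/lon points."""
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlmb = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
        return 2 * r * asin(sqrt(a))

    km = haversine_km(23.5686, 119.6278, 39.2250, 101.5458)   # MZG -> RHT
    print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} nm")
    # Prints roughly 2435 km / 1513 mi / 1315 nm, matching the haversine figures above.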

How long does it take to fly from Magong to Badanjilin?

The estimated flight time from Penghu Airport to Alxa Right Banner Badanjilin Airport is 3 hours and 21 minutes.
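
The page does not state how this estimate is derived. A common rule of thumb is a fixed allowance for taxi, climb and descent plus cruise time at an assumed average speed; the sketch below uses assumed values (0.5 hours of overhead and a 500 mph average speed), so it only lands in the same ballpark as the figure above:

    # Illustrative only: fixed taxi/climb/descent allowance plus cruise time at
    # an assumed average speed. The 0.5 h overhead and 500 mph speed are
    # assumptions, not the parameters this site uses, so the result is only
    # roughly comparable to the 3 h 21 min shown above.
    distance_miles = 1513
    cruise_mph = 500          # assumed average cruise ground speed
    overhead_hours = 0.5      # assumed allowance for taxi, climb and descent

    hours = overhead_hours + distance_miles / cruise_mph
    h, m = int(hours), round((hours - int(hours)) * 60)
    print(f"Estimated flight time: about {h} h {m} min")   # ~3 h 32 min with these assumptions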

What is the time difference between Magong and Badanjilin?

There is no time difference between Magong and Badanjilin: Magong (Taiwan) and Badanjilin (China) both observe UTC+8.
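
A quick check with Python's zoneinfo (Magong falls under the Asia/Taipei zone, Badanjilin under Asia/Shanghai) confirms the matching offsets:

    # Quick check that both ends of the route share the same UTC offset.
    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo

    now = datetime.now(timezone.utc)
    for name in ("Asia/Taipei", "Asia/Shanghai"):
        print(name, now.astimezone(ZoneInfo(name)).utcoffset())   # 8:00:00 for both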

Flight carbon footprint between Penghu Airport (MZG) and Alxa Right Banner Badanjilin Airport (RHT)

On average, flying from Magong to Badanjilin generates about 180 kg (397 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
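
For reference, the unit conversion and the fuel burn implied by the 180 kg figure can be sketched as follows; the factor of roughly 3.16 kg of CO2 per kg of jet fuel is a commonly used approximation, not a number taken from this site:

    # Illustrative conversion of the per-passenger CO2 estimate. The 3.16 kg CO2
    # per kg of jet fuel factor is a common approximation, not a site figure.
    co2_kg = 180
    co2_lb = co2_kg * 2.20462
    fuel_kg = co2_kg / 3.16          # implied jet fuel burned per passenger

    print(f"{co2_kg} kg CO2 is about {co2_lb:.0f} lb CO2")            # ~397 lb
    print(f"implied fuel burn: about {fuel_kg:.0f} kg per passenger")  # ~57 kg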

Map of flight path from Magong to Badanjilin

See the map of the shortest flight path between Penghu Airport (MZG) and Alxa Right Banner Badanjilin Airport (RHT).

Airport information

Origin: Penghu Airport
City: Magong
Country: Taiwan
IATA Code: MZG
ICAO Code: RCQC
Coordinates: 23°34′7″N, 119°37′40″E

Destination: Alxa Right Banner Badanjilin Airport
City: Badanjilin
Country: China
IATA Code: RHT
ICAO Code: ZBAR
Coordinates: 39°13′30″N, 101°32′45″E
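
The coordinates above are given in degrees, minutes and seconds; the distance formulas earlier on this page work with decimal degrees, which a small helper (not this site's code) can produce:

    # Convert the degree/minute/second coordinates listed above into the
    # decimal degrees used by the distance formulas.
    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (deg + minutes / 60 + seconds / 3600)

    print(dms_to_decimal(23, 34, 7, "N"), dms_to_decimal(119, 37, 40, "E"))    # ~23.5686, 119.6278 (MZG)
    print(dms_to_decimal(39, 13, 30, "N"), dms_to_decimal(101, 32, 45, "E"))   # ~39.2250, 101.5458 (RHT)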