How far is Bijie from Mörön?

The distance between Mörön (Mörön Airport) and Bijie (Bijie Feixiong Airport) is 1571 miles / 2528 kilometers / 1365 nautical miles.

The driving distance from Mörön (MXV) to Bijie (BFJ) is 2215 miles / 3564 kilometers, and travel time by car is about 45 hours 40 minutes.

Mörön Airport – Bijie Feixiong Airport

1571 miles / 2528 kilometers / 1365 nautical miles

Distance from Mörön to Bijie

There are several ways to calculate the distance from Mörön to Bijie. Here are two standard methods:

Vincenty's formula (applied above)
  • 1571.064 miles
  • 2528.382 kilometers
  • 1365.217 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
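
As a check on the figure above, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name is mine, and the decimal coordinates are converted from the DMS values listed under Airport information below; the site's own implementation may round differently.

```python
import math

A = 6378137.0            # WGS-84 semi-major axis (meters)
F = 1 / 298.257223563    # WGS-84 flattening
B = (1 - F) * A          # semi-minor axis (meters)

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance between two lat/lon points on the WGS-84 ellipsoid, in miles."""
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = math.sin(U1), math.cos(U1), math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):  # iterate the longitude on the auxiliary sphere to convergence
        sin_sig = math.hypot(cosU2 * math.sin(lam),
                             cosU1 * sinU2 - sinU1 * cosU2 * math.cos(lam))
        if sin_sig == 0:
            return 0.0  # coincident points
        cos_sig = sinU1 * sinU2 + cosU1 * cosU2 * math.cos(lam)
        sigma = math.atan2(sin_sig, cos_sig)
        sin_alpha = cosU1 * cosU2 * math.sin(lam) / sin_sig
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sig - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sig * (cos_2sm + C * cos_sig * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    A_ = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (175 - 47 * u2)))
    B_ = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sig = B_ * sin_sig * (cos_2sm + B_ / 4 * (
        cos_sig * (2 * cos_2sm ** 2 - 1)
        - B_ / 6 * cos_2sm * (4 * sin_sig ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
    return B * A_ * (sigma - d_sig) / 1609.344  # meters -> statute miles

# MXV (49°39′47″N, 100°5′56″E) to BFJ (27°16′1″N, 105°28′19″E)
print(round(vincenty_miles(49.6631, 100.0989, 27.2669, 105.4719), 1))  # ≈ 1571
```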

Haversine formula
  • 1573.492 miles
  • 2532.289 kilometers
  • 1367.327 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between the two points along the earth's surface.
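
Here is a minimal sketch of that computation in Python, using a mean earth radius of 3,958.8 miles (about 6,371 km); the function name and the radius value are my assumptions, not the site's:

```python
import math

EARTH_RADIUS_MI = 3958.8  # mean earth radius; the spherical approximation

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_MI * math.asin(math.sqrt(a))

print(round(haversine_miles(49.6631, 100.0989, 27.2669, 105.4719), 1))  # ≈ 1573
```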

How long does it take to fly from Mörön to Bijie?

The estimated flight time from Mörön Airport to Bijie Feixiong Airport is 3 hours and 28 minutes.
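
A common rule of thumb for such estimates adds roughly 30 minutes of taxi, climb, and descent to cruise time at about 500 mph. The sketch below uses those assumed constants, which are not necessarily the ones behind the 3 hours 28 minutes figure above:

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_hours=0.5):
    """Rule-of-thumb flight time: fixed taxi/climb/descent overhead plus cruise."""
    hours = overhead_hours + distance_miles / cruise_mph
    return int(hours), round(hours % 1 * 60)

h, m = estimated_flight_time(1571)
print(f"{h} h {m} min")  # 3 h 39 min with these constants; the site evidently uses slightly different ones
```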

What is the time difference between Mörön and Bijie?

There is no time difference between Mörön and Bijie.

Flight carbon footprint between Mörön Airport (MXV) and Bijie Feixiong Airport (BFJ)

On average, flying from Mörön to Bijie generates about 184 kg (406 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
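
The pound figure is plain unit arithmetic; a quick sketch using the exact kilogram-per-pound definition (the 184 kg value is the page's own estimate):

```python
KG_PER_LB = 0.45359237            # exact, by definition of the pound

co2_kg = 184                      # per-passenger estimate quoted above
print(round(co2_kg / KG_PER_LB))  # 406 lbs
```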

Map of flight path and driving directions from Mörön to Bijie

See the map of the shortest flight path between Mörön Airport (MXV) and Bijie Feixiong Airport (BFJ).

Airport information

Origin: Mörön Airport
City: Mörön
Country: Mongolia
IATA Code: MXV
ICAO Code: ZMMN
Coordinates: 49°39′47″N, 100°5′56″E
Destination: Bijie Feixiong Airport
City: Bijie
Country: China
IATA Code: BFJ
ICAO Code: ZUBJ
Coordinates: 27°16′1″N, 105°28′19″E