How far is Ji'an from Mörön?

The distance between Mörön (Mörön Airport) and Ji'an (Jinggangshan Airport) is 1756 miles / 2826 kilometers / 1526 nautical miles.

The driving distance from Mörön (MXV) to Ji'an (JGS) is 2222 miles / 3576 kilometers, and travel time by car is about 45 hours 20 minutes.

Mörön Airport – Jinggangshan Airport

1756 miles / 2826 kilometers / 1526 nautical miles

Distance from Mörön to Ji'an

There are several ways to calculate the distance from Mörön to Ji'an. Here are two standard methods:

Vincenty's formula (applied above)
  • 1755.766 miles
  • 2825.632 kilometers
  • 1525.719 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
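For anyone who wants to reproduce the Vincenty figure, below is a minimal Python sketch of the standard Vincenty inverse iteration on the WGS-84 ellipsoid. The function name is illustrative, the decimal coordinates come from the airport data at the bottom of this page, and the sketch skips the near-antipodal edge cases; it should land within a fraction of a mile of the 1755.766 figure above.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a, f = 6378137.0, 1 / 298.257223563          # WGS-84 semi-major axis (m), flattening
    b = a * (1 - f)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L
    for _ in range(200):                          # iterate lambda until it converges
        sin_s = math.hypot(math.cos(U2) * math.sin(lam),
                           math.cos(U1) * math.sin(U2)
                           - math.sin(U1) * math.cos(U2) * math.cos(lam))
        cos_s = math.sin(U1) * math.sin(U2) + math.cos(U1) * math.cos(U2) * math.cos(lam)
        sigma = math.atan2(sin_s, cos_s)
        sin_a = math.cos(U1) * math.cos(U2) * math.sin(lam) / sin_s
        cos2_a = 1 - sin_a ** 2
        cos_2sm = cos_s - 2 * math.sin(U1) * math.sin(U2) / cos2_a
        C = f / 16 * cos2_a * (4 + f * (4 - 3 * cos2_a))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_a * (
            sigma + C * sin_s * (cos_2sm + C * cos_s * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_a * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_s * (cos_2sm + B / 4 * (
        cos_s * (2 * cos_2sm ** 2 - 1)
        - B / 6 * cos_2sm * (4 * sin_s ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
    return b * A * (sigma - d_sigma) / 1609.344   # meters -> miles

# MXV -> JGS, decimal degrees converted from the airport data below
print(round(vincenty_miles(49.6631, 100.0989, 26.8567, 114.7369), 3))  # ≈ 1755.8
```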

Haversine formula
  • 1757.360 miles
  • 2828.197 kilometers
  • 1527.104 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
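The haversine computation is compact enough to show in full. Here is a minimal Python sketch using the same decimal coordinates; the 6371 km mean earth radius is a common convention, and a slightly different radius shifts the result by a few kilometers.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere with the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.atan2(math.sqrt(a), math.sqrt(1 - a))

km = haversine_km(49.6631, 100.0989, 26.8567, 114.7369)
print(round(km, 1), "km /", round(km / 1.609344, 1), "mi")  # ≈ 2828 km / 1757 mi
```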

How long does it take to fly from Mörön to Ji'an?

The estimated flight time from Mörön Airport to Jinggangshan Airport is 3 hours and 49 minutes.
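This site does not publish its flight-time formula, but a common rule of thumb (distance at an assumed average cruise speed, plus a fixed allowance for taxi, climb, and descent) reproduces the figure. The 850 km/h speed and 30-minute allowance in the sketch below are assumptions, not the calculator's documented parameters.

```python
distance_km = 2826        # great-circle distance from above
cruise_kmh = 850          # assumed average speed, not the site's documented value
overhead_min = 30         # assumed taxi/climb/descent allowance
total_min = overhead_min + distance_km / cruise_kmh * 60
print(f"{int(total_min // 60)} h {round(total_min % 60)} min")  # -> 3 h 49 min
```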

What is the time difference between Mörön and Ji'an?

There is no time difference between Mörön and Ji'an.
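Both cities sit at UTC+8 year-round: Ji'an follows China Standard Time, and Mörön's Khövsgöl province follows Ulaanbaatar time. A quick check with Python's zoneinfo module, assuming Asia/Ulaanbaatar is the applicable IANA zone for Mörön:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

now = datetime.now(timezone.utc)
for tz in ("Asia/Ulaanbaatar", "Asia/Shanghai"):
    print(tz, now.astimezone(ZoneInfo(tz)).utcoffset())  # both 8:00:00
```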

Flight carbon footprint between Mörön Airport (MXV) and Jinggangshan Airport (JGS)

On average, flying from Mörön to Ji'an generates about 197 kg of CO2 per passenger, which is equivalent to 434 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
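The unit conversion, and the per-kilometer intensity it implies, are easy to check. The per-passenger-kilometer figure in the sketch below is derived here for illustration and is not a number published by this site:

```python
co2_kg = 197
print(round(co2_kg * 2.20462))             # kg -> lb: 434
distance_km = 2826
print(round(co2_kg / distance_km * 1000))  # ≈ 70 g CO2 per passenger-km
```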

Map of flight path and driving directions from Mörön to Ji'an

See the map of the shortest flight path between Mörön Airport (MXV) and Jinggangshan Airport (JGS).

Airport information

Origin: Mörön Airport
City: Mörön
Country: Mongolia
IATA Code: MXV
ICAO Code: ZMMN
Coordinates: 49°39′47″N, 100°5′56″E

Destination: Jinggangshan Airport
City: Ji'an
Country: China
IATA Code: JGS
ICAO Code: ZSJA
Coordinates: 26°51′24″N, 114°44′13″E
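The coordinates above are in degrees/minutes/seconds. A small helper like the hypothetical dms_to_decimal below converts them to the decimal degrees used in the distance sketches earlier on this page:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Hypothetical helper: degrees/minutes/seconds -> signed decimal degrees."""
    value = deg + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

mxv = (dms_to_decimal(49, 39, 47, "N"), dms_to_decimal(100, 5, 56, "E"))
jgs = (dms_to_decimal(26, 51, 24, "N"), dms_to_decimal(114, 44, 13, "E"))
print(mxv)  # (49.66305..., 100.09888...)
print(jgs)  # (26.85666..., 114.73694...)
```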