
How far is Jiayuguan from Muscat?

The distance between Muscat (Muscat International Airport) and Jiayuguan (Jiayuguan Airport) is 2584 miles / 4158 kilometers / 2245 nautical miles.

The driving distance from Muscat (MCT) to Jiayuguan (JGN) is 4826 miles / 7766 kilometers, and travel time by car is about 93 hours 5 minutes.

Muscat International Airport – Jiayuguan Airport

2584 miles / 4158 kilometers / 2245 nautical miles


Distance from Muscat to Jiayuguan

There are several ways to calculate the distance from Muscat to Jiayuguan. Here are two standard methods:

Vincenty's formula (applied above)
  • 2583.514 miles
  • 4157.762 kilometers
  • 2245.012 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the earth, which accounts for its flattening at the poles and is therefore more accurate than a spherical model.
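As a rough illustration, the inverse Vincenty calculation can be sketched in Python. This is a minimal, hedged implementation assuming the standard WGS-84 ellipsoid constants; production code should use a maintained geodesy library instead.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Distance in km between two points via Vincenty's inverse formula (WGS-84)."""
    a = 6378137.0            # semi-major axis, metres
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis, metres

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); cos2_alpha is 0 for equatorial lines
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0  # metres -> km

# MCT and JGN coordinates in decimal degrees
print(round(vincenty_km(23.59306, 58.28417, 39.85667, 98.34139), 1))
```

With the airport coordinates listed below, this yields approximately 4158 km, matching the figure quoted above.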

Haversine formula
  • 2580.600 miles
  • 4153.073 kilometers
  • 2242.480 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
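The haversine calculation is compact enough to show in full. The sketch below assumes a mean earth radius of 6371 km, a common convention for the spherical model:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km on a sphere of radius r (default: mean earth radius)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(h))

# MCT and JGN coordinates in decimal degrees
print(round(haversine_km(23.59306, 58.28417, 39.85667, 98.34139), 1))
```

This gives roughly 4153 km, slightly below the ellipsoidal (Vincenty) result because the spherical model ignores the earth's flattening.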

How long does it take to fly from Muscat to Jiayuguan?

The estimated flight time from Muscat International Airport to Jiayuguan Airport is 5 hours and 23 minutes.
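The site does not document its flight-time model, but the 5 hours 23 minutes figure is consistent with dividing the 2584-mile distance by an average block speed of about 480 mph (an assumption for illustration, not the site's stated method):

```python
# Assumed average block speed in mph; illustrative, not the site's documented model.
BLOCK_SPEED_MPH = 480

def flight_time(distance_miles, speed_mph=BLOCK_SPEED_MPH):
    """Return an (hours, minutes) estimate from distance and average speed."""
    total_minutes = round(distance_miles / speed_mph * 60)
    return divmod(total_minutes, 60)

hours, minutes = flight_time(2584)
print(f"{hours} hours and {minutes} minutes")  # → 5 hours and 23 minutes
```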

Flight carbon footprint between Muscat International Airport (MCT) and Jiayuguan Airport (JGN)

On average, flying from Muscat to Jiayuguan generates about 285 kg (628 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
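The pound figure follows directly from the exact definition of the avoirdupois pound (1 lb = 0.45359237 kg):

```python
KG_PER_LB = 0.45359237   # exact definition of the avoirdupois pound

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

print(round(kg_to_lb(285)))  # → 628
```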

Map of flight path and driving directions from Muscat to Jiayuguan

See the map of the shortest flight path between Muscat International Airport (MCT) and Jiayuguan Airport (JGN).

Airport information

Origin: Muscat International Airport
City: Muscat
Country: Oman
IATA Code: MCT
ICAO Code: OOMS
Coordinates: 23°35′35″N, 58°17′3″E

Destination: Jiayuguan Airport
City: Jiayuguan
Country: China
IATA Code: JGN
ICAO Code: ZLJQ
Coordinates: 39°51′24″N, 98°20′29″E
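The coordinates above are given in degrees, minutes, and seconds; the distance formulas earlier expect decimal degrees. A small conversion helper (the function name is my own, for illustration):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Muscat International Airport: 23°35′35″N, 58°17′3″E
print(round(dms_to_decimal(23, 35, 35, "N"), 5),
      round(dms_to_decimal(58, 17, 3, "E"), 5))  # → 23.59306 58.28417
```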