
How far is Zhanjiang from Muscat?

The distance between Muscat (Muscat International Airport) and Zhanjiang (Zhanjiang Airport) is 3317 miles / 5339 kilometers / 2883 nautical miles.

The driving distance from Muscat (MCT) to Zhanjiang (ZHA) is 6363 miles / 10240 kilometers, and travel time by car is about 126 hours 4 minutes.


Distance from Muscat to Zhanjiang

There are several ways to calculate the distance from Muscat to Zhanjiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 3317.490 miles
  • 5338.983 kilometers
  • 2882.820 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
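As a sketch of how such a calculation can be done, here is a pure-Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are the decimal form of the airport coordinates listed below; the exact figure the site reports may differ slightly depending on its ellipsoid parameters and rounding.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)   # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                    * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# MCT and ZHA coordinates in decimal degrees
meters = vincenty_distance(23.593056, 58.284167, 21.214167, 110.357778)
```

Dividing the result by 1000 gives roughly the 5,339 km quoted above.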

Haversine formula
  • 3312.225 miles
  • 5330.509 kilometers
  • 2878.244 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).

How long does it take to fly from Muscat to Zhanjiang?

The estimated flight time from Muscat International Airport to Zhanjiang Airport is 6 hours and 46 minutes.
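The page does not publish its flight-time formula. A common rule of thumb, sketched below, assumes an average cruise speed of about 500 mph plus roughly 30 minutes for taxi, climb, and descent; both numbers are assumptions, so the result lands near but not exactly on the 6 hours 46 minutes quoted above.

```python
distance_miles = 3317      # great-circle distance from above
cruise_mph = 500           # assumed average cruise speed
overhead_hours = 0.5       # assumed taxi/climb/descent allowance

hours = overhead_hours + distance_miles / cruise_mph
h, m = int(hours), round((hours - int(hours)) * 60)
estimate = f"{h} hours {m} minutes"
```

Varying the assumed cruise speed by even 20–30 mph shifts the estimate by 20 minutes or more, which is why different calculators disagree on this route.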

Flight carbon footprint between Muscat International Airport (MCT) and Zhanjiang Airport (ZHA)

On average, flying from Muscat to Zhanjiang generates about 372 kg (821 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
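A back-of-envelope check using the page's own figures: 372 kg over 3,317 miles implies an emission factor of roughly 0.112 kg of CO2 per passenger-mile. That factor is derived here, not stated by the site.

```python
distance_miles = 3317
kg_per_passenger_mile = 0.112   # implied by 372 kg / 3317 miles

co2_kg = distance_miles * kg_per_passenger_mile   # roughly 372 kg
co2_lbs = co2_kg * 2.20462                        # kilograms to pounds
```

Per-passenger factors vary widely with aircraft type and load factor, so this only reproduces the site's estimate, not a universal figure.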

Map of flight path and driving directions from Muscat to Zhanjiang

See the map of the shortest flight path between Muscat International Airport (MCT) and Zhanjiang Airport (ZHA).

Airport information

Origin: Muscat International Airport
City: Muscat
Country: Oman
IATA Code: MCT
ICAO Code: OOMS
Coordinates: 23°35′35″N, 58°17′3″E

Destination: Zhanjiang Airport
City: Zhanjiang
Country: China
IATA Code: ZHA
ICAO Code: ZGZJ
Coordinates: 21°12′51″N, 110°21′28″E
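The distance formulas above take decimal degrees, while airport coordinates are usually published in degrees/minutes/seconds as listed here. A small conversion helper:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees.
    South and west hemispheres are negative."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Coordinates from the airport information above
mct = (dms_to_decimal(23, 35, 35, "N"), dms_to_decimal(58, 17, 3, "E"))
zha = (dms_to_decimal(21, 12, 51, "N"), dms_to_decimal(110, 21, 28, "E"))
```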