
How far is Meixian from Sharjah?

The distance between Sharjah (Sharjah International Airport) and Meixian (Meixian Airport) is 3773 miles / 6073 kilometers / 3279 nautical miles.

The driving distance from Sharjah (SHJ) to Meixian (MXZ) is 6503 miles / 10465 kilometers, and travel time by car is about 123 hours 26 minutes.

Sharjah International Airport – Meixian Airport

  • 3773 miles
  • 6073 kilometers
  • 3279 nautical miles


Distance from Sharjah to Meixian

There are several ways to calculate the distance from Sharjah to Meixian. Here are two standard methods:

Vincenty's formula (applied above)
  • 3773.320 miles
  • 6072.569 kilometers
  • 3278.925 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
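As a rough illustration, the iterative Vincenty inverse computation can be sketched in Python. The WGS-84 ellipsoid parameters and the airport coordinates (converted to decimal degrees from the airport information section below) are standard; the iteration follows the commonly published form of the algorithm, not this site's exact implementation.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Geodesic distance on the WGS-84 ellipsoid, in kilometers."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate the longitude on the auxiliary sphere
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000  # meters -> km

# SHJ (25°19′42″N, 55°31′1″E) to MXZ (24°21′0″N, 116°7′58″E)
print(round(vincenty_km(25.328333, 55.516944, 24.35, 116.132778), 1))
```

Running this on the two airports should land close to the 6072.6 km figure quoted above.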

Haversine formula
  • 3766.901 miles
  • 6062.239 kilometers
  • 3273.347 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
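The haversine computation fits in a few lines of Python. The mean Earth radius of 6371 km is a common convention (the site's exact radius is not stated), and the coordinates are the airport positions from the airport information section below.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere of mean radius 6371 km."""
    R = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# SHJ -> MXZ
km = haversine_km(25.328333, 55.516944, 24.35, 116.132778)
print(round(km, 1))              # about 6062 km, matching the figure above
print(round(km * 0.621371, 1))  # the same distance in miles
```

Note the small gap between this result and Vincenty's: treating the Earth as a sphere rather than an ellipsoid shifts the answer by roughly 10 km (about 0.17%) on this route.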

How long does it take to fly from Sharjah to Meixian?

The estimated flight time from Sharjah International Airport to Meixian Airport is 7 hours and 38 minutes.
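A flight-time estimate of this kind can be approximated from the distance alone. The back-of-envelope model below (an assumption for illustration, not this site's published method) divides the great-circle distance by an average block speed of about 500 mph.

```python
def flight_time_hm(distance_miles, block_speed_mph=500):
    """Hypothetical estimate: distance over an assumed average block speed."""
    hours = distance_miles / block_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

h, m = flight_time_hm(3773)
print(f"{h} h {m} min")  # 7 h 33 min, within a few minutes of the 7 h 38 min above
```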

Flight carbon footprint between Sharjah International Airport (SHJ) and Meixian Airport (MXZ)

On average, flying from Sharjah to Meixian generates about 428 kg (944 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
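The kilogram-to-pound conversion checks out with the standard factor of 1 kg ≈ 2.20462 lb; only the emission figure itself comes from this page.

```python
co2_kg = 428                 # per-passenger estimate from this page
co2_lbs = co2_kg * 2.20462   # 1 kg = 2.20462 lb
print(round(co2_lbs))        # 944
```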

Map of flight path and driving directions from Sharjah to Meixian

See the map of the shortest flight path between Sharjah International Airport (SHJ) and Meixian Airport (MXZ).

Airport information

Origin: Sharjah International Airport
City: Sharjah
Country: United Arab Emirates
IATA Code: SHJ
ICAO Code: OMSJ
Coordinates: 25°19′42″N, 55°31′1″E
Destination: Meixian Airport
City: Meixian
Country: China
IATA Code: MXZ
ICAO Code: ZGMX
Coordinates: 24°21′0″N, 116°7′58″E