
How far is Zhanjiang from Meghauli?

The distance between Meghauli (Meghauli Airport) and Zhanjiang (Zhanjiang Airport) is 1700 miles / 2736 kilometers / 1477 nautical miles.

The driving distance from Meghauli (MEY) to Zhanjiang (ZHA) is 2360 miles / 3798 kilometers, and travel time by car is about 47 hours 4 minutes.

Meghauli Airport – Zhanjiang Airport

Distance: 1700 miles / 2736 kilometers / 1477 nautical miles
Flight time: 3 h 43 min
CO2 emission: 193 kg


Distance from Meghauli to Zhanjiang

There are several ways to calculate the distance from Meghauli to Zhanjiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 1700.258 miles
  • 2736.300 kilometers
  • 1477.484 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
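For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name, iteration limit, and mile conversion are illustrative choices, not the calculator's actual code:

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in statute miles."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate lambda until convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0         # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # equatorial-line special case
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (
        cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
            * (-3 + 4 * cos2SigmaM ** 2)))
    meters = b * A * (sigma - deltaSigma)
    return meters / 1609.344   # metres -> statute miles

# MEY and ZHA coordinates in decimal degrees (see "Airport information" below)
print(round(vincenty_miles(27.5828, 84.2328, 21.2142, 110.3578), 3))  # ≈ 1700 miles
```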

Haversine formula
  • 1698.017 miles
  • 2732.694 kilometers
  • 1475.537 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
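A haversine implementation is much shorter. The sketch below assumes a mean Earth radius of 3,958.8 miles and uses the airport coordinates listed under "Airport information", converted to decimal degrees:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
    """Great-circle distance on a sphere, in miles (mean Earth radius assumed)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_mi * math.asin(math.sqrt(a))

# Meghauli (MEY) and Zhanjiang (ZHA) in decimal degrees
mey = (27.5828, 84.2328)   # 27°34′58″N, 84°13′58″E
zha = (21.2142, 110.3578)  # 21°12′51″N, 110°21′28″E
print(round(haversine_miles(*mey, *zha), 1))  # ≈ 1698 miles
```

Because the haversine formula treats the Earth as a perfect sphere, its result is about two miles shorter here than Vincenty's ellipsoidal figure.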

How long does it take to fly from Meghauli to Zhanjiang?

The estimated flight time from Meghauli Airport to Zhanjiang Airport is 3 hours and 43 minutes.
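Such estimates are typically derived from the distance, an assumed average speed, and a fixed allowance for takeoff and landing. The sketch below uses assumed parameters (500 mph cruise, 30 minutes overhead); the calculator's exact values are not published, so its result differs slightly:

```python
def flight_time_minutes(distance_mi, cruise_mph=500, overhead_min=30):
    """Crude flight-time estimate: fixed taxi/climb/descent overhead plus cruise."""
    return overhead_min + distance_mi / cruise_mph * 60

t = flight_time_minutes(1700)
print(f"{int(t // 60)} h {int(t % 60)} min")  # 3 h 54 min with these assumed parameters
```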

Flight carbon footprint between Meghauli Airport (MEY) and Zhanjiang Airport (ZHA)

On average, flying from Meghauli to Zhanjiang generates about 193 kg of CO2 per passenger, which is equivalent to about 425 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
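The kilogram-to-pound conversion is straightforward; as a quick check:

```python
co2_kg = 193
co2_lb = co2_kg * 2.20462  # 1 kg ≈ 2.20462 lb
print(round(co2_lb))       # ≈ 425 lb
```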

Map of flight path and driving directions from Meghauli to Zhanjiang

See the map of the shortest flight path between Meghauli Airport (MEY) and Zhanjiang Airport (ZHA).

Airport information

Origin: Meghauli Airport
City: Meghauli
Country: Nepal
IATA Code: MEY
ICAO Code: VNMG
Coordinates: 27°34′58″N, 84°13′58″E
Destination: Zhanjiang Airport
City: Zhanjiang
Country: China
IATA Code: ZHA
ICAO Code: ZGZJ
Coordinates: 21°12′51″N, 110°21′28″E
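To feed these coordinates into the distance formulas above, they first need converting from degrees/minutes/seconds to decimal degrees. A small helper (hypothetical, for illustration):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(round(dms_to_decimal(27, 34, 58, "N"), 4))   # 27.5828 (Meghauli latitude)
print(round(dms_to_decimal(110, 21, 28, "E"), 4))  # 110.3578 (Zhanjiang longitude)
```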