
How far is Baise from Meghauli?

The distance between Meghauli (Meghauli Airport) and Baise (Baise Bama Airport) is 1441 miles / 2318 kilometers / 1252 nautical miles.

The driving distance from Meghauli (MEY) to Baise (AEB) is 2051 miles / 3301 kilometers, and travel time by car is about 41 hours 42 minutes.

Meghauli Airport – Baise Bama Airport

Distance: 1441 miles / 2318 kilometers / 1252 nautical miles
Flight time: 3 h 13 min
Time difference: 2 h 15 min
CO2 emission: 176 kg
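The 2 h 15 min time difference follows from the two countries' UTC offsets: Nepal observes UTC+5:45 and China UTC+8. A quick check with Python's standard zoneinfo module (the variable names are ours):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Meghauli follows Nepal Time (UTC+5:45); Baise follows China Standard Time (UTC+8).
moment = datetime(2024, 1, 1, 12, 0, tzinfo=ZoneInfo("Asia/Kathmandu"))
diff = (moment.astimezone(ZoneInfo("Asia/Shanghai")).utcoffset()
        - moment.utcoffset())
print(diff)  # 2:15:00
```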


Distance from Meghauli to Baise

There are several ways to calculate the distance from Meghauli to Baise. Here are two standard methods:

Vincenty's formula (applied above)
  • 1440.523 miles
  • 2318.298 kilometers
  • 1251.781 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
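Below is a self-contained sketch of the standard Vincenty inverse iteration on the WGS-84 ellipsoid. The function name is ours, and the decimal coordinates are converted from the DMS values listed under Airport information; treat it as an illustration of the method rather than the calculator's exact code.

```python
import math

# WGS-84 ellipsoid constants
A_AXIS = 6378137.0           # semi-major axis in meters
F = 1 / 298.257223563        # flattening
B_AXIS = (1 - F) * A_AXIS    # semi-minor axis in meters

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty inverse solution: geodesic distance in miles."""
    L = math.radians(lon2 - lon1)
    u1 = math.atan((1 - F) * math.tan(math.radians(lat1)))
    u2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    sin_u1, cos_u1 = math.sin(u1), math.cos(u1)
    sin_u2, cos_u2 = math.sin(u2), math.cos(u2)

    lam = L
    for _ in range(200):  # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cos_u2 * sin_lam,
                               cos_u1 * sin_u2 - sin_u1 * cos_u2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sin_u1 * sin_u2 + cos_u1 * cos_u2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cos_u1 * cos_u2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sin_u1 * sin_u2 / cos2_alpha
        c = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - c) * F * sin_alpha * (
            sigma + c * sin_sigma * (cos_2sm + c * cos_sigma
                                     * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    usq = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    big_a = 1 + usq / 16384 * (4096 + usq * (-768 + usq * (320 - 175 * usq)))
    big_b = usq / 1024 * (256 + usq * (-128 + usq * (74 - 47 * usq)))
    delta_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - big_b / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return B_AXIS * big_a * (sigma - delta_sigma) / 1609.344  # meters -> miles

# MEY and AEB in decimal degrees (from the DMS coordinates below)
print(round(vincenty_miles(27.5828, 84.2328, 23.7206, 106.9597), 3))  # ≈ 1440.5
```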

Haversine formula
  • 1438.279 miles
  • 2314.686 kilometers
  • 1249.830 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
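A minimal haversine sketch follows; the mean Earth radius of 3958.8 miles is an assumed value, and a different radius choice shifts the result slightly.

```python
import math

EARTH_RADIUS_MILES = 3958.8  # mean Earth radius (assumed value)

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance on a spherical Earth, in miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)
    h = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
    return 2 * EARTH_RADIUS_MILES * math.asin(math.sqrt(h))

print(round(haversine_miles(27.5828, 84.2328, 23.7206, 106.9597), 3))  # ≈ 1438.3
```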

How long does it take to fly from Meghauli to Baise?

The estimated flight time from Meghauli Airport to Baise Bama Airport is 3 hours and 13 minutes.
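The calculator does not publish its exact flight-time formula. A common back-of-the-envelope approach adds a fixed allowance for taxi, climb, and descent to cruise time at a typical jet speed; the cruise speed and overhead below are assumed values, so the result lands near, but not exactly on, the quoted 3 h 13 min.

```python
def estimated_flight_time(distance_miles: float,
                          cruise_mph: float = 500.0,      # assumed cruise speed
                          overhead_hours: float = 0.5) -> str:  # assumed taxi/climb/descent allowance
    """Rule-of-thumb flight time: fixed overhead plus cruise time."""
    hours = overhead_hours + distance_miles / cruise_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} h {m} min"

print(estimated_flight_time(1440.523))  # ≈ 3 h 23 min with these assumed parameters
```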

Flight carbon footprint between Meghauli Airport (MEY) and Baise Bama Airport (AEB)

On average, flying from Meghauli to Baise generates about 176 kg of CO2 per passenger; 176 kilograms is about 388 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
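As a sanity check on the unit conversion, using the standard kilograms-to-pounds factor:

```python
LBS_PER_KG = 2.20462262  # pounds per kilogram
co2_kg = 176
print(round(co2_kg * LBS_PER_KG))  # 388
```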

Map of flight path and driving directions from Meghauli to Baise

See the map of the shortest flight path between Meghauli Airport (MEY) and Baise Bama Airport (AEB).

Airport information

Origin: Meghauli Airport
City: Meghauli
Country: Nepal
IATA Code: MEY
ICAO Code: VNMG
Coordinates: 27°34′58″N, 84°13′58″E
Destination: Baise Bama Airport
City: Baise
Country: China
IATA Code: AEB
ICAO Code: ZGBS
Coordinates: 23°43′14″N, 106°57′35″E
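The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier need decimal degrees. A small parser sketch; the regular expression assumes the exact DMS notation used on this page.

```python
import re

DMS_PATTERN = re.compile(r"(\d+)°(\d+)′(\d+)″([NSEW])")

def dms_to_decimal(dms: str) -> float:
    """Convert e.g. 27°34′58″N to signed decimal degrees."""
    deg, minutes, seconds, hemisphere = DMS_PATTERN.match(dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemisphere in "SW" else value

print(dms_to_decimal("27°34′58″N"), dms_to_decimal("84°13′58″E"))   # ≈ 27.5828 84.2328 (MEY)
print(dms_to_decimal("23°43′14″N"), dms_to_decimal("106°57′35″E"))  # ≈ 23.7206 106.9597 (AEB)
```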