
How far is Memanbetsu from Xingyi?

The distance between Xingyi (Xingyi Wanfenglin Airport) and Memanbetsu (Memanbetsu Airport) is 2552 miles / 4107 kilometers / 2217 nautical miles.

The driving distance from Xingyi (ACX) to Memanbetsu (MMB) is 4060 miles / 6534 kilometers, and travel time by car is about 80 hours 34 minutes.

Xingyi Wanfenglin Airport – Memanbetsu Airport

2552 miles / 4107 kilometers / 2217 nautical miles


Distance from Xingyi to Memanbetsu

There are several ways to calculate the distance from Xingyi to Memanbetsu. Here are two standard methods:

Vincenty's formula (applied above)
  • 2551.801 miles
  • 4106.726 kilometers
  • 2217.454 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
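As a rough illustration, an ellipsoidal-earth distance can be reproduced with a library such as geopy (an assumption here, not something this page uses). Its geodesic routine is not Vincenty's exact iteration, but it also works on an ellipsoidal model and produces nearly identical figures. The decimal coordinates below are converted from the airport coordinates listed at the bottom of the page.

```python
# Ellipsoidal distance between ACX and MMB using geopy's geodesic routine
# (an ellipsoidal-earth calculation comparable to Vincenty's formula).
from geopy.distance import geodesic

xingyi = (25.0864, 104.9592)      # Xingyi Wanfenglin Airport (ACX), approx. decimal degrees
memanbetsu = (43.8806, 144.1639)  # Memanbetsu Airport (MMB), approx. decimal degrees

d = geodesic(xingyi, memanbetsu)
print(f"{d.miles:.1f} miles / {d.km:.1f} km / {d.nautical:.1f} NM")
# Expect values close to 2552 miles / 4107 km / 2217 NM.
```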

Haversine formula
  • 2549.221 miles
  • 4102.573 kilometers
  • 2215.212 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
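The great-circle figure can be reproduced with a short haversine implementation. This sketch assumes a mean earth radius of 6371 km, which is why the result differs slightly from the ellipsoidal value above.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a spherical earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(25.0864, 104.9592, 43.8806, 144.1639)
print(f"{km:.1f} km / {km * 0.621371:.1f} miles / {km / 1.852:.1f} NM")
# Roughly 4103 km / 2549 miles / 2215 NM, matching the haversine figures above.
```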

How long does it take to fly from Xingyi to Memanbetsu?

The estimated flight time from Xingyi Wanfenglin Airport to Memanbetsu Airport is 5 hours and 19 minutes.
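The page does not state how it derives this flight time. A common rule of thumb is to divide the great-circle distance by an average cruise speed and add a fixed allowance for takeoff and landing; the speed and allowance below are illustrative assumptions only, so the result lands in the same ballpark rather than matching the 5 hours 19 minutes exactly.

```python
def estimated_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rough flight-time estimate: cruise time plus a fixed takeoff/landing allowance.
    Both parameters are illustrative assumptions, not the calculator's actual inputs."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return hours, minutes

print(estimated_flight_time(2552))  # (5, 36) with these assumed parameters
```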

Flight carbon footprint between Xingyi Wanfenglin Airport (ACX) and Memanbetsu Airport (MMB)

On average, flying from Xingyi to Memanbetsu generates about 281 kg of CO2 per passenger, which is roughly 620 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
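A very rough way to model this is a per-passenger emission factor applied to the flight distance. The factor below (about 0.11 kg of CO2 per mile) is simply back-calculated from the figures on this page and is only illustrative.

```python
KG_PER_MILE = 281 / 2552  # implied per-passenger factor from this page (~0.11 kg CO2 per mile)
KG_TO_LBS = 2.20462

def co2_per_passenger_kg(distance_miles, factor=KG_PER_MILE):
    """Very rough per-passenger CO2 estimate covering jet-fuel burn only."""
    return distance_miles * factor

kg = co2_per_passenger_kg(2552)
print(f"{kg:.0f} kg CO2 per passenger ({kg * KG_TO_LBS:.0f} lbs)")
# About 281 kg per passenger; the page rounds the pound figure to 620 lbs.
```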

Map of flight path and driving directions from Xingyi to Memanbetsu

See the map of the shortest flight path between Xingyi Wanfenglin Airport (ACX) and Memanbetsu Airport (MMB).

Airport information

Origin Xingyi Wanfenglin Airport
City: Xingyi
Country: China
IATA Code: ACX
ICAO Code: ZUYI
Coordinates: 25°5′11″N, 104°57′33″E
Destination Memanbetsu Airport
City: Memanbetsu
Country: Japan
IATA Code: MMB
ICAO Code: RJCM
Coordinates: 43°52′50″N, 144°9′50″E
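
The airport coordinates are given in degrees, minutes, and seconds. A small helper like the one below (the function name is illustrative) converts them to the decimal degrees used in the distance sketches above.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere="N"):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

print(dms_to_decimal(25, 5, 11, "N"), dms_to_decimal(104, 57, 33, "E"))  # ACX ≈ 25.0864, 104.9592
print(dms_to_decimal(43, 52, 50, "N"), dms_to_decimal(144, 9, 50, "E"))  # MMB ≈ 43.8806, 144.1639
```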