
How far is Qiqihar from Memanbetsu?

The distance between Memanbetsu (Memanbetsu Airport) and Qiqihar (Qiqihar Sanjiazi Airport) is 1006 miles / 1619 kilometers / 874 nautical miles.

The driving distance from Memanbetsu (MMB) to Qiqihar (NDG) is 2673 miles / 4301 kilometers, and travel time by car is about 55 hours 48 minutes.

Memanbetsu Airport – Qiqihar Sanjiazi Airport

1006 miles / 1619 kilometers / 874 nautical miles


Distance from Memanbetsu to Qiqihar

There are several ways to calculate the distance from Memanbetsu to Qiqihar. Here are two standard methods:

Vincenty's formula (applied above)
  • 1006.111 miles
  • 1619.179 kilometers
  • 874.287 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
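A sketch of Vincenty's inverse method on the WGS-84 ellipsoid, in Python. The function name, tolerance, and iteration cap are choices for this illustration; the algorithm itself follows the standard iterative formulation (it can fail to converge for nearly antipodal points, which is not handled here).

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse solution on the WGS-84 ellipsoid; returns kilometers."""
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (meters)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(phi1))
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L  # initial guess for the difference in longitude on the sphere
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    # Closed-form correction terms once lambda has converged
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                               * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# MMB (43°52′50″N, 144°9′50″E) to NDG (47°14′22″N, 123°55′4″E)
print(round(vincenty_km(43.880556, 144.163889, 47.239444, 123.917778), 3))
```

Run against the two airports' coordinates, this lands within about a kilometer of the 1619.179 km figure quoted above.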

Haversine formula
  • 1003.448 miles
  • 1614.893 kilometers
  • 871.973 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
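The haversine calculation is compact enough to show in full. This is a minimal Python sketch, assuming a mean Earth radius of 6371 km (the function name and parameters are choices for this illustration):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# MMB (43°52′50″N, 144°9′50″E) to NDG (47°14′22″N, 123°55′4″E)
print(round(haversine_km(43.880556, 144.163889, 47.239444, 123.917778), 3))
```

With these coordinates the result agrees with the 1614.893 km figure above to within a kilometer; the small gap from the Vincenty result reflects the spherical rather than ellipsoidal model.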

How long does it take to fly from Memanbetsu to Qiqihar?

The estimated flight time from Memanbetsu Airport to Qiqihar Sanjiazi Airport is 2 hours and 24 minutes.
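Flight-time estimates of this kind are typically derived from distance. As an illustration only (this is a common rule of thumb, not the calculator's published method), one can assume roughly 30 minutes of taxi, climb, and descent plus cruise at about 500 mph:

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_hours=0.5):
    """Rough flight-time heuristic: fixed overhead plus cruise time.

    cruise_mph and overhead_hours are assumed values for illustration.
    """
    hours = overhead_hours + distance_miles / cruise_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} h {m} min"

print(estimate_flight_time(1006))  # → "2 h 31 min"
```

For the 1006-mile MMB–NDG route this heuristic gives about 2 h 31 min, in the same ballpark as the 2 h 24 min quoted above.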

Flight carbon footprint between Memanbetsu Airport (MMB) and Qiqihar Sanjiazi Airport (NDG)

On average, flying from Memanbetsu to Qiqihar generates about 151 kg of CO2 per passenger, which is equivalent to 333 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
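The kilograms-to-pounds conversion behind that figure is straightforward:

```python
KG_TO_LBS = 2.20462  # pounds per kilogram

co2_kg = 151                      # estimated CO2 per passenger, in kg
co2_lbs = round(co2_kg * KG_TO_LBS)
print(co2_lbs)  # → 333
```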

Map of flight path and driving directions from Memanbetsu to Qiqihar

See the map of the shortest flight path between Memanbetsu Airport (MMB) and Qiqihar Sanjiazi Airport (NDG).

Airport information

Origin Memanbetsu Airport
City: Memanbetsu
Country: Japan
IATA Code: MMB
ICAO Code: RJCM
Coordinates: 43°52′50″N, 144°9′50″E
Destination Qiqihar Sanjiazi Airport
City: Qiqihar
Country: China
IATA Code: NDG
ICAO Code: ZYQQ
Coordinates: 47°14′22″N, 123°55′4″E
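The coordinates above are given in degrees, minutes, and seconds; distance formulas like those earlier on this page need them in decimal degrees. A small conversion sketch (the function name is a choice for this illustration):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Memanbetsu Airport latitude: 43°52′50″N
print(round(dms_to_decimal(43, 52, 50, "N"), 4))  # → 43.8806
```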