
How far is Beijing from Hanamaki?

The distance between Hanamaki (Hanamaki Airport) and Beijing (Beijing Daxing International Airport) is 1318 miles / 2121 kilometers / 1145 nautical miles.

The driving distance from Hanamaki (HNA) to Beijing (PKX) is 2123 miles / 3416 kilometers, and travel time by car is about 43 hours 44 minutes.

Hanamaki Airport – Beijing Daxing International Airport

1318 miles / 2121 kilometers / 1145 nautical miles


Distance from Hanamaki to Beijing

There are several ways to calculate the distance from Hanamaki to Beijing. Here are two standard methods:

Vincenty's formula (applied above)
  • 1317.842 miles
  • 2120.861 kilometers
  • 1145.173 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
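As a rough cross-check of the ellipsoidal figure, the short sketch below uses geopy's geodesic distance on the WGS-84 ellipsoid. geopy applies Karney's algorithm rather than Vincenty's, but on a route like this the two agree to well under a metre; the decimal coordinates are converted from the airport coordinates listed further down the page.

```python
# A minimal ellipsoidal (WGS-84) distance sketch, assuming geopy is installed.
# geodesic() uses Karney's algorithm, which closely matches Vincenty's result.
from geopy.distance import geodesic

hanamaki = (39.4283, 141.1347)  # HNA, from the coordinates listed below
daxing = (39.5092, 116.4106)    # PKX, from the coordinates listed below

d = geodesic(hanamaki, daxing)
print(f"{d.miles:.1f} mi / {d.km:.1f} km / {d.nautical:.1f} NM")
# Should land close to the Vincenty figures above: ~1318 mi / ~2121 km / ~1145 NM.
```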

Haversine formula
  • 1314.588 miles
  • 2115.624 kilometers
  • 1142.346 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
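The haversine formula is simple enough to write out directly. The sketch below assumes a mean Earth radius of 6,371 km and the same decimal coordinates as above.

```python
# A self-contained haversine (great-circle) distance sketch.
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(39.4283, 141.1347, 39.5092, 116.4106)  # HNA -> PKX
print(f"{km / 1.609344:.1f} mi / {km:.1f} km / {km / 1.852:.1f} NM")
# Roughly 1315 mi / 2116 km / 1142 NM, close to the haversine figures above.
```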

How long does it take to fly from Hanamaki to Beijing?

The estimated flight time from Hanamaki Airport to Beijing Daxing International Airport is 2 hours and 59 minutes.
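A common back-of-the-envelope estimate adds a fixed allowance for taxi, climb, and descent to the great-circle distance divided by an average cruise speed. The exact parameters behind the figure above are not stated, so the values in this sketch are assumptions.

```python
# A hedged flight-time estimate; the 500 mph average block speed and the
# 30-minute overhead are assumed values, not this calculator's actual model.
distance_miles = 1318
avg_speed_mph = 500   # assumed average block speed
overhead_min = 30     # assumed taxi/climb/descent allowance

total_min = overhead_min + distance_miles / avg_speed_mph * 60
print(f"{int(total_min // 60)} h {int(total_min % 60):02d} min")
# About 3 h 08 min, within roughly ten minutes of the estimate above.
```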

Flight carbon footprint between Hanamaki Airport (HNA) and Beijing Daxing International Airport (PKX)

On average, flying from Hanamaki to Beijing generates about 168 kg of CO2 per passenger, which is about 370 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
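A per-passenger CO2 figure like this is typically derived from fuel burn: fuel used per passenger multiplied by roughly 3.16 kg of CO2 per kg of jet fuel burned. The per-passenger fuel burn in the sketch below is an assumed ballpark value, not this calculator's actual input.

```python
# A hedged sanity check of the per-passenger CO2 estimate. The fuel burn per
# passenger-km is an assumed ballpark; 3.16 kg CO2 per kg of jet fuel is the
# commonly used emission factor for combustion only.
distance_km = 2121
fuel_kg_per_pax_km = 0.025   # assumed ~25 g of jet fuel per passenger-km
co2_per_kg_fuel = 3.16       # kg of CO2 released per kg of jet fuel burned

co2_kg = distance_km * fuel_kg_per_pax_km * co2_per_kg_fuel
print(f"~{co2_kg:.0f} kg CO2 per passenger (~{co2_kg * 2.20462:.0f} lbs)")
# Roughly 168 kg / 370 lbs, in line with the estimate above.
```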

Map of flight path and driving directions from Hanamaki to Beijing

See the map of the shortest flight path between Hanamaki Airport (HNA) and Beijing Daxing International Airport (PKX).

Airport information

Origin: Hanamaki Airport
City: Hanamaki
Country: Japan
IATA Code: HNA
ICAO Code: RJSI
Coordinates: 39°25′42″N, 141°8′5″E

Destination: Beijing Daxing International Airport
City: Beijing
Country: China
IATA Code: PKX
ICAO Code: ZBAD
Coordinates: 39°30′33″N, 116°24′38″E
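The coordinates above are given in degrees, minutes, and seconds. The illustrative helper below (not part of this page) converts them to the decimal degrees used in the distance sketches earlier.

```python
# Illustrative DMS -> decimal-degree conversion for the airport coordinates above.
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(39, 25, 42, "N"), dms_to_decimal(141, 8, 5, "E"))    # HNA: ~39.4283, 141.1347
print(dms_to_decimal(39, 30, 33, "N"), dms_to_decimal(116, 24, 38, "E"))  # PKX: ~39.5092, 116.4106
```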