
How far is Beijing from Akita?

The distance between Akita (Akita Airport) and Beijing (Beijing Daxing International Airport) is 1268 miles / 2040 kilometers / 1102 nautical miles.

The driving distance from Akita (AXT) to Beijing (PKX) is 2074 miles / 3337 kilometers, and travel time by car is about 42 hours 43 minutes.

Akita Airport – Beijing Daxing International Airport

1268 miles / 2040 kilometers / 1102 nautical miles


Distance from Akita to Beijing

There are several ways to calculate the distance from Akita to Beijing. Here are two standard methods:

Vincenty's formula (applied above)
  • 1267.609 miles
  • 2040.018 kilometers
  • 1101.522 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
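For illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid, applied to the AXT and PKX coordinates listed in the airport information below. The function name, tolerance, and iteration cap are our own choices, not the calculator's published code.

```python
import math

# WGS-84 ellipsoid parameters
A_AXIS = 6378137.0            # semi-major axis, metres
F = 1 / 298.257223563         # flattening
B_AXIS = (1 - F) * A_AXIS     # semi-minor axis, metres

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse: ellipsoidal distance in metres between two points."""
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0                      # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        # cos2_alpha == 0 means both points lie on the equator
        cos_2sm = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = F / 16.0 * cos2_alpha * (4.0 + F * (4.0 - 3.0 * cos2_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4.0 * (
        cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)
        - B / 6.0 * cos_2sm * (-3.0 + 4.0 * sin_sigma ** 2)
        * (-3.0 + 4.0 * cos_2sm ** 2)))
    return B_AXIS * A * (sigma - d_sigma)

# Coordinates from the airport information section below
d = vincenty_distance(39.6156, 140.2189, 39.5092, 116.4106)  # AXT -> PKX
print(f"{d / 1000:.3f} km")  # ≈ 2040 km, matching the figure above
```

Vincenty's iteration can fail to converge for nearly antipodal points, which is one reason a spherical formula such as the haversine is often kept alongside it.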

Haversine formula
  • 1264.472 miles
  • 2034.970 kilometers
  • 1098.796 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
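A corresponding sketch of the haversine computation, assuming the commonly used 6371 km mean Earth radius (the radius the site uses is not stated):

```python
import math

EARTH_RADIUS_KM = 6371.0  # assumed mean Earth radius

def haversine_distance(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)
    a = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

d = haversine_distance(39.6156, 140.2189, 39.5092, 116.4106)  # AXT -> PKX
print(f"{d:.3f} km")  # ≈ 2035 km, close to the haversine figure above
```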

How long does it take to fly from Akita to Beijing?

The estimated flight time from Akita Airport to Beijing Daxing International Airport is 2 hours and 54 minutes.
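The page does not explain how this estimate is derived. A common rule of thumb for such calculators is cruise time at a fixed average speed plus a flat allowance for taxi, climb, and descent; the sketch below uses assumed values of 500 mph and 30 minutes, which comes within a few minutes of the figure quoted above.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Cruise time at an assumed average speed plus a fixed ground/climb allowance."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

print(estimate_flight_time(1268))  # "3 hours 2 minutes" -- within a few
                                   # minutes of the 2 h 54 min quoted above
```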

Flight carbon footprint between Akita Airport (AXT) and Beijing Daxing International Airport (PKX)

On average, flying from Akita to Beijing generates about 165 kg (363 lbs) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.
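As a rough check on the arithmetic, the sketch below back-derives a flat per-passenger-mile emission factor from the figures above and converts the result to pounds; the factor is an assumption for illustration, not the calculator's published methodology.

```python
LBS_PER_KG = 2.20462

# Emission factor back-derived from the figures above:
# 165 kg / 1268 miles ≈ 0.13 kg CO2 per passenger-mile.
KG_CO2_PER_PASSENGER_MILE = 0.13

def co2_per_passenger_kg(distance_miles):
    return distance_miles * KG_CO2_PER_PASSENGER_MILE

kg = co2_per_passenger_kg(1268)
print(f"{kg:.0f} kg ≈ {kg * LBS_PER_KG:.0f} lbs")  # ≈ 165 kg ≈ 363 lbs
```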

Map of flight path and driving directions from Akita to Beijing

See the map of the shortest flight path between Akita Airport (AXT) and Beijing Daxing International Airport (PKX).

Airport information

Origin: Akita Airport
City: Akita
Country: Japan
IATA Code: AXT
ICAO Code: RJSK
Coordinates: 39°36′56″N, 140°13′8″E
Destination: Beijing Daxing International Airport
City: Beijing
Country: China
IATA Code: PKX
ICAO Code: ZBAD
Coordinates: 39°30′33″N, 116°24′38″E