How far is Beijing from Amami?

The distance between Amami (Amami Airport) and Beijing (Beijing Daxing International Airport) is 1078 miles / 1734 kilometers / 937 nautical miles.

The driving distance from Amami (ASJ) to Beijing (PKX) is 1635 miles / 2631 kilometers, and travel time by car is about 106 hours 21 minutes.

Amami Airport – Beijing Daxing International Airport
  • 1078 miles
  • 1734 kilometers
  • 937 nautical miles

Distance from Amami to Beijing

There are several ways to calculate the distance from Amami to Beijing. Here are two standard methods:

Vincenty's formula (applied above)
  • 1077.717 miles
  • 1734.417 kilometers
  • 936.510 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
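
For reference, here is a minimal ellipsoidal-distance sketch in Python using the third-party geopy library; its geodesic routine works on the WGS-84 ellipsoid, like Vincenty's formula, though it is not necessarily the exact implementation used above. The decimal coordinates are converted from the airport listings at the end of this page.

    from geopy.distance import geodesic

    # Airport positions in decimal degrees (converted from the DMS
    # coordinates in the airport information section below).
    amami = (28.4306, 129.7128)    # ASJ: 28°25′50″N, 129°42′46″E
    daxing = (39.5092, 116.4106)   # PKX: 39°30′33″N, 116°24′38″E

    d = geodesic(amami, daxing)    # WGS-84 ellipsoid by default
    print(f"{d.miles:.1f} mi / {d.kilometers:.1f} km / {d.nautical:.1f} nm")
    # roughly 1078 mi / 1734 km / 937 nm, in line with the figures above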

Haversine formula
  • 1077.887 miles
  • 1734.691 kilometers
  • 936.658 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
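
The haversine formula is simple enough to implement directly. A minimal Python sketch, assuming a mean Earth radius of 3,958.8 miles (the exact radius used above may differ slightly):

    import math

    def haversine_miles(lat1, lon1, lat2, lon2, radius=3958.8):
        """Great-circle distance in miles, assuming a spherical Earth."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius * math.asin(math.sqrt(a))

    print(haversine_miles(28.4306, 129.7128, 39.5092, 116.4106))
    # roughly 1078 miles, matching the figure above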

How long does it take to fly from Amami to Beijing?

The estimated flight time from Amami Airport to Beijing Daxing International Airport is 2 hours and 32 minutes.
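
Estimates like this are typically derived from the great-circle distance. A rough sketch, assuming an average cruise speed of about 500 mph plus a fixed half hour for taxi, takeoff, climb, and descent (both parameters are assumptions, so the result will not match the figure above exactly):

    def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
        # cruise_mph and overhead_min are assumed rule-of-thumb values
        return overhead_min + distance_miles / cruise_mph * 60

    print(estimated_flight_minutes(1078))  # about 159 minutes, i.e. ~2 h 39 min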

Flight carbon footprint between Amami Airport (ASJ) and Beijing Daxing International Airport (PKX)

On average, flying from Amami to Beijing generates about 156 kg of CO2 per passenger (roughly 343 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
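
A per-passenger figure like this is usually just distance multiplied by an emission factor. A sketch using a factor of 0.145 kg of CO2 per passenger-mile (an assumed value chosen to roughly match the figure above, not necessarily the site's exact factor):

    KG_TO_LB = 2.20462

    def co2_per_passenger_kg(distance_miles, kg_per_mile=0.145):
        # kg_per_mile is an assumed jet-fuel emission factor
        return distance_miles * kg_per_mile

    kg = co2_per_passenger_kg(1078)
    print(f"{kg:.0f} kg ({kg * KG_TO_LB:.0f} lb)")  # about 156 kg (345 lb)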

Map of flight path and driving directions from Amami to Beijing

[Map: shortest flight path and driving route between Amami Airport (ASJ) and Beijing Daxing International Airport (PKX).]

Airport information

Origin: Amami Airport
City: Amami
Country: Japan
IATA Code: ASJ
ICAO Code: RJKA
Coordinates: 28°25′50″N, 129°42′46″E

Destination: Beijing Daxing International Airport
City: Beijing
Country: China
IATA Code: PKX
ICAO Code: ZBAD
Coordinates: 39°30′33″N, 116°24′38″E
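
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small helper for the conversion (the function name is illustrative):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    print(dms_to_decimal(28, 25, 50, "N"))   # ASJ latitude:  about 28.4306
    print(dms_to_decimal(129, 42, 46, "E"))  # ASJ longitude: about 129.7128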