How far is Beijing from Yamagata?

The distance between Yamagata (Yamagata Airport) and Beijing (Beijing Daxing International Airport) is 1289 miles / 2074 kilometers / 1120 nautical miles.

The driving distance from Yamagata (GAJ) to Beijing (PKX) is 2024 miles / 3257 kilometers, and travel time by car is about 41 hours 32 minutes.

Yamagata Airport – Beijing Daxing International Airport

1289 miles / 2074 kilometers / 1120 nautical miles

Distance from Yamagata to Beijing

There are several ways to calculate the distance from Yamagata to Beijing. Here are two standard methods:

Vincenty's formula (applied above)
  • 1288.850 miles
  • 2074.204 kilometers
  • 1119.980 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
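The figure above can be reproduced with a standard implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The Python sketch below is a textbook version, not necessarily the site's own code; the decimal coordinates are converted from the DMS values listed under Airport information.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# GAJ (38°24′42″N, 140°22′15″E) to PKX (39°30′33″N, 116°24′38″E)
meters = vincenty_distance(38.411667, 140.370833, 39.509167, 116.410556)
print(f"{meters / 1000:.3f} km")   # ≈ 2074 km, matching the figure above
```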

Haversine formula
  • 1285.723 miles
  • 2069.171 kilometers
  • 1117.263 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
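A corresponding haversine sketch in Python. R = 6371 km is the conventional mean earth radius; a different radius shifts the result slightly, which is why the haversine figures above differ from the Vincenty ones by a few kilometers.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h))

km = haversine_km(38.411667, 140.370833, 39.509167, 116.410556)
print(f"{km:.3f} km / {km / 1.609344:.3f} mi / {km / 1.852:.3f} nmi")  # ≈ 2069 km
```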

How long does it take to fly from Yamagata to Beijing?

The estimated flight time from Yamagata Airport to Beijing Daxing International Airport is 2 hours and 56 minutes.
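The site does not publish its timing model, but estimates like this are typically derived from the flight distance, a typical airliner cruise speed, and a fixed allowance for taxi, takeoff, and landing. The sketch below uses a 500 mph cruise speed and a 30-minute allowance; both are illustrative assumptions, not the calculator's actual parameters, so it lands a few minutes off the published figure.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: cruise time plus a fixed allowance
    for taxi, takeoff, and landing. Both parameters are assumptions."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

# Yields roughly 3 hours 5 minutes for 1289 miles; the published estimate
# of 2 hours 56 minutes implies slightly different parameters.
print(estimate_flight_time(1289))
```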

Flight carbon footprint between Yamagata Airport (GAJ) and Beijing Daxing International Airport (PKX)

On average, flying from Yamagata to Beijing generates about 166 kg of CO2 per passenger, which is equivalent to 366 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
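The kilogram-to-pound conversion, and the per-mile emission rate the published totals imply, can be checked directly. The ~0.13 kg of CO2 per passenger-mile below is simply what the two published figures work out to, not an official emissions factor.

```python
KG_PER_LB = 0.45359237          # exact definition of the pound

co2_kg = 166                    # published per-passenger estimate
miles = 1289                    # published flight distance

print(f"{co2_kg / KG_PER_LB:.0f} lbs")            # ≈ 366 lbs
print(f"{co2_kg / miles:.3f} kg CO2 per mile")    # ≈ 0.129 kg/mi implied
```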

Map of flight path and driving directions from Yamagata to Beijing

See the map of the shortest flight path between Yamagata Airport (GAJ) and Beijing Daxing International Airport (PKX).

Airport information

Origin: Yamagata Airport
City: Yamagata
Country: Japan
IATA Code: GAJ
ICAO Code: RJSC
Coordinates: 38°24′42″N, 140°22′15″E
Destination: Beijing Daxing International Airport
City: Beijing
Country: China
IATA Code: PKX
ICAO Code: ZBAD
Coordinates: 39°30′33″N, 116°24′38″E
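The coordinates above are given in degrees, minutes, and seconds; the decimal degrees used in the distance sketches earlier can be derived with a small helper like this (the function is an illustration, not part of the original page):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# GAJ: 38°24′42″N, 140°22′15″E
print(dms_to_decimal(38, 24, 42, "N"), dms_to_decimal(140, 22, 15, "E"))
# PKX: 39°30′33″N, 116°24′38″E
print(dms_to_decimal(39, 30, 33, "N"), dms_to_decimal(116, 24, 38, "E"))
```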