How far is Jauja from Beijing?

The distance between Beijing (Beijing Capital International Airport) and Jauja (Francisco Carle Airport) is 10349 miles / 16655 kilometers / 8993 nautical miles.

Beijing Capital International Airport – Francisco Carle Airport

  • 10349 miles
  • 16655 kilometers
  • 8993 nautical miles

Flight time: 20 h 5 min
CO2 emission: 1 357 kg

Distance from Beijing to Jauja

There are several ways to calculate the distance from Beijing to Jauja. Here are two standard methods:

Vincenty's formula (applied above)
  • 10348.921 miles
  • 16654.973 kilometers
  • 8992.966 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
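For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It is an illustration, not the calculator's exact implementation; the decimal coordinates are approximate conversions of the DMS values listed under "Airport information" below.

    from math import atan, atan2, cos, radians, sin, sqrt, tan

    def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        """Vincenty's inverse formula on the WGS-84 ellipsoid; returns distance in km."""
        a = 6378137.0              # semi-major axis (m)
        f = 1 / 298.257223563      # flattening
        b = (1 - f) * a            # semi-minor axis (m)

        phi1, phi2 = radians(lat1), radians(lat2)
        L = radians(lon2 - lon1)
        U1, U2 = atan((1 - f) * tan(phi1)), atan((1 - f) * tan(phi2))
        sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = sin(lam), cos(lam)
            sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
            if sin_sigma == 0:
                return 0.0         # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < tol:
                break              # converged

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
            B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
        return b * A * (sigma - delta_sigma) / 1000.0

    # Approximate decimal coordinates for PEK and JAU (converted from the DMS
    # values in the "Airport information" section below).
    print(round(vincenty_km(40.0800, 116.5847, -11.7831, -75.4733), 1))
    # Should come out near the ~16 655 km quoted above.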

Haversine formula
  • 10347.837 miles
  • 16653.230 kilometers
  • 8992.025 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
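A minimal Python sketch of the haversine formula, using the conventional mean Earth radius of 6 371 km (an assumption; the radius used by the calculator is not stated):

    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
        """Great-circle distance on a sphere of radius r (km), via the haversine formula."""
        phi1, lam1, phi2, lam2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((phi2 - phi1) / 2) ** 2 + cos(phi1) * cos(phi2) * sin((lam2 - lam1) / 2) ** 2
        return 2 * r * asin(sqrt(a))

    # Approximate decimal coordinates converted from the DMS values listed
    # under "Airport information" below.
    pek = (40.0800, 116.5847)    # Beijing Capital (PEK)
    jau = (-11.7831, -75.4733)   # Francisco Carle (JAU)

    d_km = haversine_km(*pek, *jau)
    print(f"{d_km:.1f} km / {d_km * 0.621371:.1f} mi / {d_km * 0.539957:.1f} nm")
    # Should land close to the ~16 653 km / 10 348 mi / 8 992 nm figures quoted above.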

How long does it take to fly from Beijing to Jauja?

The estimated flight time from Beijing Capital International Airport to Francisco Carle Airport is 20 hours and 5 minutes.
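The calculator does not publish its timing formula. A common rule of thumb is cruise distance divided by a typical airliner speed plus a fixed allowance for taxi, climb, and descent; the sketch below uses assumed values (840 km/h and 15 minutes) purely for illustration.

    def flight_time_hours(distance_km, cruise_kmh=840, overhead_h=0.25):
        """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent
        allowance. Both parameters are assumptions, not the site's published formula."""
        return distance_km / cruise_kmh + overhead_h

    hours = flight_time_hours(16655)
    print(f"{int(hours)} h {round((hours % 1) * 60)} min")
    # With these assumed values this prints 20 h 5 min, matching the figure quoted
    # above, though the calculator's actual assumptions may differ.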

Flight carbon footprint between Beijing Capital International Airport (PEK) and Francisco Carle Airport (JAU)

On average, flying from Beijing to Jauja generates about 1 357 kg (2 991 lbs) of CO2 per passenger. The figures are estimates and include only the CO2 generated by burning jet fuel.
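The pound figure is a straightforward unit conversion; dividing by the route length also gives a rough per-kilometre intensity, derived here from the quoted numbers rather than published by the calculator.

    co2_kg = 1357                      # per-passenger estimate quoted above
    co2_lbs = co2_kg * 2.20462         # kilograms to pounds
    per_km = co2_kg / 16655 * 1000     # grams of CO2 per passenger-kilometre (derived)
    print(f"{co2_lbs:.1f} lbs, about {per_km:.0f} g CO2 per passenger-km")
    # ~2 991.7 lbs (the page rounds to 2 991) and roughly 81 g/km.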

Map of flight path from Beijing to Jauja

See the map of the shortest flight path between Beijing Capital International Airport (PEK) and Francisco Carle Airport (JAU).

Airport information

Origin: Beijing Capital International Airport
City: Beijing
Country: China
IATA Code: PEK
ICAO Code: ZBAA
Coordinates: 40°4′48″N, 116°35′5″E
Destination: Francisco Carle Airport
City: Jauja
Country: Peru
IATA Code: JAU
ICAO Code: SPJJ
Coordinates: 11°46′59″S, 75°28′24″W
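
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on the page work in decimal degrees. A small illustrative converter (the string format is assumed from the values shown here):

    import re

    def dms_to_decimal(dms: str) -> float:
        """Convert a coordinate such as 40°4′48″N or 75°28′24″W to decimal degrees."""
        deg, minutes, seconds, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
        value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
        return -value if hemi in "SW" else value

    print(dms_to_decimal("40°4′48″N"), dms_to_decimal("116°35′5″E"))    # ≈ 40.0800, 116.5847
    print(dms_to_decimal("11°46′59″S"), dms_to_decimal("75°28′24″W"))   # ≈ -11.7831, -75.4733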