How far is Jauja from Los Angeles, CA?
The distance between Los Angeles (Los Angeles International Airport) and Jauja (Francisco Carle Airport) is 4225 miles / 6800 kilometers / 3672 nautical miles.
Los Angeles International Airport – Francisco Carle Airport
Distance from Los Angeles to Jauja
There are several ways to calculate the distance from Los Angeles to Jauja. Here are two standard methods:
Vincenty's formula (applied above)
- 4225.377 miles
- 6800.086 kilometers
- 3671.753 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
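As an illustration, here is a minimal Python sketch of the standard Vincenty inverse solution on the WGS-84 ellipsoid, using the LAX and JAU coordinates from the airport tables converted to decimal degrees. This is a textbook formulation, not necessarily the exact code behind the figures above:

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse solution on the WGS-84 ellipsoid; returns km."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(200):                   # iterate until lambda converges
        sin_s = sqrt((cosU2 * sin(lam)) ** 2
                     + (cosU1 * sinU2 - sinU1 * cosU2 * cos(lam)) ** 2)
        cos_s = sinU1 * sinU2 + cosU1 * cosU2 * cos(lam)
        sigma = atan2(sin_s, cos_s)
        sin_a = cosU1 * cosU2 * sin(lam) / sin_s
        cos2_a = 1 - sin_a ** 2
        cos_2sm = cos_s - 2 * sinU1 * sinU2 / cos2_a
        C = f / 16 * cos2_a * (4 + f * (4 - 3 * cos2_a))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_a * (
            sigma + C * sin_s * (cos_2sm + C * cos_s * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_a * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_s * (cos_2sm + B / 4 * (
        cos_s * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_s ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# LAX (33°56′33″N, 118°24′28″W) to JAU (11°46′59″S, 75°28′24″W)
print(round(vincenty_km(33.9425, -118.407778, -11.783056, -75.473333), 1))
```

The result should agree with the 6800.086 km figure above to within a few kilometers, depending on coordinate rounding.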
Haversine formula
- 4233.930 miles
- 6813.850 kilometers
- 3679.185 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
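The haversine calculation takes only a few lines of Python. The coordinates below are the LAX and JAU coordinates from the airport tables, converted to decimal degrees, and 6371 km is the commonly used mean Earth radius:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# LAX (33°56′33″N, 118°24′28″W) to JAU (11°46′59″S, 75°28′24″W)
d = haversine_km(33.9425, -118.407778, -11.783056, -75.473333)
print(f"{d:.1f} km")  # about 6814 km, in line with the figure above
```

The small gap between this and the Vincenty result reflects the spherical versus ellipsoidal Earth models.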
How long does it take to fly from Los Angeles to Jauja?
The estimated flight time from Los Angeles International Airport to Francisco Carle Airport is 8 hours and 30 minutes.
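That estimate is consistent with a common rule of thumb: divide the great-circle distance by a typical jet cruising speed of roughly 500 mph (an assumed figure, not the site's published method):

```python
distance_miles = 4225        # Vincenty distance from above, rounded
cruise_mph = 500             # assumed average jet speed over the whole flight
hours = distance_miles / cruise_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} h {m:02d} min")  # 8 h 27 min, close to the 8 h 30 min estimate
```

Real flight times also vary with winds, routing, and taxi time, so published estimates usually round up.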
What is the time difference between Los Angeles and Jauja?
Jauja (Peru) is on UTC-5 year-round, while Los Angeles observes UTC-8 (PST) in winter and UTC-7 (PDT) in summer. Jauja is therefore 3 hours ahead of Los Angeles during Pacific Standard Time and 2 hours ahead during Pacific Daylight Time.
Flight carbon footprint between Los Angeles International Airport (LAX) and Francisco Carle Airport (JAU)
On average, flying from Los Angeles to Jauja generates about 485 kg of CO2 per passenger (roughly 1,069 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
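The kilogram-to-pound conversion uses the standard factor 1 kg ≈ 2.20462 lb; converting the rounded 485 kg figure gives about 1,069 lb (small discrepancies arise when the site converts an unrounded kilogram value instead):

```python
co2_kg = 485
co2_lb = co2_kg * 2.20462   # standard kg-to-lb conversion factor
print(round(co2_lb))        # 1069
```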
Map of flight path from Los Angeles to Jauja
See the map of the shortest flight path between Los Angeles International Airport (LAX) and Francisco Carle Airport (JAU).
Airport information
| Origin | Los Angeles International Airport |
|---|---|
| City: | Los Angeles, CA |
| Country: | United States |
| IATA Code: | LAX |
| ICAO Code: | KLAX |
| Coordinates: | 33°56′33″N, 118°24′28″W |
| Destination | Francisco Carle Airport |
|---|---|
| City: | Jauja |
| Country: | Peru |
| IATA Code: | JAU |
| ICAO Code: | SPJJ |
| Coordinates: | 11°46′59″S, 75°28′24″W |