
How far is Jauja from Três Lagoas?

The distance between Três Lagoas (Três Lagoas Airport) and Jauja (Francisco Carle Airport) is 1693 miles / 2725 kilometers / 1472 nautical miles.

The driving distance from Três Lagoas (TJL) to Jauja (JAU) is 2342 miles / 3769 kilometers, and travel time by car is about 56 hours 45 minutes.


Distance from Três Lagoas to Jauja

There are several ways to calculate the distance from Três Lagoas to Jauja. Here are two standard methods:

Vincenty's formula (applied above)
  • 1693.460 miles
  • 2725.360 kilometers
  • 1471.577 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
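For reference, here is a minimal sketch of Vincenty's inverse method in Python on the WGS-84 ellipsoid. The function name, convergence tolerance, and iteration cap are illustrative choices, not this page's actual implementation, and the iteration can fail to converge for near-antipodal points (not an issue for this route):

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Ellipsoidal (WGS-84) distance between two points, in statute miles."""
    a = 6378137.0                # semi-major axis, meters
    f = 1 / 298.257223563        # flattening
    b = (1 - f) * a              # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0           # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos2_alpha == 0 only for equatorial lines; avoid dividing by zero
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344  # meters -> miles

# TJL (20°45′15″S, 51°41′3″W) to JAU (11°46′59″S, 75°28′24″W)
print(vincenty_miles(-20.754167, -51.684167, -11.783056, -75.473333))  # ≈ 1693 mi
```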

Haversine formula
  • 1692.510 miles
  • 2723.831 kilometers
  • 1470.751 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
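The haversine version is much shorter. The sketch below assumes a mean earth radius of 6371 km, a common convention; the page does not state which radius it uses, which is why its figure may differ slightly:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

print(haversine_km(-20.754167, -51.684167, -11.783056, -75.473333))  # ≈ 2724 km
```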

How long does it take to fly from Três Lagoas to Jauja?

The estimated flight time from Três Lagoas Airport to Francisco Carle Airport is 3 hours and 42 minutes.
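Flight-time estimates like this typically divide the great-circle distance by an assumed average block speed. The page does not publish its method; the 460 mph figure below is an assumption chosen to land near the 3 hours 42 minutes quoted above:

```python
distance_miles = 1693.46       # Vincenty distance from above
block_speed_mph = 460          # assumed average speed incl. climb and descent
hours = distance_miles / block_speed_mph
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # ≈ 3 h 41 min
```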

Flight carbon footprint between Três Lagoas Airport (TJL) and Francisco Carle Airport (JAU)

On average, flying from Três Lagoas to Jauja generates about 192 kg of CO2 per passenger (roughly 424 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
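The 192 kg figure implies a per-passenger factor of roughly 0.11 kg of CO2 per mile flown. The sketch below applies such a per-mile factor; the factor is back-derived from the numbers above, not a published methodology:

```python
distance_miles = 1693.46
kg_co2_per_passenger_mile = 0.1134   # assumed factor implied by the 192 kg figure
co2_kg = distance_miles * kg_co2_per_passenger_mile
print(f"{co2_kg:.0f} kg CO2 ({co2_kg * 2.20462:.0f} lbs)")  # ≈ 192 kg
```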

Map of flight path and driving directions from Três Lagoas to Jauja

See the map of the shortest flight path between Três Lagoas Airport (TJL) and Francisco Carle Airport (JAU).

Airport information

Origin: Três Lagoas Airport
City: Três Lagoas
Country: Brazil
IATA Code: TJL
ICAO Code: SBTG
Coordinates: 20°45′15″S, 51°41′3″W
Destination: Francisco Carle Airport
City: Jauja
Country: Peru
IATA Code: JAU
ICAO Code: SPJJ
Coordinates: 11°46′59″S, 75°28′24″W