
How far is La Crosse, WI, from San Julian?

The distance between San Julian (Capitán José Daniel Vazquez Airport) and La Crosse (La Crosse Regional Airport) is 6570 miles / 10574 kilometers / 5709 nautical miles.

Capitán José Daniel Vazquez Airport – La Crosse Regional Airport

  • 6570 miles
  • 10574 kilometers
  • 5709 nautical miles
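The three figures are the same distance expressed in different units. The conversions follow from the exact definitions of the international mile (1.609344 km) and the international nautical mile (1.852 km), as a minimal sketch shows:

```python
# Exact unit definitions: 1 mile = 1.609344 km, 1 nautical mile = 1.852 km
MI_TO_KM = 1.609344
KM_TO_NMI = 1 / 1.852

miles = 6570.345          # straight-line distance in statute miles
km = miles * MI_TO_KM     # ≈ 10573.9 kilometers
nmi = km * KM_TO_NMI      # ≈ 5709.5 nautical miles
```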


Distance from San Julian to La Crosse

There are several ways to calculate the distance from San Julian to La Crosse. Here are two standard methods:

Vincenty's formula (applied above)
  • 6570.345 miles
  • 10573.946 kilometers
  • 5709.474 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
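Vincenty's inverse method iterates on the ellipsoid until the longitude difference on the auxiliary sphere converges. The sketch below is the standard published iteration on the WGS-84 ellipsoid; the decimal coordinates are converted from the DMS values in the airport information section:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in kilometers."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha != 0 else 0.0)   # equatorial geodesic
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# San Julian (ULA) to La Crosse (LSE): ≈ 10573.9 km
d = vincenty_km(-49.306667, -67.8025, 43.878889, -91.256667)
```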

Haversine formula
  • 6592.686 miles
  • 10609.899 kilometers
  • 5728.887 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
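The haversine formula needs only a few lines once the coordinates are in radians. A minimal sketch, using a mean earth radius of 6371 km and the decimal coordinates from the airport information section:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance assuming a spherical earth (mean radius ~6371 km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# San Julian (ULA) to La Crosse (LSE): ≈ 10610 km
d = haversine_km(-49.306667, -67.8025, 43.878889, -91.256667)
```

The spherical result is about 36 km (0.3%) longer than the ellipsoidal Vincenty figure, which is typical for routes of this length.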

How long does it take to fly from San Julian to La Crosse?

The estimated flight time from Capitán José Daniel Vazquez Airport to La Crosse Regional Airport is 12 hours and 56 minutes.
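The page does not state how the flight time is derived. Purely as an illustration, a common rule of thumb divides the distance by an assumed average airliner speed and adds a fixed allowance for taxi, climb and descent; the 500 mph and 30-minute values below are assumptions and give a slightly different figure than the 12 hours and 56 minutes above:

```python
# Hypothetical block-time estimate; the speed and overhead are assumptions,
# not the calculator's published method.
CRUISE_MPH = 500        # assumed average airliner speed
OVERHEAD_H = 0.5        # assumed allowance for taxi, climb and descent

def flight_time(distance_miles):
    hours = distance_miles / CRUISE_MPH + OVERHEAD_H
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} h {m:02d} min"

estimate = flight_time(6570.345)   # ≈ 13 h 38 min under these assumptions
```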

Flight carbon footprint between Capitán José Daniel Vazquez Airport (ULA) and La Crosse Regional Airport (LSE)

On average, flying from San Julian to La Crosse generates about 795 kg of CO2 per passenger, which is equivalent to 1,752 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
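The pound figure is a straight unit conversion, and dividing the CO2 estimate by the distance gives the emission factor the numbers imply. A sketch (the per-mile rate is inferred from the figures above, not an official constant):

```python
KG_TO_LB = 2.20462               # pounds per kilogram

co2_kg = 795                     # per-passenger estimate from the text above
co2_lb = co2_kg * KG_TO_LB       # ≈ 1752.7, i.e. about 1,752 lbs

# Implied emission factor (inferred, not a published constant):
kg_per_mile = co2_kg / 6570.345  # ≈ 0.121 kg CO2 per passenger-mile
```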

Map of flight path from San Julian to La Crosse

See the map of the shortest flight path between Capitán José Daniel Vazquez Airport (ULA) and La Crosse Regional Airport (LSE).

Airport information

Origin: Capitán José Daniel Vazquez Airport
City: San Julian
Country: Argentina
IATA Code: ULA
ICAO Code: SAWJ
Coordinates: 49°18′24″S, 67°48′9″W
Destination: La Crosse Regional Airport
City: La Crosse, WI
Country: United States
IATA Code: LSE
ICAO Code: KLSE
Coordinates: 43°52′44″N, 91°15′24″W
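The coordinates are given in degrees, minutes and seconds; the distance formulas above need signed decimal degrees (negative for south and west). A small helper shows the conversion:

```python
def dms_to_decimal(deg, minutes, seconds, hemi):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemi in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

ula_lat = dms_to_decimal(49, 18, 24, "S")   # ≈ -49.3067
ula_lon = dms_to_decimal(67, 48, 9, "W")    # ≈ -67.8025
lse_lat = dms_to_decimal(43, 52, 44, "N")   # ≈ 43.8789
lse_lon = dms_to_decimal(91, 15, 24, "W")   # ≈ -91.2567
```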