
How far is Burlington, IA, from Montevideo?

The distance between Montevideo (Carrasco International Airport) and Burlington (Southeast Iowa Regional Airport) is 5662 miles / 9112 kilometers / 4920 nautical miles.

Carrasco International Airport – Southeast Iowa Regional Airport: 5662 miles / 9112 kilometers / 4920 nautical miles


Distance from Montevideo to Burlington

There are several ways to calculate the distance from Montevideo to Burlington. Here are two standard methods:

Vincenty's formula (applied above)
  • 5661.993 miles
  • 9112.094 kilometers
  • 4920.137 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
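As a rough check on the ellipsoidal figure, the sketch below uses pyproj's Geod class on the WGS-84 ellipsoid; its geodesic solver (Karney's algorithm) agrees with Vincenty's formula to well under a metre on a route like this. The decimal coordinates are converted from the DMS values listed in the airport information section further down.

```python
from pyproj import Geod

# Airport coordinates converted from the DMS values listed below
# MVD: 34°50′18″S, 56°1′50″W   BRL: 40°46′59″N, 91°7′31″W
MVD = (-34.8383, -56.0306)   # (lat, lon) in decimal degrees
BRL = (40.7831, -91.1253)

geod = Geod(ellps="WGS84")   # WGS-84 ellipsoid

# Geod.inv takes lon/lat order and returns forward azimuth,
# back azimuth, and distance in metres
_, _, meters = geod.inv(MVD[1], MVD[0], BRL[1], BRL[0])

print(f"{meters / 1000:.1f} km")         # ~9112 km
print(f"{meters / 1609.344:.1f} miles")  # ~5662 miles
print(f"{meters / 1852:.1f} NM")         # ~4920 nautical miles
```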

Haversine formula
  • 5680.969 miles
  • 9142.634 kilometers
  • 4936.627 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical Earth (great-circle distance, the shortest path between two points along the surface).
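For comparison, here is a minimal pure-Python haversine sketch using the same coordinates and a mean Earth radius of 6371 km; it lands within a couple of kilometres of the figure above, the small difference coming down to which Earth radius is used.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere, in kilometres."""
    phi1, lam1, phi2, lam2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((phi2 - phi1) / 2) ** 2 + cos(phi1) * cos(phi2) * sin((lam2 - lam1) / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# MVD (34°50′18″S, 56°1′50″W) and BRL (40°46′59″N, 91°7′31″W) in decimal degrees
km = haversine_km(-34.8383, -56.0306, 40.7831, -91.1253)
print(f"{km:.1f} km")             # ≈ 9141 km with R = 6371 km
print(f"{km / 1.609344:.1f} mi")  # ≈ 5680 miles
```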

How long does it take to fly from Montevideo to Burlington?

The estimated flight time from Carrasco International Airport to Southeast Iowa Regional Airport is 11 hours and 13 minutes.
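The site does not publish its exact formula, but estimates like this usually come from dividing the distance by a typical airliner block speed. A back-of-envelope sketch, assuming a hypothetical average speed of about 505 mph chosen so the result lines up with the estimate above:

```python
distance_miles = 5662      # Vincenty distance from the table above
avg_speed_mph = 505        # hypothetical average block speed (assumption)

hours = distance_miles / avg_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} h {m:02d} min")  # ≈ 11 h 13 min
```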

Flight carbon footprint between Carrasco International Airport (MVD) and Southeast Iowa Regional Airport (BRL)

On average, flying from Montevideo to Burlington generates about 671 kg of CO2 per passenger, which is equivalent to 1,480 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
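The kilogram-to-pound conversion, and the per-mile emission factor the 671 kg figure implies, can be checked in a couple of lines (the ~0.12 kg CO2 per passenger-mile factor is an implied value, not one published here):

```python
co2_kg = 671
distance_miles = 5662

print(f"{co2_kg * 2.20462:.0f} lbs")                 # ≈ 1479 lbs (rounds to about 1,480)
print(f"{co2_kg / distance_miles:.3f} kg CO2/mile")  # ≈ 0.119 kg per passenger-mile (implied)
```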

Map of flight path from Montevideo to Burlington

See the map of the shortest flight path between Carrasco International Airport (MVD) and Southeast Iowa Regional Airport (BRL).

Airport information

Origin: Carrasco International Airport
City: Montevideo
Country: Uruguay
IATA Code: MVD
ICAO Code: SUMU
Coordinates: 34°50′18″S, 56°1′50″W

Destination: Southeast Iowa Regional Airport
City: Burlington, IA
Country: United States
IATA Code: BRL
ICAO Code: KBRL
Coordinates: 40°46′59″N, 91°7′31″W
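The coordinates above are given in degrees, minutes, and seconds, while the distance examples earlier use decimal degrees. A small helper for the conversion (the function name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere (N/S/E/W) to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# MVD: 34°50′18″S, 56°1′50″W  ->  (-34.8383, -56.0306)
print(dms_to_decimal(34, 50, 18, "S"), dms_to_decimal(56, 1, 50, "W"))
# BRL: 40°46′59″N, 91°7′31″W  ->  (40.7831, -91.1253)
print(dms_to_decimal(40, 46, 59, "N"), dms_to_decimal(91, 7, 31, "W"))
```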