
How far is Burlington, IA, from La Romana?

The distance between La Romana (La Romana International Airport) and Burlington (Southeast Iowa Regional Airport) is 2026 miles / 3260 kilometers / 1760 nautical miles.


Distance from La Romana to Burlington

There are several ways to calculate the distance from La Romana to Burlington. Here are two standard methods:

Vincenty's formula (applied above)
  • 2025.595 miles
  • 3259.879 kilometers
  • 1760.194 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
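
As an illustration only (this is not the calculator's own code), an ellipsoidal distance of this kind can be reproduced in Python with the geopy library, whose geodesic routine uses Karney's algorithm on the WGS-84 ellipsoid and agrees with Vincenty's result to well under a metre on a route like this. The coordinates come from the airport information at the bottom of the page:

    # Illustrative sketch, not the site's implementation: WGS-84 ellipsoidal distance.
    from geopy.distance import geodesic

    lrm = (18.450556, -68.911667)   # La Romana International Airport (LRM)
    brl = (40.783056, -91.125278)   # Southeast Iowa Regional Airport (BRL)

    d = geodesic(lrm, brl)          # Karney's algorithm on the WGS-84 ellipsoid
    print(f"{d.miles:.3f} mi / {d.kilometers:.3f} km / {d.nautical:.3f} nmi")
    # Should land very close to the Vincenty figures above (~2025.6 mi / 3259.9 km).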

Haversine formula
  • 2027.545 miles
  • 3263.017 kilometers
  • 1761.888 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
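
A minimal haversine sketch in Python, assuming a spherical Earth with mean radius 6,371 km (the small gap between this result and the figures above depends on the radius chosen):

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two lat/lon points on a sphere, in km."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        return 2 * radius_km * math.asin(math.sqrt(a))

    km = haversine_km(18.450556, -68.911667, 40.783056, -91.125278)
    print(f"{km:.3f} km  ({km / 1.609344:.3f} mi, {km / 1.852:.3f} nmi)")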

How long does it take to fly from La Romana to Burlington?

The estimated flight time from La Romana International Airport to Southeast Iowa Regional Airport is 4 hours and 20 minutes.
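
The page does not state the assumptions behind this estimate. A common rule of thumb is an average block speed of about 500 mph plus a fixed allowance for taxi, climb, and descent; the sketch below uses those hypothetical values and therefore differs slightly from the 4 hours and 20 minutes quoted above:

    def estimated_flight_time(distance_miles, cruise_mph=500, overhead_minutes=30):
        """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent allowance.
        The 500 mph and 30 min values are illustrative assumptions, not the site's."""
        minutes = distance_miles / cruise_mph * 60 + overhead_minutes
        hours, mins = divmod(round(minutes), 60)
        return f"{hours} h {mins:02d} min"

    print(estimated_flight_time(2026))   # -> "4 h 33 min" with these assumptions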

Flight carbon footprint between La Romana International Airport (LRM) and Southeast Iowa Regional Airport (BRL)

On average, flying from La Romana to Burlington generates about 220 kg of CO2 per passenger, which is equivalent to 486 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
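
For the unit conversion, 1 lb is defined as 0.45359237 kg; the rounded 220 kg figure converts to about 485 lb, so the 486 lb shown presumably comes from the unrounded per-passenger estimate. A quick check:

    KG_PER_LB = 0.45359237              # exact definition of the avoirdupois pound

    def kg_to_lb(kg):
        return kg / KG_PER_LB

    print(round(kg_to_lb(220)))         # 485, from the rounded 220 kg figure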

Map of flight path from La Romana to Burlington

See the map of the shortest flight path between La Romana International Airport (LRM) and Southeast Iowa Regional Airport (BRL).

Airport information

Origin: La Romana International Airport
City: La Romana
Country: Dominican Republic
IATA Code: LRM
ICAO Code: MDLR
Coordinates: 18°27′2″N, 68°54′42″W

Destination: Southeast Iowa Regional Airport
City: Burlington, IA
Country: United States
IATA Code: BRL
ICAO Code: KBRL
Coordinates: 40°46′59″N, 91°7′31″W
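
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small illustrative helper (southern and western hemispheres negative):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # LRM: 18°27′2″N, 68°54′42″W  ->  approx. (18.4506, -68.9117)
    print(dms_to_decimal(18, 27, 2, "N"), dms_to_decimal(68, 54, 42, "W"))
    # BRL: 40°46′59″N, 91°7′31″W  ->  approx. (40.7831, -91.1253)
    print(dms_to_decimal(40, 46, 59, "N"), dms_to_decimal(91, 7, 31, "W"))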