How far is Burlington, IA, from Trincomalee?

The distance between Trincomalee (China Bay Airport) and Burlington (Southeast Iowa Regional Airport) is 9002 miles / 14487 kilometers / 7822 nautical miles.

China Bay Airport – Southeast Iowa Regional Airport

  • Distance: 9002 miles / 14487 kilometers / 7822 nautical miles
  • Flight time: 17 h 32 min
  • Time difference: 11 h 30 min
  • CO2 emission: 1,148 kg

Distance from Trincomalee to Burlington

There are several ways to calculate the distance from Trincomalee to Burlington. Here are two standard methods:

Vincenty's formula (applied above)
  • 9001.801 miles
  • 14486.995 kilometers
  • 7822.351 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
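As a sketch, the standard Vincenty inverse iteration can reproduce the figure above. The WGS-84 ellipsoid parameters are an assumption (the page does not say which ellipsoid it uses), and the decimal coordinates are converted from the DMS values listed in the airport information below.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula: geodesic distance on the WGS-84 ellipsoid."""
    a = 6378137.0              # semi-major axis (m), WGS-84
    f = 1 / 298.257223563      # flattening, WGS-84
    b = (1 - f) * a            # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the auxiliary longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # 0 on the equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0  # metres -> km

# TRR (8°32′18″N, 81°10′54″E) to BRL (40°46′59″N, 91°7′31″W), decimal degrees
d_km = vincenty_km(8.538333, 81.181667, 40.783056, -91.125278)
print(f"{d_km:.1f} km")  # ≈ 14487 km, matching the figure above
```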

Haversine formula
  • 8993.966 miles
  • 14474.386 kilometers
  • 7815.543 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points on the sphere).
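A minimal haversine sketch, using the airport coordinates listed below converted from degrees/minutes/seconds and assuming the commonly used mean Earth radius of 6371 km (the page does not state its radius, but this value reproduces its figure):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere (haversine)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

def dms(deg, minutes, seconds, sign=1):
    """Convert degrees/minutes/seconds to decimal degrees."""
    return sign * (deg + minutes / 60 + seconds / 3600)

# China Bay Airport (TRR): 8°32′18″N, 81°10′54″E
trr = (dms(8, 32, 18), dms(81, 10, 54))
# Southeast Iowa Regional Airport (BRL): 40°46′59″N, 91°7′31″W (W -> negative)
brl = (dms(40, 46, 59), dms(91, 7, 31, sign=-1))

d_km = haversine_km(trr[0], trr[1], brl[0], brl[1])
print(f"{d_km:.1f} km, {d_km / 1.609344:.1f} miles")  # ≈ 14474 km / 8994 miles
```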

How long does it take to fly from Trincomalee to Burlington?

The estimated flight time from China Bay Airport to Southeast Iowa Regional Airport is 17 hours and 32 minutes.
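The page does not state the speed model behind this estimate. As a back-of-envelope check, the quoted duration implies an average speed of roughly 513 mph over the 9002-mile route:

```python
# Implied average speed from the page's own figures (not its actual model).
distance_miles = 9002
duration_hours = 17 + 32 / 60          # 17 h 32 min
avg_speed_mph = distance_miles / duration_hours
print(f"{avg_speed_mph:.0f} mph")      # ≈ 513 mph
```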

Flight carbon footprint between China Bay Airport (TRR) and Southeast Iowa Regional Airport (BRL)

On average, flying from Trincomalee to Burlington generates about 1,148 kg of CO2 per passenger; 1,148 kilograms equals about 2,530 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
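The kilograms-to-pounds figure follows from the standard conversion factor of about 2.20462 lb per kg:

```python
co2_kg = 1148
co2_lbs = co2_kg * 2.20462   # standard kg -> lb conversion factor
print(round(co2_lbs))        # 2531 (the page rounds to 2,530)
```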

Map of flight path from Trincomalee to Burlington

See the map of the shortest flight path between China Bay Airport (TRR) and Southeast Iowa Regional Airport (BRL).

Airport information

Origin China Bay Airport
City: Trincomalee
Country: Sri Lanka
IATA Code: TRR
ICAO Code: VCCT
Coordinates: 8°32′18″N, 81°10′54″E
Destination Southeast Iowa Regional Airport
City: Burlington, IA
Country: United States
IATA Code: BRL
ICAO Code: KBRL
Coordinates: 40°46′59″N, 91°7′31″W