
How far is Burlington, IA, from Singapore?

The distance between Singapore (Singapore Changi Airport) and Burlington (Southeast Iowa Regional Airport) is 9380 miles / 15095 kilometers / 8151 nautical miles.

Singapore Changi Airport – Southeast Iowa Regional Airport

Distance: 9380 miles (15095 kilometers / 8151 nautical miles)
Flight time: 18 h 15 min
CO2 emission: 1 205 kg


Distance from Singapore to Burlington

There are several ways to calculate the distance from Singapore to Burlington. Here are two standard methods:

Vincenty's formula (applied above)
  • 9379.820 miles
  • 15095.357 kilometers
  • 8150.841 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
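As a sketch of how such an ellipsoidal distance can be computed, here is a minimal pure-Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid (the function name is our own, and the coordinates are the airport positions listed further down the page):

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a = 6378137.0          # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563  # WGS-84 flattening
    b = (1 - f) * a        # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # equatorial line
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (
        cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
            * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma) / 1000  # meters -> kilometers

# SIN (1°21′0″N, 103°59′38″E) to BRL (40°46′59″N, 91°7′31″W)
d = vincenty_km(1.35, 103.993889, 40.783056, -91.125278)  # ≈ 15095 km
```

The iteration converges quickly except for nearly antipodal points; this pair is well inside the convergent range.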

Haversine formula
  • 9374.213 miles
  • 15086.333 kilometers
  • 8145.968 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the sphere's surface.
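The haversine computation is compact enough to sketch in a few lines of Python (a minimal version using a mean Earth radius of 6371 km; the exact radius chosen shifts the result slightly):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)   # difference in latitude
    dlam = math.radians(lon2 - lon1)   # difference in longitude
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# SIN (1°21′0″N, 103°59′38″E) to BRL (40°46′59″N, 91°7′31″W)
d = haversine_km(1.35, 103.993889, 40.783056, -91.125278)  # ≈ 15086 km
```

Using `atan2` rather than `asin` keeps the formula numerically stable for nearly antipodal points.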

How long does it take to fly from Singapore to Burlington?

The estimated flight time from Singapore Changi Airport to Southeast Iowa Regional Airport is 18 hours and 15 minutes.

Flight carbon footprint between Singapore Changi Airport (SIN) and Southeast Iowa Regional Airport (BRL)

On average, flying from Singapore to Burlington generates about 1 205 kg of CO2 per passenger; 1 205 kilograms equals roughly 2 657 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
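The unit conversion quoted above is straightforward to verify:

```python
KG_TO_LB = 2.2046226218  # pounds per kilogram

co2_kg = 1205
co2_lb = co2_kg * KG_TO_LB  # ≈ 2656.6 lb, which rounds to the 2 657 lbs quoted
```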

Map of flight path from Singapore to Burlington

See the map of the shortest flight path between Singapore Changi Airport (SIN) and Southeast Iowa Regional Airport (BRL).

Airport information

Origin: Singapore Changi Airport
City: Singapore
Country: Singapore
IATA Code: SIN
ICAO Code: WSSS
Coordinates: 1°21′0″N, 103°59′38″E

Destination: Southeast Iowa Regional Airport
City: Burlington, IA
Country: United States
IATA Code: BRL
ICAO Code: KBRL
Coordinates: 40°46′59″N, 91°7′31″W
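The coordinates above are given in degrees, minutes, and seconds, while the distance formulas expect signed decimal degrees. A small helper handles the conversion (the function name is our own):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees; S and W are negative."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

sin_lat = dms_to_decimal(1, 21, 0, "N")    # 1.35
sin_lon = dms_to_decimal(103, 59, 38, "E") # ≈ 103.9939
brl_lat = dms_to_decimal(40, 46, 59, "N")  # ≈ 40.7831
brl_lon = dms_to_decimal(91, 7, 31, "W")   # ≈ -91.1253
```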