
How far is Lopez, WA, from Bangui?

The distance between Bangui (Bangui M'Poko International Airport) and Lopez (Lopez Island Airport) is 8115 miles / 13060 kilometers / 7052 nautical miles.

Bangui M'Poko International Airport – Lopez Island Airport

  • Distance: 8115 miles / 13060 kilometers / 7052 nautical miles
  • Flight time: 15 h 51 min
  • CO2 emission: 1 015 kg


Distance from Bangui to Lopez

There are several ways to calculate the distance from Bangui to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 8115.143 miles
  • 13060.057 kilometers
  • 7051.867 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
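As a rough illustration, the inverse Vincenty method can be sketched in Python. This is a minimal implementation assuming the WGS-84 ellipsoid (the page does not state which ellipsoid it uses); edge cases such as coincident or nearly antipodal points, where the iteration can fail, are ignored:

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance in statute miles (WGS-84 ellipsoid)."""
    a = 6378137.0                 # semi-major axis, metres
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis
    L = math.radians(lon2 - lon1)
    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sigma_m ** 2)))
    metres = b * A * (sigma - delta_sigma)
    return metres / 1609.344      # metres -> statute miles

# Airport coordinates converted to decimal degrees (W longitude negative)
d = vincenty_miles(4.398333, 18.518611, 48.483889, -122.937778)
print(f"{d:.3f} miles")
```

With the BGF and LPS coordinates listed under "Airport information" below, this reproduces the ~8115-mile figure quoted above.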

Haversine formula
  • 8108.329 miles
  • 13049.091 kilometers
  • 7045.945 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the surface of the sphere.
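The haversine calculation is short enough to show in full. This sketch assumes a mean Earth radius of 6 371 km (the page's exact radius isn't stated, so the last digit or two of the result may differ slightly):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return radius_km * 2 * math.asin(math.sqrt(h))

km = haversine_km(4.398333, 18.518611, 48.483889, -122.937778)
print(f"{km:.1f} km / {km / 1.609344:.1f} miles")
```

The result agrees with the ~13049 km haversine figure above to within a few kilometres; the small gap versus Vincenty reflects the spherical versus ellipsoidal earth models.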

How long does it take to fly from Bangui to Lopez?

The estimated flight time from Bangui M'Poko International Airport to Lopez Island Airport is 15 hours and 51 minutes.
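The page does not state how it derives the flight time, so the following is only a hedged sketch: dividing distance by an assumed average block speed. An assumed speed of about 512 mph happens to reproduce the quoted 15 h 51 min, but that figure is reverse-engineered here, not documented:

```python
def flight_time(distance_miles, avg_speed_mph=512):
    """Estimate (hours, minutes) at an assumed average block speed."""
    total_minutes = round(distance_miles / avg_speed_mph * 60)
    return divmod(total_minutes, 60)

hours, minutes = flight_time(8115)
print(f"{hours} h {minutes} min")  # 15 h 51 min with the assumed speed
```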

Flight carbon footprint between Bangui M'Poko International Airport (BGF) and Lopez Island Airport (LPS)

On average, flying from Bangui to Lopez generates about 1 015 kg of CO2 per passenger; 1 015 kilograms is roughly 2 238 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
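The kilograms-to-pounds conversion can be checked with the standard factor of about 2.20462 lb per kg:

```python
KG_TO_LB = 2.20462262  # pounds per kilogram

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg * KG_TO_LB

print(round(kg_to_lb(1015)))  # 2238
```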

Map of flight path from Bangui to Lopez

See the map of the shortest flight path between Bangui M'Poko International Airport (BGF) and Lopez Island Airport (LPS).

Airport information

Origin Bangui M'Poko International Airport
City: Bangui
Country: Central African Republic
IATA Code: BGF
ICAO Code: FEFF
Coordinates: 4°23′54″N, 18°31′7″E
Destination Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W