
How far is Lopez, WA, from Belo Horizonte?

The distance between Belo Horizonte (Belo Horizonte Tancredo Neves International Airport) and Lopez (Lopez Island Airport) is 6735 miles / 10838 kilometers / 5852 nautical miles.

Belo Horizonte Tancredo Neves International Airport – Lopez Island Airport
6735 miles / 10838 kilometers / 5852 nautical miles


Distance from Belo Horizonte to Lopez

There are several ways to calculate the distance from Belo Horizonte to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 6734.700 miles
  • 10838.449 kilometers
  • 5852.294 nautical miles
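The three figures above are the same distance expressed in different units. A quick sketch of the conversions, using the exact definitions (1 international mile = 1.609344 km, 1 nautical mile = 1.852 km):

```python
MILE_KM = 1.609344  # exact definition of the international mile
NM_KM = 1.852       # exact definition of the nautical mile

miles = 6734.700
km = miles * MILE_KM    # ~10838.449 km
nm = km / NM_KM         # ~5852.294 nautical miles
print(f"{km:.3f} km / {nm:.3f} NM")
```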

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
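A minimal sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed below converted to decimal degrees (this is an illustrative implementation, not the site's own code):

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns miles."""
    a = 6378137.0           # semi-major axis (m)
    f = 1 / 298.257223563   # flattening
    b = (1 - f) * a         # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    meters = b * A * (sigma - d_sigma)
    return meters / 1609.344  # metres -> miles

# CNF (19°37′27″S, 43°58′18″W) to LPS (48°29′2″N, 122°56′16″W)
print(vincenty_miles(-19.624167, -43.971667, 48.483889, -122.937778))
```

Run against the CNF/LPS coordinates this reproduces the ~6734.7-mile figure quoted above.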

Haversine formula
  • 6742.438 miles
  • 10850.903 kilometers
  • 5859.019 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
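The haversine formula is much simpler. A minimal sketch, assuming a mean Earth radius of 6371 km (the site does not state which radius it uses, so the last digits may differ slightly):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r (km by default)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(h))

# CNF to LPS in decimal degrees -> roughly 10851 km
print(haversine_km(-19.624167, -43.971667, 48.483889, -122.937778))
```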

How long does it take to fly from Belo Horizonte to Lopez?

The estimated flight time from Belo Horizonte Tancredo Neves International Airport to Lopez Island Airport is 13 hours and 15 minutes.
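The page does not state how this estimate is derived, but the quoted time implies an average block speed, which is easy to sanity-check from the distance:

```python
miles = 6735
hours = 13 + 15 / 60          # 13 h 15 min
speed = miles / hours          # implied average speed over the whole flight
print(f"implied average speed: {speed:.0f} mph")
```

That works out to roughly 508 mph, a plausible long-haul average once climb, descent, and winds are folded in.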

Flight carbon footprint between Belo Horizonte Tancredo Neves International Airport (CNF) and Lopez Island Airport (LPS)

On average, flying from Belo Horizonte to Lopez generates about 818 kg of CO2 per passenger; 818 kilograms equals 1,803 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
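The kilograms-to-pounds conversion can be checked with the exact definition of the pound (0.45359237 kg):

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound
co2_kg = 818
co2_lbs = co2_kg / KG_PER_LB
print(round(co2_lbs))  # -> 1803
```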

Map of flight path from Belo Horizonte to Lopez

See the map of the shortest flight path between Belo Horizonte Tancredo Neves International Airport (CNF) and Lopez Island Airport (LPS).

Airport information

Origin Belo Horizonte Tancredo Neves International Airport
City: Belo Horizonte
Country: Brazil
IATA Code: CNF
ICAO Code: SBCF
Coordinates: 19°37′27″S, 43°58′18″W
Destination Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
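The coordinates above are given in degrees/minutes/seconds; distance formulas want signed decimal degrees. A small helper for the conversion:

```python
def dms_to_decimal(deg, minutes, seconds, hemi):
    """Degrees/minutes/seconds + hemisphere letter -> signed decimal degrees."""
    sign = -1 if hemi in "SW" else 1  # south and west are negative
    return sign * (deg + minutes / 60 + seconds / 3600)

# CNF: 19°37′27″S, 43°58′18″W
print(dms_to_decimal(19, 37, 27, "S"), dms_to_decimal(43, 58, 18, "W"))
# LPS: 48°29′2″N, 122°56′16″W
print(dms_to_decimal(48, 29, 2, "N"), dms_to_decimal(122, 56, 16, "W"))
```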