Air Miles Calculator

How far is Lopez, WA, from Pointe-à-Pitre?

The distance between Pointe-à-Pitre (Pointe-à-Pitre International Airport) and Lopez (Lopez Island Airport) is 4083 miles / 6571 kilometers / 3548 nautical miles.

Pointe-à-Pitre International Airport – Lopez Island Airport

  • 4083 miles
  • 6571 kilometers
  • 3548 nautical miles


Distance from Pointe-à-Pitre to Lopez

There are several ways to calculate the distance from Pointe-à-Pitre to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 4082.884 miles
  • 6570.765 kilometers
  • 3547.929 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
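As a sketch of how Vincenty's inverse method can be implemented (a minimal Python version on the WGS-84 ellipsoid; the iteration limit and convergence tolerance here are illustrative choices, not taken from this site):

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres via Vincenty's inverse formula (WGS-84)."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate lambda until convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0            # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)   # equatorial line
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)   # metres
```

Called with the decimal coordinates of PTP and LPS from the airport table below, this yields roughly 6571 km, in line with the figure above.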

Haversine formula
  • 4080.146 miles
  • 6566.359 kilometers
  • 3545.550 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
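The haversine calculation is compact enough to show in full (a minimal Python sketch, assuming a mean Earth radius of 6371 km):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km between two points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))
```

With the PTP and LPS coordinates from the airport table below, this gives about 6566 km, matching the haversine figure above; the small difference from the Vincenty result reflects the spherical versus ellipsoidal Earth model.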

How long does it take to fly from Pointe-à-Pitre to Lopez?

The estimated flight time from Pointe-à-Pitre International Airport to Lopez Island Airport is 8 hours and 13 minutes.
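A back-of-the-envelope way to produce this kind of estimate (the site's actual model is not published; assuming an average block speed of about 800 km/h, which happens to reproduce the figure above):

```python
def flight_time(distance_km: float, block_speed_kmh: float = 800.0):
    """Rough flight-time estimate as (hours, minutes) at an assumed block speed."""
    hours = distance_km / block_speed_kmh
    h = int(hours)
    m = round((hours - h) * 60)
    if m == 60:            # guard against rounding up to a full hour
        h, m = h + 1, 0
    return h, m

# 6571 km at ~800 km/h -> about 8 h 13 min
```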

Flight carbon footprint between Pointe-à-Pitre International Airport (PTP) and Lopez Island Airport (LPS)

On average, flying from Pointe-à-Pitre to Lopez generates about 467 kg of CO2 per passenger, which is roughly 1,029 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
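The kilogram-to-pound conversion behind that figure is straightforward (a minimal sketch; the per-passenger emission estimate itself depends on aircraft and load-factor assumptions the site does not publish):

```python
LB_PER_KG = 2.20462  # pounds per kilogram

def kg_to_lb(kg: float) -> float:
    """Convert a mass in kilograms to pounds."""
    return kg * LB_PER_KG

# 467 kg over a 6571 km flight works out to roughly 71 g of CO2
# per passenger-kilometre.
```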

Map of flight path from Pointe-à-Pitre to Lopez

See the map of the shortest flight path between Pointe-à-Pitre International Airport (PTP) and Lopez Island Airport (LPS).

Airport information

Origin: Pointe-à-Pitre International Airport
City: Pointe-à-Pitre
Country: Guadeloupe
IATA Code: PTP
ICAO Code: TFFR
Coordinates: 16°15′55″N, 61°31′54″W
Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
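The coordinates above are given in degrees/minutes/seconds; the distance formulas earlier expect signed decimal degrees. The conversion is a short helper (a minimal sketch):

```python
def dms_to_decimal(degrees: int, minutes: int, seconds: float,
                   hemisphere: str) -> float:
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees (S and W are negative)."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# PTP: 16°15′55″N, 61°31′54″W -> about (16.2653, -61.5317)
```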