How far is Papa Westray from Windsor Locks, CT?

The distance between Windsor Locks (Bradley International Airport) and Papa Westray (Papa Westray Airport) is 3124 miles / 5027 kilometers / 2714 nautical miles.

Bradley International Airport – Papa Westray Airport

Distance from Windsor Locks to Papa Westray

There are several ways to calculate the distance from Windsor Locks to Papa Westray. Here are two standard methods:

Vincenty's formula (applied above)
  • 3123.645 miles
  • 5027.019 kilometers
  • 2714.373 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
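For readers who want to reproduce the ellipsoidal figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name and the decimal coordinates (converted from the DMS values in the airport information below) are our own; small differences from the number above can come from the ellipsoid parameters and convergence tolerance used.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns km."""
    a = 6378137.0                # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563        # WGS-84 flattening
    b = (1 - f) * a

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):         # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0           # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                   * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000

# BDL and PPW in decimal degrees (from the DMS coordinates below)
print(f"{vincenty_km(41.93889, -72.68306, 59.35167, -2.90028):.1f} km")
# → about 5027 km, matching the figure above
```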

Haversine formula
  • 3115.281 miles
  • 5013.558 kilometers
  • 2707.105 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
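The spherical figure is easy to verify. Below is a minimal haversine sketch, assuming a mean earth radius of 6371 km (a different radius choice moves the result by a few kilometers); the coordinates are the decimal equivalents of the DMS values listed under airport information.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# BDL and PPW in decimal degrees (west longitude is negative)
print(f"{haversine_km(41.93889, -72.68306, 59.35167, -2.90028):.1f} km")
# → about 5014 km, in line with the haversine figure above
```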

How long does it take to fly from Windsor Locks to Papa Westray?

The estimated flight time from Bradley International Airport to Papa Westray Airport is 6 hours and 24 minutes.
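The page does not state how this estimate is derived. A common approach is distance divided by an assumed block-average speed; the sketch below uses a hypothetical average of 488 mph, chosen only because it reproduces the 6 hours 24 minutes figure for this route.

```python
# Hypothetical estimate: distance / assumed block-average speed.
# 488 mph is a back-calculated assumption, not the site's actual
# constant, which is not published here.
distance_mi = 3123.645
avg_speed_mph = 488.0
hours = distance_mi / avg_speed_mph
h = int(hours)
m = round((hours - h) * 60)
print(f"{h} h {m} min")  # → 6 h 24 min
```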

Flight carbon footprint between Bradley International Airport (BDL) and Papa Westray Airport (PPW)

On average, flying from Windsor Locks to Papa Westray generates about 349 kg of CO2 per passenger (roughly 769 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
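Again, the exact methodology is not published on this page, but a simple per-mile emission factor reproduces the numbers. The factor in the sketch below is a back-calculated assumption, not an official coefficient.

```python
# Back-calculated sketch: distance × an assumed per-passenger emission
# factor. 0.1117 kg CO2 per passenger-mile reproduces the 349 kg above;
# it is an illustration, not an official coefficient.
distance_mi = 3123.645
kg_per_pax_mile = 0.1117
co2_kg = distance_mi * kg_per_pax_mile   # ≈ 349 kg
co2_lb = co2_kg * 2.20462                # kg → lb, ≈ 769 lbs
print(f"{co2_kg:.0f} kg ≈ {co2_lb:.0f} lbs")
```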

Map of flight path from Windsor Locks to Papa Westray

See the map of the shortest flight path between Bradley International Airport (BDL) and Papa Westray Airport (PPW).

Airport information

Origin: Bradley International Airport
City: Windsor Locks, CT
Country: United States
IATA Code: BDL
ICAO Code: KBDL
Coordinates: 41°56′20″N, 72°40′59″W
Destination: Papa Westray Airport
City: Papa Westray
Country: United Kingdom
IATA Code: PPW
ICAO Code: EGEP
Coordinates: 59°21′6″N, 2°54′1″W
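The distance snippets earlier on this page use decimal degrees. This small hypothetical helper shows the conversion from the DMS coordinates listed above (south and west hemispheres are negative).

```python
# Hypothetical helper: degrees/minutes/seconds → signed decimal degrees.
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

bdl = (dms_to_decimal(41, 56, 20, "N"), dms_to_decimal(72, 40, 59, "W"))
ppw = (dms_to_decimal(59, 21, 6, "N"), dms_to_decimal(2, 54, 1, "W"))
print(bdl)  # ≈ (41.93889, -72.68306)
print(ppw)  # ≈ (59.35167, -2.90028)
```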