
How far is St. Lewis from Barcelona?

The distance between Barcelona (Barcelona–El Prat Airport) and St. Lewis (St. Lewis (Fox Harbour) Airport) is 2762 miles / 4446 kilometers / 2401 nautical miles.

Barcelona–El Prat Airport – St. Lewis (Fox Harbour) Airport

Distance: 2762 miles / 4446 kilometers / 2401 nautical miles
Flight time: 5 h 43 min
Time difference: 4 h 30 min
CO2 emission: 306 kg
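
The 4 h 30 min gap is the offset between Central European Summer Time in Barcelona (UTC+2) and Newfoundland Daylight Time (UTC−2:30), which St. Lewis observes as part of southeastern Labrador. A quick check with Python's zoneinfo, using the standard IANA zone names (the sample date is arbitrary):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# The same instant viewed from both zones; the offset gap is the time difference.
t = datetime(2024, 7, 1, 12, 0, tzinfo=timezone.utc)
bcn = t.astimezone(ZoneInfo("Europe/Madrid"))     # CEST, UTC+2 in summer
yfx = t.astimezone(ZoneInfo("America/St_Johns"))  # NDT, UTC-2:30 in summer
print(bcn.utcoffset() - yfx.utcoffset())          # 4:30:00
```

Both zones observe daylight saving, so the gap holds for most of the year, drifting only during the few weeks when the European and North American transition dates are out of step.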

Distance from Barcelona to St. Lewis

There are several ways to calculate the distance from Barcelona to St. Lewis. Here are two standard methods:

Vincenty's formula (applied above)
  • 2762.458 miles
  • 4445.746 kilometers
  • 2400.511 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
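
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid (the convergence tolerance and iteration cap below are our own implementation choices, and this page's code may differ):

```python
import math

def vincenty(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0                       # semi-major axis (m)
    f = 1 / 298.257223563               # flattening
    b = (1 - f) * a                     # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                # iterate lambda to convergence
        sinL, cosL = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sinL, cosU1 * sinU2 - sinU1 * cosU2 * cosL)
        if sin_sigma == 0:
            return 0.0                  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cosL
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinL / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev, lam = lam, L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (2 * cos_2sm ** 2 - 1)
        - B / 6 * cos_2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
    return b * A * (sigma - d_sigma)

# BCN and YFX coordinates from the airport information section below
print(vincenty(41.296944, 2.078333, 52.372778, -55.673889) / 1000)  # ≈ 4445.7 km
```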

Haversine formula
  • 2755.161 miles
  • 4434.002 kilometers
  • 2394.169 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
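
The haversine version is short enough to quote in full; the 6371 km mean Earth radius used here is a common convention, though the page does not state which radius it assumes:

```python
import math

def haversine(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance in kilometres on a sphere of radius r_km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r_km * math.asin(math.sqrt(a))

print(haversine(41.296944, 2.078333, 52.372778, -55.673889))  # ≈ 4434 km
```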

How long does it take to fly from Barcelona to St. Lewis?

The estimated flight time from Barcelona–El Prat Airport to St. Lewis (Fox Harbour) Airport is 5 hours and 43 minutes.
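
The site does not publish its timing model, but estimates like this are typically cruise time plus a fixed allowance for takeoff and landing. A sketch with assumed constants (about 850 km/h cruise and a 30-minute allowance, both guesses) lands within a minute of the quoted figure:

```python
def flight_time_hours(distance_km, cruise_kmh=850.0, overhead_h=0.5):
    """Cruise time plus a fixed takeoff/landing allowance (assumed constants)."""
    return distance_km / cruise_kmh + overhead_h

hours = flight_time_hours(4445.746)
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # 5 h 44 min
```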

Flight carbon footprint between Barcelona–El Prat Airport (BCN) and St. Lewis (Fox Harbour) Airport (YFX)

On average, flying from Barcelona to St. Lewis generates about 306 kg of CO2 per passenger (306 kilograms equals 675 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
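
The pound conversion is exact arithmetic, and the per-passenger figure implies an emission factor of roughly 0.069 kg of CO2 per passenger-kilometre. Note that this factor is back-calculated from the page's own numbers, not an official constant:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 306
print(f"{co2_kg / KG_PER_LB:.0f} lb")  # 675 lb, matching the figure above

# Emission factor implied by this page's numbers (back-calculated, not official)
print(f"{co2_kg / 4445.746:.4f} kg CO2 per passenger-km")  # 0.0688
```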

Map of flight path from Barcelona to St. Lewis

See the map of the shortest flight path between Barcelona–El Prat Airport (BCN) and St. Lewis (Fox Harbour) Airport (YFX).
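
To draw a path like this yourself, spherical linear interpolation between the endpoint vectors yields evenly spaced waypoints along the great circle. A sketch follows; the 50-point resolution is arbitrary, and a spherical model is assumed for plotting purposes:

```python
import math

def great_circle_path(lat1, lon1, lat2, lon2, n=50):
    """Return n (lat, lon) waypoints along the great circle between two points."""
    def to_vec(lat, lon):
        lat, lon = math.radians(lat), math.radians(lon)
        return (math.cos(lat) * math.cos(lon), math.cos(lat) * math.sin(lon), math.sin(lat))

    p, q = to_vec(lat1, lon1), to_vec(lat2, lon2)
    omega = math.acos(sum(a * b for a, b in zip(p, q)))  # central angle
    points = []
    for i in range(n):
        t = i / (n - 1)
        # Spherical linear interpolation keeps each waypoint on the unit sphere
        f = math.sin((1 - t) * omega) / math.sin(omega)
        g = math.sin(t * omega) / math.sin(omega)
        x, y, z = (f * a + g * b for a, b in zip(p, q))
        points.append((math.degrees(math.asin(z)), math.degrees(math.atan2(y, x))))
    return points

waypoints = great_circle_path(41.296944, 2.078333, 52.372778, -55.673889)
```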

Airport information

Origin: Barcelona–El Prat Airport
City: Barcelona
Country: Spain
IATA Code: BCN
ICAO Code: LEBL
Coordinates: 41°17′49″N, 2°4′42″E
Destination: St. Lewis (Fox Harbour) Airport
City: St. Lewis
Country: Canada
IATA Code: YFX
ICAO Code: CCK4
Coordinates: 52°22′22″N, 55°40′26″W
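
The coordinates above are in degrees, minutes, and seconds, while the formulas earlier expect decimal degrees. The conversion is simple; the helper below is our own, not part of the site, and reproduces the inputs used in the distance sketches:

```python
def dms_to_decimal(deg, minute, sec, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minute / 60 + sec / 3600)

bcn = (dms_to_decimal(41, 17, 49, "N"), dms_to_decimal(2, 4, 42, "E"))
yfx = (dms_to_decimal(52, 22, 22, "N"), dms_to_decimal(55, 40, 26, "W"))
print(bcn)  # (41.29694..., 2.07833...)
print(yfx)  # (52.37277..., -55.67388...)
```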