
How far is St. Lewis from Hamilton?

The distance between Hamilton (L.F. Wade International Airport) and St. Lewis (St. Lewis (Fox Harbour) Airport) is 1453 miles / 2339 kilometers / 1263 nautical miles.

L.F. Wade International Airport – St. Lewis (Fox Harbour) Airport

Distance: 1453 miles / 2339 kilometers / 1263 nautical miles
Flight time: 3 h 15 min
CO2 emission: 177 kg


Distance from Hamilton to St. Lewis

There are several ways to calculate the distance from Hamilton to St. Lewis. Here are two standard methods:

Vincenty's formula (applied above)
  • 1453.254 miles
  • 2338.786 kilometers
  • 1262.843 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
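The site doesn't say which implementation it uses beyond naming Vincenty's formula. As a sketch, the same ellipsoidal figure can be reproduced with pyproj's Geod, which solves the inverse geodesic problem on the WGS-84 ellipsoid (Karney's algorithm; it agrees with Vincenty's to well under a millimetre at this range). The decimal coordinates are converted from the DMS values listed under "Airport information" below.

    from pyproj import Geod

    geod = Geod(ellps="WGS84")  # WGS-84 ellipsoidal model of the Earth

    # BDA (Hamilton) and YFX (St. Lewis) in decimal degrees,
    # converted from 32°21′50″N 64°40′43″W and 52°22′22″N 55°40′26″W
    lon_bda, lat_bda = -64.678611, 32.363889
    lon_yfx, lat_yfx = -55.673889, 52.372778

    # inv() returns forward azimuth, back azimuth, and distance in metres
    _, _, metres = geod.inv(lon_bda, lat_bda, lon_yfx, lat_yfx)
    print(metres / 1000)  # ≈ 2338.8 km, matching the ellipsoidal figure above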

Haversine formula
  • 1454.229 miles
  • 2340.354 kilometers
  • 1263.690 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
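For comparison, a minimal pure-Python haversine using the conventional mean Earth radius of 6371 km (the site's exact radius isn't stated, so that value is an assumption):

    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
        """Great-circle distance between two points on a sphere of radius r_km."""
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlam = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
        return 2 * r_km * asin(sqrt(a))

    # Hamilton (BDA) to St. Lewis (YFX)
    print(haversine_km(32.363889, -64.678611, 52.372778, -55.673889))  # ≈ 2340 km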

How long does it take to fly from Hamilton to St. Lewis?

The estimated flight time from L.F. Wade International Airport to St. Lewis (Fox Harbour) Airport is 3 hours and 15 minutes.
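The page doesn't publish its timing model. A minimal sketch that reproduces the figure assumes an average block speed of roughly 447 mph, back-solved from 1453 miles over 3 h 15 min (the speed is an assumption, not the site's documented parameter):

    def flight_time(distance_miles, avg_speed_mph=447.0):
        """Rough block time from distance and an assumed average speed."""
        hours = distance_miles / avg_speed_mph
        h = int(hours)
        m = round((hours - h) * 60)
        return h, m

    h, m = flight_time(1453)
    print(f"{h} h {m} min")  # ≈ 3 h 15 min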

Flight carbon footprint between L.F. Wade International Airport (BDA) and St. Lewis (Fox Harbour) Airport (YFX)

On average, flying from Hamilton to St. Lewis generates about 177 kg of CO2 per passenger; 177 kilograms equals about 390 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
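The emission model isn't published either; back-solving from the figures above gives roughly 0.122 kg of CO2 per passenger-mile. A sketch applying that factor (an assumption derived from this page's own numbers, not a documented methodology):

    KG_PER_PASSENGER_MILE = 0.1218  # back-solved: 177 kg / 1453.254 miles

    def co2_estimate_kg(distance_miles):
        """Per-passenger CO2 estimate from jet-fuel burn only."""
        return distance_miles * KG_PER_PASSENGER_MILE

    kg = co2_estimate_kg(1453.254)   # ≈ 177 kg
    lbs = kg * 2.20462               # ≈ 390 lbs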

Map of flight path from Hamilton to St. Lewis

See the map of the shortest flight path between L.F. Wade International Airport (BDA) and St. Lewis (Fox Harbour) Airport (YFX).

Airport information

Origin: L.F. Wade International Airport
City: Hamilton
Country: Bermuda
IATA Code: BDA
ICAO Code: TXKF
Coordinates: 32°21′50″N, 64°40′43″W
Destination: St. Lewis (Fox Harbour) Airport
City: St. Lewis
Country: Canada
IATA Code: YFX
ICAO Code: CCK4
Coordinates: 52°22′22″N, 55°40′26″W