
How far is Lebanon, NH, from Hamilton?

The distance between Hamilton (L.F. Wade International Airport) and Lebanon (Lebanon Municipal Airport (New Hampshire)) is 880 miles / 1417 kilometers / 765 nautical miles.

L.F. Wade International Airport – Lebanon Municipal Airport (New Hampshire)

880 miles / 1417 kilometers / 765 nautical miles


Distance from Hamilton to Lebanon

There are several ways to calculate the distance from Hamilton to Lebanon. Here are two standard methods:

Vincenty's formula (applied above)
  • 880.277 miles
  • 1416.668 kilometers
  • 764.940 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
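For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name, iteration cap, and convergence tolerance are illustrative choices, and the coincident-point and near-antipodal edge cases are omitted:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres (WGS-84), Vincenty inverse method.

    Minimal sketch: coincident-point, equatorial, and near-antipodal
    edge cases are not handled.
    """
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate until lambda converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
        * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# BDA (32°21′50″N, 64°40′43″W) to LEB (43°37′33″N, 72°18′15″W)
metres = vincenty_distance(32.363889, -64.678611, 43.625833, -72.304167)
print(metres / 1000, "km")         # ≈ 1416.7 km, matching the figure above
print(metres / 1609.344, "miles")  # ≈ 880.3 miles
```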

Haversine formula
  • 881.036 miles
  • 1417.889 kilometers
  • 765.599 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
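Here is a minimal Python sketch of the haversine formula, assuming the conventional mean Earth radius of 6,371 km (the exact radius used above is not stated, so the last decimal may differ):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return radius_km * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

km = haversine_distance(32.363889, -64.678611, 43.625833, -72.304167)
print(km, "km")                   # ≈ 1417.9 km, matching the figure above
print(km / 1.609344, "miles")     # ≈ 881.0 miles
print(km / 1.852, "nm")           # ≈ 765.6 nautical miles
```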

How long does it take to fly from Hamilton to Lebanon?

The estimated flight time from L.F. Wade International Airport to Lebanon Municipal Airport (New Hampshire) is 2 hours and 10 minutes.
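The calculator does not publish its timing formula. A common rule of thumb is a cruise speed of about 500 mph plus roughly 30 minutes of overhead for takeoff and landing; the sketch below uses those assumed parameters and lands close to, but not exactly on, the figure above:

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time heuristic: cruise time plus fixed overhead.

    cruise_mph and overhead_min are assumptions, not the site's
    published parameters.
    """
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours and {minutes} minutes"

print(estimated_flight_time(880))  # "2 hours and 16 minutes", near the quoted 2 h 10 min
```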

Flight carbon footprint between L.F. Wade International Airport (BDA) and Lebanon Municipal Airport (New Hampshire) (LEB)

On average, flying from Hamilton to Lebanon generates about 142 kg of CO2 per passenger; 142 kilograms equals 313 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
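The quoted 142 kg corresponds to roughly 0.10 kg of CO2 per passenger-kilometre over the 1417 km route. Here is a sketch under that assumed emission factor (not the calculator's published methodology):

```python
KG_PER_LB = 0.45359237   # exact kilograms per pound

def co2_per_passenger_kg(distance_km, factor_kg_per_km=0.10):
    """Jet-fuel CO2 per passenger; the 0.10 kg/km factor is an assumption
    inferred from the 142 kg / 1417 km figures above, not the site's value."""
    return distance_km * factor_kg_per_km

kg = round(co2_per_passenger_kg(1417))
print(kg, "kg")                      # 142 kg
print(round(kg / KG_PER_LB), "lbs")  # 313 lbs
```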

Map of flight path from Hamilton to Lebanon

See the map of the shortest flight path between L.F. Wade International Airport (BDA) and Lebanon Municipal Airport (New Hampshire) (LEB).

Airport information

Origin: L.F. Wade International Airport
City: Hamilton
Country: Bermuda
IATA Code: BDA
ICAO Code: TXKF
Coordinates: 32°21′50″N, 64°40′43″W

Destination: Lebanon Municipal Airport (New Hampshire)
City: Lebanon, NH
Country: United States
IATA Code: LEB
ICAO Code: KLEB
Coordinates: 43°37′33″N, 72°18′15″W
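The coordinates above are given in degrees, minutes, and seconds, while the distance formulas expect decimal degrees. A small conversion sketch (southern and western hemispheres take negative signs):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to decimal degrees.

    'S' and 'W' hemispheres yield negative values.
    """
    dd = degrees + minutes / 60 + seconds / 3600
    return -dd if hemisphere in ("S", "W") else dd

# L.F. Wade International Airport (BDA): 32°21′50″N, 64°40′43″W
print(dms_to_decimal(32, 21, 50, "N"), dms_to_decimal(64, 40, 43, "W"))
# -> 32.3638..., -64.6786...

# Lebanon Municipal Airport (LEB): 43°37′33″N, 72°18′15″W
print(dms_to_decimal(43, 37, 33, "N"), dms_to_decimal(72, 18, 15, "W"))
# -> 43.6258..., -72.3041...
```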