
How far is North Platte, NE, from Hamilton?

The distance between Hamilton (L.F. Wade International Airport) and North Platte (North Platte Regional Airport) is 2071 miles / 3333 kilometers / 1800 nautical miles.

L.F. Wade International Airport – North Platte Regional Airport



Distance from Hamilton to North Platte

There are several ways to calculate the distance from Hamilton to North Platte. Here are two standard methods:

Vincenty's formula (applied above)
  • 2070.840 miles
  • 3332.693 kilometers
  • 1799.510 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
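The site does not publish its implementation, but the standard Vincenty inverse method on the WGS-84 ellipsoid can be sketched as follows; the function name and structure here are illustrative, not the calculator's actual code. Run on the airport coordinates listed below, it reproduces the ellipsoidal distance quoted above to within rounding.

```python
import math

def vincenty_m(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0                     # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563             # WGS-84 flattening
    b = (1 - f) * a                   # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):              # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                    * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# BDA -> LBF, coordinates from the airport information section
meters = vincenty_m(32.363889, -64.678611, 41.126111, -100.683889)
print(round(meters / 1000, 1), "km")
```

This basic form of the iteration is fine for this route but can fail to converge for nearly antipodal points, which is why production geodesy libraries add special-case handling.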

Haversine formula
  • 2066.814 miles
  • 3326.215 kilometers
  • 1796.012 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
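The haversine calculation is short enough to show in full. A minimal sketch (function name is illustrative), using a mean Earth radius of 6371 km and the airport coordinates listed below, reproduces the haversine figures above (about 3326 km / 2067 miles):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r (km by default)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# BDA -> LBF, coordinates from the airport information section
km = haversine_km(32.363889, -64.678611, 41.126111, -100.683889)
print(round(km, 1), "km,", round(km / 1.609344, 1), "miles")
```

The small gap between this result and the Vincenty figure (about 6 km on this route) is the cost of treating the Earth as a perfect sphere.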

How long does it take to fly from Hamilton to North Platte?

The estimated flight time from L.F. Wade International Airport to North Platte Regional Airport is 4 hours and 25 minutes.
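The page does not state how it derives this estimate. A common rule of thumb, used here purely as an assumption, is an average airliner cruise speed of about 500 mph plus a fixed half-hour allowance for taxi, takeoff, and landing; on this route that gives roughly 4 hours 39 minutes, in the same ballpark as the figure above:

```python
# Assumed parameters, not the site's published method
CRUISE_MPH = 500.0   # typical commercial cruise speed
BUFFER_H = 0.5       # taxi, takeoff, and landing allowance

def flight_time(distance_miles):
    """Rough block-time estimate: cruise time plus a fixed buffer."""
    total_h = distance_miles / CRUISE_MPH + BUFFER_H
    hours = int(total_h)
    minutes = round((total_h - hours) * 60)
    return f"{hours} h {minutes} min"

print(flight_time(2071))  # → 4 h 39 min
```

Winds aloft, routing, and aircraft type all shift real block times, so estimates like this are only indicative.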

Flight carbon footprint between L.F. Wade International Airport (BDA) and North Platte Regional Airport (LBF)

On average, flying from Hamilton to North Platte generates about 225 kg (497 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
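The figures above imply a roughly linear per-passenger emission factor of about 0.109 kg of CO2 per mile (225 kg over 2071 miles). The sketch below simply applies that implied factor; it ignores cabin class, load factor, and aircraft type, all of which change real per-passenger emissions:

```python
# Per-passenger factor implied by the figures above: 225 kg over 2071 miles
KG_PER_PASSENGER_MILE = 225 / 2071   # ≈ 0.109 kg CO2 per mile

def co2_kg(distance_miles):
    """Linear fuel-burn estimate using the implied per-mile factor."""
    return distance_miles * KG_PER_PASSENGER_MILE

print(round(co2_kg(2071)), "kg CO2 per passenger")  # → 225 kg CO2 per passenger
```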

Map of flight path from Hamilton to North Platte

See the map of the shortest flight path between L.F. Wade International Airport (BDA) and North Platte Regional Airport (LBF).

Airport information

Origin: L.F. Wade International Airport
City: Hamilton
Country: Bermuda
IATA Code: BDA
ICAO Code: TXKF
Coordinates: 32°21′50″N, 64°40′43″W
Destination: North Platte Regional Airport
City: North Platte, NE
Country: United States
IATA Code: LBF
ICAO Code: KLBF
Coordinates: 41°7′34″N, 100°41′2″W