
How far is Lutselk'e from West Palm Beach, FL?

The distance between West Palm Beach (Palm Beach International Airport) and Lutselk'e (Lutselk'e Airport) is 2837 miles / 4565 kilometers / 2465 nautical miles.

The driving distance from West Palm Beach (PBI) to Lutselk'e (YSG) is 4118 miles / 6627 kilometers, and travel time by car is about 79 hours 35 minutes.

Palm Beach International Airport – Lutselk'e Airport

2837 miles / 4565 kilometers / 2465 nautical miles


Distance from West Palm Beach to Lutselk'e

There are several ways to calculate the distance from West Palm Beach to Lutselk'e. Here are two standard methods:

Vincenty's formula (applied above)
  • 2836.572 miles
  • 4565.020 kilometers
  • 2464.914 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
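For readers who want to reproduce the figure, below is a sketch of Vincenty's iterative inverse solution on the WGS-84 ellipsoid, using the decimal equivalents of the airport coordinates listed at the bottom of this page. It illustrates the method; it is not necessarily the exact implementation used above.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    # WGS-84 ellipsoid parameters
    a = 6378137.0           # semi-major axis (m)
    f = 1 / 298.257223563   # flattening
    b = (1 - f) * a         # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until it converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        # guard against division by zero for equatorial lines
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)  # geodesic distance in metres

# PBI (26°40′59″N, 80°5′44″W) to YSG (62°25′5″N, 110°40′55″W)
meters = vincenty_inverse(26.683056, -80.095556, 62.418056, -110.681944)
print(meters / 1609.344)  # ≈ 2836.6 miles
```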

Haversine formula
  • 2836.119 miles
  • 4564.291 kilometers
  • 2464.520 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
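A minimal haversine sketch, assuming a mean Earth radius of 3,958.8 miles (the radius used for the figure above is not stated, so the last decimals may differ slightly):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    R = 3958.8  # mean Earth radius in miles (assumed value)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# Same airport coordinates in decimal degrees
print(haversine_miles(26.683056, -80.095556, 62.418056, -110.681944))  # ≈ 2836 miles
```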

How long does it take to fly from West Palm Beach to Lutselk'e?

The estimated flight time from Palm Beach International Airport to Lutselk'e Airport is 5 hours and 52 minutes.
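A common rule of thumb for such estimates (an assumption here; the exact parameters behind the figure above are not published) is about 30 minutes for taxi, climb, and descent plus cruise at an average ground speed of roughly 500 mph. A minimal sketch:

```python
# Rule-of-thumb flight-time estimate (assumed parameters, not the site's formula):
# ~30 min overhead for taxi/climb/descent plus cruise at ~500 mph average.
distance_miles = 2837
hours = 0.5 + distance_miles / 500
h = int(hours)
m = round((hours - h) * 60)
print(f"about {h} h {m} min")  # ~6 h 10 min; the 5 h 52 min above implies a faster average speed
```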

What is the time difference between West Palm Beach and Lutselk'e?

The time difference between West Palm Beach and Lutselk'e is 2 hours. Lutselk'e is on Mountain Time and West Palm Beach is on Eastern Time, so Lutselk'e is 2 hours behind West Palm Beach year-round.

Flight carbon footprint between Palm Beach International Airport (PBI) and Lutselk'e Airport (YSG)

On average, flying from West Palm Beach to Lutselk'e generates about 315 kg (694 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
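As a quick check, here is the kilogram-to-pound conversion and the per-mile emission factor implied by these numbers (the factor is derived here, not published by the source):

```python
KG_PER_LB = 0.45359237  # exact definition of the pound in kilograms

co2_kg = 315
distance_miles = 2837

print(round(co2_kg / KG_PER_LB))          # 694 lbs
print(round(co2_kg / distance_miles, 3))  # ~0.111 kg CO2 per passenger-mile (implied)
```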

Map of flight path and driving directions from West Palm Beach to Lutselk'e

See the map of the shortest flight path between Palm Beach International Airport (PBI) and Lutselk'e Airport (YSG).

Airport information

Origin: Palm Beach International Airport
City: West Palm Beach, FL
Country: United States
IATA Code: PBI
ICAO Code: KPBI
Coordinates: 26°40′59″N, 80°5′44″W
Destination: Lutselk'e Airport
City: Lutselk'e
Country: Canada
IATA Code: YSG
ICAO Code: CYLK
Coordinates: 62°25′5″N, 110°40′55″W
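The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on this page take decimal degrees. A small hypothetical helper for the conversion:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # Southern and western hemispheres get a negative sign.
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(26, 40, 59, "N"), dms_to_decimal(80, 5, 44, "W"))    # PBI ≈ 26.6831, -80.0956
print(dms_to_decimal(62, 25, 5, "N"), dms_to_decimal(110, 40, 55, "W"))   # YSG ≈ 62.4181, -110.6819
```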