
How far is West Palm Beach, FL, from Fort St. John?

The distance between Fort St. John (Fort St. John Airport) and West Palm Beach (Palm Beach International Airport) is 2862 miles / 4607 kilometers / 2487 nautical miles.

The driving distance from Fort St. John (YXJ) to West Palm Beach (PBI) is 3390 miles / 5455 kilometers, and travel time by car is about 62 hours 50 minutes.

Fort St. John Airport – Palm Beach International Airport: 2862 miles / 4607 kilometers / 2487 nautical miles


Distance from Fort St. John to West Palm Beach

There are several ways to calculate the distance from Fort St. John to West Palm Beach. Here are two standard methods:

Vincenty's formula (applied above)
  • 2862.414 miles
  • 4606.608 kilometers
  • 2487.369 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
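
For reference, here is a minimal Python sketch of the standard Vincenty inverse iteration on the WGS-84 ellipsoid (an assumption; the page does not say which implementation or ellipsoid parameters it uses):

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    # WGS-84 ellipsoid (assumed; the page does not state its parameters)
    a = 6378137.0              # semi-major axis in metres
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L  # iterate on the difference in longitude on the auxiliary sphere
    for _ in range(max_iter):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (sigma + C * sinSigma *
              (cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break

    u2 = cos2Alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 *
        (cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
         B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma) / 1609.344  # metres -> statute miles

# YXJ and PBI in decimal degrees (converted from the DMS coordinates below)
print(vincenty_miles(56.2381, -120.7397, 26.6831, -80.0956))  # ≈ 2862.4
```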

Haversine formula
  • 2860.551 miles
  • 4603.610 kilometers
  • 2485.751 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
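
A minimal Python sketch of the haversine formula, assuming a mean Earth radius of 3958.8 miles (the page does not state which radius it uses):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in statute miles on a spherical Earth."""
    R = 3958.8  # assumed mean Earth radius in miles (6371.0 km)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

print(haversine_miles(56.2381, -120.7397, 26.6831, -80.0956))  # ≈ 2860.6
```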

How long does it take to fly from Fort St. John to West Palm Beach?

The estimated flight time from Fort St. John Airport to Palm Beach International Airport is 5 hours and 55 minutes.
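
A figure like this typically comes from a fixed taxi/climb overhead plus time at cruise speed; the sketch below assumes 500 mph and 30 minutes, which are illustrative guesses rather than the site's published parameters (its 5 hours 55 minutes implies slightly different values):

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Block-time estimate: fixed taxi/climb overhead plus time at cruise.
    Both parameters are assumptions, not the site's published values."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

print(estimate_flight_time(2862))  # "6 hours 13 minutes" under these assumptions
```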

Flight carbon footprint between Fort St. John Airport (YXJ) and Palm Beach International Airport (PBI)

On average, flying from Fort St. John to West Palm Beach generates about 318 kg of CO2 per passenger, which equals about 701 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
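
The unit conversion can be checked directly; the per-mile emission factor below is back-calculated from the page's own numbers and is illustrative only:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 318
print(round(co2_kg / KG_PER_LB))   # 701 lbs, matching the text above

# Implied per-passenger emission factor (back-calculated, illustrative only)
print(round(co2_kg / 2862, 3))     # ≈ 0.111 kg of CO2 per mile flown
```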

Map of flight path and driving directions from Fort St. John to West Palm Beach

See the map of the shortest flight path between Fort St. John Airport (YXJ) and Palm Beach International Airport (PBI).

Airport information

Origin: Fort St. John Airport
City: Fort St. John
Country: Canada
IATA Code: YXJ
ICAO Code: CYXJ
Coordinates: 56°14′17″N, 120°44′23″W
Destination: Palm Beach International Airport
City: West Palm Beach, FL
Country: United States
IATA Code: PBI
ICAO Code: KPBI
Coordinates: 26°40′59″N, 80°5′44″W
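
The coordinates above are given in degrees-minutes-seconds, while the distance formulas expect decimal degrees; a small conversion sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Fort St. John Airport (YXJ): 56°14′17″N, 120°44′23″W
print(dms_to_decimal(56, 14, 17, "N"), dms_to_decimal(120, 44, 23, "W"))  # ≈ 56.2381 -120.7397
# Palm Beach International Airport (PBI): 26°40′59″N, 80°5′44″W
print(dms_to_decimal(26, 40, 59, "N"), dms_to_decimal(80, 5, 44, "W"))    # ≈ 26.6831 -80.0956
```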