
How far is Fort St. John from Pointe-à-Pitre?

The distance between Pointe-à-Pitre (Pointe-à-Pitre International Airport) and Fort St. John (Fort St. John Airport) is 4120 miles / 6630 kilometers / 3580 nautical miles.

Pointe-à-Pitre International Airport – Fort St. John Airport

4120 miles
6630 kilometers
3580 nautical miles


Distance from Pointe-à-Pitre to Fort St. John

There are several ways to calculate the distance from Pointe-à-Pitre to Fort St. John. Here are two standard methods:

Vincenty's formula (applied above)
  • 4119.722 miles
  • 6630.050 kilometers
  • 3579.941 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
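As a rough illustration of how such a figure is produced, here is a sketch of the standard iterative Vincenty inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed in the airport information section. This is a textbook implementation, not the calculator's own code.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns km."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis (m)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):  # iterate the auxiliary longitude until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# PTP and YXJ in decimal degrees (from the airport information section)
dist = vincenty_km(16.265278, -61.531667, 56.238056, -120.739722)
print(f"{dist:.3f} km")  # ≈ 6630 km, matching the figure above
```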

Haversine formula
  • 4118.310 miles
  • 6627.778 kilometers
  • 3578.714 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
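The haversine computation is much simpler, since it needs no iteration. A minimal sketch, using a mean Earth radius of 6371 km (the exact kilometre figure depends on which radius is chosen):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km on a sphere of radius r."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = phi2 - phi1
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# PTP and YXJ in decimal degrees (from the airport information section)
dist = haversine_km(16.265278, -61.531667, 56.238056, -120.739722)
print(f"{dist:.1f} km")  # ≈ 6628 km, within a km or so of the figure above
```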

How long does it take to fly from Pointe-à-Pitre to Fort St. John?

The estimated flight time from Pointe-à-Pitre International Airport to Fort St. John Airport is 8 hours and 18 minutes.
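The page does not state how this estimate is derived, but a common back-of-the-envelope approach is to divide the distance by an assumed average block speed. The 800 km/h speed below is an assumption for illustration, not the calculator's published method; it happens to land within a minute of the stated 8 h 18 min.

```python
distance_km = 6630.050   # Vincenty distance from above
avg_speed_kmh = 800      # assumed average block speed (hypothetical)

hours = distance_km / avg_speed_kmh
h = int(hours)
m = round((hours - h) * 60)
print(f"{h} h {m} min")  # ≈ 8 h 17 min with these assumptions
```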

Flight carbon footprint between Pointe-à-Pitre International Airport (PTP) and Fort St. John Airport (YXJ)

On average, flying from Pointe-à-Pitre to Fort St. John generates about 471 kg of CO2 per passenger; 471 kilograms equals 1,039 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
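The kilogram-to-pound conversion is fixed by definition (1 lb = 0.45359237 kg exactly), so it can be checked directly. Rounding the already-rounded 471 kg gives a pound figure one below the page's; the page's 1,039 lb presumably comes from the unrounded kilogram value.

```python
KG_PER_LB = 0.45359237   # exact by definition

co2_kg = 471
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))     # ≈ 1038 lb from the rounded 471 kg
```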

Map of flight path from Pointe-à-Pitre to Fort St. John

See the map of the shortest flight path between Pointe-à-Pitre International Airport (PTP) and Fort St. John Airport (YXJ).

Airport information

Origin: Pointe-à-Pitre International Airport
City: Pointe-à-Pitre
Country: Guadeloupe
IATA Code: PTP
ICAO Code: TFFR
Coordinates: 16°15′55″N, 61°31′54″W
Destination: Fort St. John Airport
City: Fort St. John
Country: Canada
IATA Code: YXJ
ICAO Code: CYXJ
Coordinates: 56°14′17″N, 120°44′23″W
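The coordinates above are listed in degrees, minutes, and seconds, while the distance formulas work in decimal degrees. Converting between the two is simple arithmetic; the helper below is a hypothetical illustration, not part of the page.

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees.

    South and west hemispheres are negative by convention.
    """
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# PTP: 16°15′55″N, 61°31′54″W
print(round(dms_to_decimal(16, 15, 55, "N"), 6))  # 16.265278
print(round(dms_to_decimal(61, 31, 54, "W"), 6))  # -61.531667
```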