
How far is Natashquan from West Palm Beach, FL?

The distance between West Palm Beach (Palm Beach International Airport) and Natashquan (Natashquan Airport) is 1890 miles / 3041 kilometers / 1642 nautical miles.

The driving distance from West Palm Beach (PBI) to Natashquan (YNA) is 2353 miles / 3787 kilometers, and the travel time by car is about 47 hours.

Palm Beach International Airport – Natashquan Airport

1890 miles
3041 kilometers
1642 nautical miles
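The three figures above are the same distance expressed in different units. A minimal conversion sketch (the mile-to-kilometer factor is exact by definition; the nautical-mile factor is the standard approximation):

```python
MILES_TO_KM = 1.609344             # exact: 1 statute mile = 1.609344 km
MILES_TO_NAUTICAL = 1 / 1.150779   # 1 nautical mile ≈ 1.150779 statute miles

def convert_miles(miles: float) -> tuple[float, float]:
    """Return (kilometers, nautical miles) for a distance in statute miles."""
    return miles * MILES_TO_KM, miles * MILES_TO_NAUTICAL

km, nm = convert_miles(1889.892)
print(round(km), round(nm))  # rounds to the 3041 km / 1642 nm quoted above
```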


Distance from West Palm Beach to Natashquan

There are several ways to calculate the distance from West Palm Beach to Natashquan. Here are two standard methods:

Vincenty's formula (applied above)
  • 1889.892 miles
  • 3041.487 kilometers
  • 1642.271 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
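As a sketch of how this works, here is a self-contained implementation of Vincenty's iterative inverse method on the WGS-84 ellipsoid. The ellipsoid parameters and convergence tolerance are standard choices, assumed here rather than taken from the calculator itself:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in meters between two lat/lon points (WGS-84)."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    # Reduced latitudes and longitude difference
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    # Final ellipsoidal correction terms
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)
```

Fed the PBI and YNA coordinates listed in the airport information below, this returns a distance within a couple of kilometers of the 3041.487 km quoted above.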

Haversine formula
  • 1891.045 miles
  • 3043.342 kilometers
  • 1643.273 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
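The haversine formula is much shorter to implement. A minimal sketch, using the common mean Earth radius of 6371 km (the radius the calculator uses is assumed, not stated):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    h = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))
```

With the PBI and YNA coordinates from the airport information below, this lands within a couple of kilometers of the 3043.342 km quoted above; the small gap versus the Vincenty figure reflects the spherical-Earth assumption.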

How long does it take to fly from West Palm Beach to Natashquan?

The estimated flight time from Palm Beach International Airport to Natashquan Airport is 4 hours and 4 minutes.
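The calculator's exact flight-time model is not published. A common rule of thumb is a fixed allowance for taxi, climb, and descent plus cruise time at a typical jet speed; both numbers below are assumptions, so the result is close to, but not exactly, the quoted 4 hours 4 minutes:

```python
def estimate_flight_time_hours(distance_miles: float,
                               cruise_mph: float = 500.0,
                               overhead_hours: float = 0.5) -> float:
    """Rough flight-time estimate: fixed overhead plus cruise time.
    The 500 mph cruise speed and 30-minute allowance are assumptions,
    not the calculator's published model."""
    return overhead_hours + distance_miles / cruise_mph

hours = estimate_flight_time_hours(1890)
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"about {h} hours {m} minutes")
```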

What is the time difference between West Palm Beach and Natashquan?

There is no time difference between West Palm Beach and Natashquan.

Flight carbon footprint between Palm Beach International Airport (PBI) and Natashquan Airport (YNA)

On average, flying from West Palm Beach to Natashquan generates about 207 kg of CO2 per passenger (roughly 457 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from West Palm Beach to Natashquan

See the map of the shortest flight path between Palm Beach International Airport (PBI) and Natashquan Airport (YNA).

Airport information

Origin Palm Beach International Airport
City: West Palm Beach, FL
Country: United States
IATA Code: PBI
ICAO Code: KPBI
Coordinates: 26°40′59″N, 80°5′44″W
Destination Natashquan Airport
City: Natashquan
Country: Canada
IATA Code: YNA
ICAO Code: CYNA
Coordinates: 50°11′23″N, 61°47′21″W
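The coordinates above are in degrees-minutes-seconds form, while the distance formulas expect signed decimal degrees. A small parsing sketch (the format string matches the notation used on this page; other sources may use different symbols):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert e.g. '26°40′59″N' to signed decimal degrees."""
    m = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    if m is None:
        raise ValueError(f"unrecognized coordinate: {dms!r}")
    deg, minutes, seconds, hemi = m.groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    # South and West hemispheres are negative by convention
    return -value if hemi in "SW" else value

print(dms_to_decimal("26°40′59″N"), dms_to_decimal("80°5′44″W"))
```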