
How far is North Spirit Lake from Gibraltar?

The distance between Gibraltar (Gibraltar International Airport) and North Spirit Lake (North Spirit Lake Airport) is 4209 miles / 6774 kilometers / 3658 nautical miles.

Gibraltar International Airport – North Spirit Lake Airport

  • 4209 miles
  • 6774 kilometers
  • 3658 nautical miles


Distance from Gibraltar to North Spirit Lake

There are several ways to calculate the distance from Gibraltar to North Spirit Lake. Here are two standard methods:

Vincenty's formula (applied above)
  • 4208.994 miles
  • 6773.719 kilometers
  • 3657.515 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
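The iterative inverse Vincenty solution can be sketched in Python roughly as follows. This is a minimal illustration on the WGS-84 ellipsoid, not the calculator's actual implementation; the airport coordinates are taken from the table at the end of this page.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty solution on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):           # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)   # equatorial-line special case
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                               * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# GIB (36°9′4″N, 5°20′58″W) to YNO (52°29′24″N, 92°58′15″W)
gib = (36 + 9/60 + 4/3600, -(5 + 20/60 + 58/3600))
yno = (52 + 29/60 + 24/3600, -(92 + 58/60 + 15/3600))
km = vincenty_distance(*gib, *yno) / 1000   # ≈ 6774 km
```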

Haversine formula
  • 4198.630 miles
  • 6757.039 kilometers
  • 3648.509 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface).

How long does it take to fly from Gibraltar to North Spirit Lake?

The estimated flight time from Gibraltar International Airport to North Spirit Lake Airport is 8 hours and 28 minutes.
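The page does not state how this estimate is derived, but dividing the quoted distance by the quoted block time gives the average speed the estimate implies, which is a quick sanity check:

```python
distance_miles = 4209
hours, minutes = 8, 28                          # estimated flight time above
block_time_h = hours + minutes / 60             # ≈ 8.47 h
implied_speed = distance_miles / block_time_h   # ≈ 497 mph average
```

An implied average of roughly 497 mph is plausible for a long-haul jet once climb and descent are averaged in.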

Flight carbon footprint between Gibraltar International Airport (GIB) and North Spirit Lake Airport (YNO)

On average, flying from Gibraltar to North Spirit Lake generates about 483 kg (1,064 lb) of CO2 per passenger. These figures are estimates and account only for the CO2 generated by burning jet fuel.
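The kilogram-to-pound conversion above follows directly from the international pound's definition:

```python
KG_PER_LB = 0.45359237          # exact by definition of the international pound
co2_kg = 483
co2_lb = co2_kg / KG_PER_LB     # ≈ 1064.8 lb
```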

Map of flight path from Gibraltar to North Spirit Lake

See the map of the shortest flight path between Gibraltar International Airport (GIB) and North Spirit Lake Airport (YNO).

Airport information

Origin Gibraltar International Airport
City: Gibraltar
Country: Gibraltar
IATA Code: GIB
ICAO Code: LXGB
Coordinates: 36°9′4″N, 5°20′58″W
Destination North Spirit Lake Airport
City: North Spirit Lake
Country: Canada
IATA Code: YNO
ICAO Code: CKQ3
Coordinates: 52°29′24″N, 92°58′15″W