How far is Gander from Lahore?

The distance between Lahore (Allama Iqbal International Airport) and Gander (Gander International Airport) is 6066 miles / 9762 kilometers / 5271 nautical miles.

Allama Iqbal International Airport – Gander International Airport

Distance: 6066 miles / 9762 kilometers / 5271 nautical miles
Flight time: 11 h 59 min
Time difference: 8 h 30 min
CO2 emission: 726 kg

Distance from Lahore to Gander

There are several ways to calculate the distance from Lahore to Gander. Here are two standard methods:

Vincenty's formula (applied above)
  • 6065.581 miles
  • 9761.606 kilometers
  • 5270.845 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
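
Vincenty's inverse method has no closed form; it iterates on the longitude difference of the geodesic on an auxiliary sphere. Below is a minimal Python sketch assuming the WGS-84 ellipsoid and a convergence tolerance of 1e-12 (the page does not say which ellipsoid or implementation it uses); with those assumptions the result should land very close to the 9761.606 km quoted above.

```python
import math

# WGS-84 ellipsoid constants. An assumption: the page does not say which
# ellipsoid its Vincenty calculation uses, but WGS-84 is the usual choice.
A = 6378137.0          # semi-major axis, meters
F = 1 / 298.257223563  # flattening
B = (1 - F) * A        # semi-minor axis, meters

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal (geodesic) distance in meters; inputs in decimal degrees."""
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere.
    u1 = math.atan((1 - F) * math.tan(math.radians(lat1)))
    u2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    sin_u1, cos_u1 = math.sin(u1), math.cos(u1)
    sin_u2, cos_u2 = math.sin(u2), math.cos(u2)

    lam = L  # first guess: spherical longitude difference
    for _ in range(max_iter):  # may fail to converge for near-antipodal points
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cos_u2 * sin_lam,
                               cos_u1 * sin_u2 - sin_u1 * cos_u2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sin_u1 * sin_u2 + cos_u1 * cos_u2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cos_u1 * cos_u2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sin_u1 * sin_u2 / cos2_alpha
                   if cos2_alpha else 0.0)  # equatorial geodesic
        c = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - c) * F * sin_alpha * (
            sigma + c * sin_sigma * (
                cos_2sm + c * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    usq = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    big_a = 1 + usq / 16384 * (4096 + usq * (-768 + usq * (320 - 175 * usq)))
    big_b = usq / 1024 * (256 + usq * (-128 + usq * (74 - 47 * usq)))
    d_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - big_b / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return B * big_a * (sigma - d_sigma)

# LHE and YQX coordinates from the airport information section below.
lhe = (31 + 31/60 + 17/3600, 74 + 24/60 + 12/3600)
yqx = (48 + 56/60 + 12/3600, -(54 + 34/60 + 5/3600))
print(f"{vincenty_inverse(*lhe, *yqx) / 1000:.3f} km")  # ≈ 9761.6 km
```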

Haversine formula
  • 6052.015 miles
  • 9739.773 kilometers
  • 5259.057 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest path between two points along the surface of the sphere).
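
The haversine computation is a short closed form; a small Python sketch follows. The 6371 km mean Earth radius is an assumption on my part, since the page does not state which radius it uses.

```python
import math

R_KM = 6371.0  # mean Earth radius; an assumption, the page does not state it

def haversine(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers; inputs in decimal degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)
    h = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
    return 2 * R_KM * math.asin(math.sqrt(h))

print(f"{haversine(31.521389, 74.403333, 48.936667, -54.568056):.3f} km")
# ≈ 9739.7 km; matches the figure above to within a kilometer or so,
# depending on the exact Earth radius assumed
```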

How long does it take to fly from Lahore to Gander?

The estimated flight time from Allama Iqbal International Airport to Gander International Airport is 11 hours and 59 minutes.
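
A figure like this is typically distance divided by an assumed average speed, plus a fixed allowance for taxi, climb, and descent. The parameters in the sketch below are illustrative assumptions (the calculator does not publish the values it uses) that happen to reproduce the 11 h 59 min above.

```python
# Illustrative only: assumed cruise speed and taxi/climb/descent allowance,
# not the calculator's published parameters.
CRUISE_MPH = 528
ALLOWANCE_H = 0.5

hours = 6066 / CRUISE_MPH + ALLOWANCE_H
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # 11 h 59 min
```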

Flight carbon footprint between Allama Iqbal International Airport (LHE) and Gander International Airport (YQX)

On average, flying from Lahore to Gander generates about 726 kg of CO2 per passenger, which is roughly 1,600 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
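
The pound figure checks out: one avoirdupois pound is defined as exactly 0.45359237 kg, so:

```python
KG_PER_LB = 0.45359237               # exact definition of the pound
print(f"{726 / KG_PER_LB:.1f} lbs")  # 1600.6, i.e. roughly 1,600 lbs
```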

Map of flight path from Lahore to Gander

See the map of the shortest flight path between Allama Iqbal International Airport (LHE) and Gander International Airport (YQX).

Airport information

Origin: Allama Iqbal International Airport
City: Lahore
Country: Pakistan
IATA Code: LHE
ICAO Code: OPLA
Coordinates: 31°31′17″N, 74°24′12″E
Destination: Gander International Airport
City: Gander
Country: Canada
IATA Code: YQX
ICAO Code: CYQX
Coordinates: 48°56′12″N, 54°34′5″W
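
The distance formulas above take decimal degrees, while the coordinates here are given in degrees, minutes, and seconds. A small helper to convert between the two (the function name and parsing are illustrative):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Parse a coordinate like '31°31′17″N' into decimal degrees."""
    deg, minute, sec, hemi = re.fullmatch(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    return -value if hemi in "SW" else value

print(dms_to_decimal("31°31′17″N"))  # ≈ 31.5214  (LHE latitude)
print(dms_to_decimal("54°34′5″W"))   # ≈ -54.5681 (YQX longitude)
```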