
How far is Blanc-Sablon from Lahore?

The distance between Lahore (Allama Iqbal International Airport) and Blanc-Sablon (Lourdes-de-Blanc-Sablon Airport) is 6010 miles / 9671 kilometers / 5222 nautical miles.

Allama Iqbal International Airport – Lourdes-de-Blanc-Sablon Airport

  • 6010 miles
  • 9671 kilometers
  • 5222 nautical miles


Distance from Lahore to Blanc-Sablon

There are several ways to calculate the distance from Lahore to Blanc-Sablon. Here are two standard methods:

Vincenty's formula (applied above)
  • 6009.517 miles
  • 9671.380 kilometers
  • 5222.128 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
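As a sketch of how such a figure can be computed, here is a standard implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The airport coordinates come from the airport information section below; the function name and iteration tolerance are this example's own choices, not necessarily what the calculator uses.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance in metres between two points (WGS-84)."""
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(phi1))
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); zero for equatorial lines where cos2_alpha == 0
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# LHE (31°31′17″N, 74°24′12″E) to YBX (51°26′36″N, 57°11′7″W)
d_km = vincenty_inverse(31.521389, 74.403333, 51.443333, -57.185278) / 1000
```

Unlike the haversine formula below, Vincenty's method iterates until the longitude difference on the auxiliary sphere converges, which is why it captures the Earth's flattening.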

Haversine formula
  • 5995.944 miles
  • 9649.536 kilometers
  • 5210.333 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
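The haversine calculation is compact enough to show in full. This is a minimal sketch using a mean Earth radius of 6371 km (a common convention; the site does not state which radius it uses), with the airport coordinates from the section below.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    h = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# LHE (31°31′17″N, 74°24′12″E) to YBX (51°26′36″N, 57°11′7″W)
d_km = haversine_km(31.521389, 74.403333, 51.443333, -57.185278)
```

The spherical assumption is why this result (≈9650 km) differs from the Vincenty figure by about 22 km, roughly 0.2%.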

How long does it take to fly from Lahore to Blanc-Sablon?

The estimated flight time from Allama Iqbal International Airport to Lourdes-de-Blanc-Sablon Airport is 11 hours and 52 minutes.
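The calculator does not publish its timing model, but a flight-time estimate of this kind typically divides the distance by an assumed average block speed. The 815 km/h figure below is an assumption chosen for illustration (it reproduces the quoted 11 h 52 min for 9671 km), not a documented parameter of the site.

```python
def flight_time(distance_km, avg_speed_kmh=815.0):
    """Estimate flight time as (hours, minutes) from an assumed average speed."""
    hours = distance_km / avg_speed_kmh
    h = int(hours)
    m = round((hours - h) * 60)
    if m == 60:                 # guard against rounding up to a full hour
        h, m = h + 1, 0
    return h, m

h, m = flight_time(9671)        # distance from the Vincenty figure above
```

Real schedules add taxi time and padding, so published block times are usually longer than a pure distance/speed estimate.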

Flight carbon footprint between Allama Iqbal International Airport (LHE) and Lourdes-de-Blanc-Sablon Airport (YBX)

On average, flying from Lahore to Blanc-Sablon generates about 718 kg of CO2 per passenger, which is roughly 1,583 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
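The figures above imply a per-passenger emission factor of about 718 kg / 6010 miles ≈ 0.12 kg CO2 per mile. The sketch below just rescales that implied factor and converts kilograms to pounds; the underlying fuel-burn model the site uses is not published.

```python
KG_CO2_PER_MILE = 718 / 6010        # implied per-passenger factor from the figures above
LBS_PER_KG = 2.20462                # standard kg-to-pound conversion

def co2_kg(distance_miles):
    """Per-passenger CO2 estimate (kg) using the implied emission factor."""
    return distance_miles * KG_CO2_PER_MILE

def kg_to_lbs(kg):
    return kg * LBS_PER_KG
```

In practice, per-mile emissions are not constant: short flights spend proportionally more fuel on takeoff and climb, so a single linear factor is only a rough approximation.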

Map of flight path from Lahore to Blanc-Sablon

See the map of the shortest flight path between Allama Iqbal International Airport (LHE) and Lourdes-de-Blanc-Sablon Airport (YBX).

Airport information

Origin Allama Iqbal International Airport
City: Lahore
Country: Pakistan
IATA Code: LHE
ICAO Code: OPLA
Coordinates: 31°31′17″N, 74°24′12″E
Destination Lourdes-de-Blanc-Sablon Airport
City: Blanc-Sablon
Country: Canada
IATA Code: YBX
ICAO Code: CYBX
Coordinates: 51°26′36″N, 57°11′7″W