How far is Round Lake from Lahore?

The distance between Lahore (Allama Iqbal International Airport) and Round Lake (Round Lake (Weagamow Lake) Airport) is 6552 miles / 10545 kilometers / 5694 nautical miles.

Distance from Lahore to Round Lake

There are several ways to calculate the distance from Lahore to Round Lake. Here are two standard methods:

Vincenty's formula (applied above)
  • 6552.388 miles
  • 10545.047 kilometers
  • 5693.870 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
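
As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name is our own, and the coordinates are the LHE and ZRJ values from the airport information below, converted to decimal degrees; this is a sketch of the general method, not the site's exact implementation.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance in meters between two points, via Vincenty's inverse
    formula on the WGS-84 ellipsoid."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    # Reduced latitudes and longitude difference, in radians
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero for equatorial lines (cos2_alpha == 0)
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# LHE and ZRJ in decimal degrees (from the airport information below)
meters = vincenty_distance(31.5214, 74.4033, 52.9433, -91.3128)
print(meters / 1609.344)   # ≈ 6552 miles
```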

Haversine formula
  • 6537.713 miles
  • 10521.430 kilometers
  • 5681.118 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (the great-circle distance, i.e. the shortest path between two points along the surface of a sphere).
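
For comparison, a minimal haversine sketch, assuming a mean Earth radius of 6371 km (the radius the calculator uses is not stated, so the final digits may differ slightly):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

print(haversine_distance(31.5214, 74.4033, 52.9433, -91.3128))  # ≈ 10521 km
```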

How long does it take to fly from Lahore to Round Lake?

The estimated flight time from Allama Iqbal International Airport to Round Lake (Weagamow Lake) Airport is 12 hours and 54 minutes.
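
The page does not state the speed assumption behind this estimate; dividing the Vincenty distance by the quoted time implies an average block speed of roughly 508 mph, as the quick check below shows (a back-of-the-envelope sketch, not the site's actual formula).

```python
# Hypothetical check: the average speed implied by the quoted flight time.
distance_miles = 6552.388            # Vincenty distance from above
hours, minutes = 12, 54              # quoted flight-time estimate
block_hours = hours + minutes / 60   # 12.9 h
print(distance_miles / block_hours)  # ≈ 508 mph implied average speed
```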

Flight carbon footprint between Allama Iqbal International Airport (LHE) and Round Lake (Weagamow Lake) Airport (ZRJ)

On average, flying from Lahore to Round Lake generates about 792 kg of CO2 per passenger, equivalent to roughly 1,747 pounds (lbs). These figures are estimates and cover only the CO2 produced by burning jet fuel.
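
Dividing the quoted 792 kg by the 10545 km route length implies an emission factor of roughly 0.075 kg of CO2 per passenger-kilometer. The sketch below rebuilds the estimate from that factor; the factor is an assumption reverse-engineered from the figures above, not a published value.

```python
# Emission factor inferred from the figures above (not an official value).
KG_CO2_PER_PAX_KM = 792 / 10545          # ≈ 0.0751 kg CO2 per passenger-km

def co2_per_passenger_kg(distance_km: float) -> float:
    """Rough per-passenger CO2 estimate covering jet-fuel burn only."""
    return distance_km * KG_CO2_PER_PAX_KM

kg = co2_per_passenger_kg(10545)
print(kg)            # 792.0 kg
print(kg * 2.20462)  # ≈ 1746 lb; the page quotes 1,747, presumably rounded
                     # from an unrounded kilogram value
```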

Map of flight path from Lahore to Round Lake

See the map of the shortest flight path between Allama Iqbal International Airport (LHE) and Round Lake (Weagamow Lake) Airport (ZRJ).

Airport information

Origin: Allama Iqbal International Airport
City: Lahore
Country: Pakistan
IATA Code: LHE
ICAO Code: OPLA
Coordinates: 31°31′17″N, 74°24′12″E
Destination: Round Lake (Weagamow Lake) Airport
City: Round Lake
Country: Canada
IATA Code: ZRJ
ICAO Code: CZRJ
Coordinates: 52°56′36″N, 91°18′46″W
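
The coordinates above are given in degrees, minutes, and seconds; a small helper (name and signature our own) converts them to the decimal degrees used in the sketches earlier:

```python
def dms_to_decimal(degrees: int, minutes: int, seconds: int, hemisphere: str) -> float:
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in "SW" else value

# LHE: 31°31′17″N, 74°24′12″E  ->  (31.5214, 74.4033)
print(dms_to_decimal(31, 31, 17, "N"), dms_to_decimal(74, 24, 12, "E"))
# ZRJ: 52°56′36″N, 91°18′46″W  ->  (52.9433, -91.3128)
print(dms_to_decimal(52, 56, 36, "N"), dms_to_decimal(91, 18, 46, "W"))
```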