
How far is Lankaran from Toronto?

The distance between Toronto (Toronto Pearson International Airport) and Lankaran (Lankaran International Airport) is 5910 miles / 9511 kilometers / 5136 nautical miles.

Toronto Pearson International Airport – Lankaran International Airport


Distance from Toronto to Lankaran

There are several ways to calculate the distance from Toronto to Lankaran. Here are two standard methods:

Vincenty's formula (applied above)
  • 5910.018 miles
  • 9511.253 kilometers
  • 5135.666 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
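As a rough illustration of how such a figure can be reproduced, here is a minimal sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name is arbitrary, and the airport coordinates are decimal-degree conversions (rounded) of the values in the airport table below, so the result will differ slightly from the published figure.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# YYZ and LLK, approximate decimal degrees from the airport table
print(vincenty_distance(43.6772, -79.6306, 38.7464, 48.8178) / 1000)
```

Vincenty's method iterates to sub-millimeter precision on the ellipsoid, which is why its figure differs by about 23 km from the spherical haversine result over a route this long.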

Haversine formula
  • 5895.499 miles
  • 9487.887 kilometers
  • 5123.049 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface).
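The haversine calculation is compact enough to sketch directly. The function name and the 6,371 km mean earth radius are assumptions (not necessarily what this site uses), and the coordinates are rounded from the airport table below, so expect a result close to, but not exactly matching, the figure above.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# YYZ and LLK, approximate decimal degrees from the airport table
print(haversine_distance(43.6772, -79.6306, 38.7464, 48.8178))
```

The small gap between this and the Vincenty figure reflects the spherical-earth simplification, plus any difference in the assumed earth radius and coordinate rounding.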

How long does it take to fly from Toronto to Lankaran?

The estimated flight time from Toronto Pearson International Airport to Lankaran International Airport is 11 hours and 41 minutes.
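The site does not state how it derives flight time, but a common rule of thumb, cruise time at a typical jet airliner speed plus a fixed allowance for climb, descent, and taxi, lands very close to the published figure. The speed and overhead values below are assumptions, not the site's formula.

```python
def estimate_flight_time(distance_km, cruise_kmh=850, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed
    climb/descent/taxi overhead. Both parameters are assumptions."""
    total_min = distance_km / cruise_kmh * 60 + overhead_min
    return divmod(round(total_min), 60)   # (hours, minutes)

hours, minutes = estimate_flight_time(9511)
print(f"{hours} h {minutes} min")        # 11 h 41 min with these assumptions
```

With a 9,511 km distance, an 850 km/h cruise, and 30 minutes of overhead, this reproduces the 11 hours 41 minutes quoted above, which suggests the site uses a formula of this general shape.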

Flight carbon footprint between Toronto Pearson International Airport (YYZ) and Lankaran International Airport (LLK)

On average, flying from Toronto to Lankaran generates about 704 kg of CO2 per passenger, which is roughly 1,553 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path from Toronto to Lankaran

See the map of the shortest flight path between Toronto Pearson International Airport (YYZ) and Lankaran International Airport (LLK).

Airport information

Origin: Toronto Pearson International Airport
City: Toronto
Country: Canada
IATA Code: YYZ
ICAO Code: CYYZ
Coordinates: 43°40′37″N, 79°37′50″W
Destination: Lankaran International Airport
City: Lankaran
Country: Azerbaijan
IATA Code: LLK
ICAO Code: UBBL
Coordinates: 38°44′47″N, 48°49′4″E