
How far is Windsor from Lusaka?

The distance between Lusaka (Kenneth Kaunda International Airport) and Windsor (Windsor International Airport) is 8015 miles / 12898 kilometers / 6965 nautical miles.

Kenneth Kaunda International Airport – Windsor International Airport

Distance: 8015 miles / 12898 kilometers / 6965 nautical miles
Flight time: 15 h 40 min
CO2 emission: 1 001 kg


Distance from Lusaka to Windsor

There are several ways to calculate the distance from Lusaka to Windsor. Here are two standard methods:

Vincenty's formula (applied above)
  • 8014.706 miles
  • 12898.419 kilometers
  • 6964.589 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
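As a sketch of how such an ellipsoidal distance can be computed, here is Vincenty's inverse method in Python, assuming WGS-84 ellipsoid parameters (the page does not state which ellipsoid it uses, so WGS-84 is an assumption) and the airport coordinates listed below:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2,
                a=6378137.0, f=1 / 298.257223563):
    """Geodesic distance on the WGS-84 ellipsoid (Vincenty inverse)."""
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.sqrt((cosU2 * sinLam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break
    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma) / 1000.0  # meters -> kilometers

# LUN: 15°19′50″S, 28°27′9″E   -> (-15.33056, 28.45250)
# YQG: 42°16′32″N, 82°57′20″W  -> (42.27556, -82.95556)
d_km = vincenty_km(-15.33056, 28.45250, 42.27556, -82.95556)
print(f"{d_km:.1f} km")  # close to the 12898.419 km quoted above
```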

Haversine formula
  • 8014.621 miles
  • 12898.283 kilometers
  • 6964.515 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the surface of the sphere.
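The haversine calculation above can be sketched in a few lines of Python. The mean earth radius of 6371 km is an assumption (the page does not state its constants), but it reproduces the quoted figure closely:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere (haversine formula)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return radius_km * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# LUN: 15°19′50″S, 28°27′9″E   -> (-15.33056, 28.45250)
# YQG: 42°16′32″N, 82°57′20″W  -> (42.27556, -82.95556)
dist_km = haversine_km(-15.33056, 28.45250, 42.27556, -82.95556)
print(f"{dist_km:.1f} km")  # ≈ 12898 km, matching the figure above
```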

How long does it take to fly from Lusaka to Windsor?

The estimated flight time from Kenneth Kaunda International Airport to Windsor International Airport is 15 hours and 40 minutes.
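That duration, combined with the distance above, implies an average block speed, which is a quick way to sanity-check the estimate (the speed figure below is derived here, not stated by the source):

```python
distance_mi = 8015      # Lusaka–Windsor distance from above
hours = 15 + 40 / 60    # 15 h 40 min as decimal hours
avg_speed_mph = distance_mi / hours
print(f"{avg_speed_mph:.0f} mph")  # ≈ 512 mph, a plausible jet block speed
```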

Flight carbon footprint between Kenneth Kaunda International Airport (LUN) and Windsor International Airport (YQG)

On average, flying from Lusaka to Windsor generates about 1 001 kg of CO2 per passenger; 1 001 kilograms equals 2 206 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
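The kilogram-to-pound conversion can be checked directly; the factor 2.20462 lb/kg is the standard conversion (the page's exact rounding convention is not stated):

```python
co2_kg = 1001              # estimated CO2 per passenger, from above
LB_PER_KG = 2.20462        # pounds per kilogram
co2_lb = co2_kg * LB_PER_KG
print(round(co2_lb))       # ≈ 2207 lbs; the page rounds down to 2 206
```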

Map of flight path from Lusaka to Windsor

See the map of the shortest flight path between Kenneth Kaunda International Airport (LUN) and Windsor International Airport (YQG).

Airport information

Origin: Kenneth Kaunda International Airport
City: Lusaka
Country: Zambia
IATA Code: LUN
ICAO Code: FLLK
Coordinates: 15°19′50″S, 28°27′9″E
Destination: Windsor International Airport
City: Windsor
Country: Canada
IATA Code: YQG
ICAO Code: CYQG
Coordinates: 42°16′32″N, 82°57′20″W