
How far is Cranbrook from Leipzig?

The distance between Leipzig (Leipzig/Halle Airport) and Cranbrook (Cranbrook/Canadian Rockies International Airport) is 4832 miles / 7776 kilometers / 4199 nautical miles.

Leipzig/Halle Airport – Cranbrook/Canadian Rockies International Airport

4832 miles / 7776 kilometers / 4199 nautical miles


Distance from Leipzig to Cranbrook

There are several ways to calculate the distance from Leipzig to Cranbrook. Here are two standard methods:

Vincenty's formula (applied above)
  • 4831.668 miles
  • 7775.816 kilometers
  • 4198.605 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
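For illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates are converted from the DMS values in the airport information section below; the iteration limit and convergence tolerance are assumptions, not the site's exact settings.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Ellipsoidal (WGS-84) distance in statute miles via Vincenty's inverse formula."""
    a = 6378137.0              # semi-major axis in metres
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis in metres

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = math.sin(U1), math.cos(U1), math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate until the longitude difference converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam, cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma) / 1609.344  # metres to statute miles

# LEJ and YXC in decimal degrees (from the airport coordinates listed below)
print(vincenty_miles(51.4322, 12.2414, 49.6106, -115.7819))  # ≈ 4832 miles
```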

Haversine formula
  • 4816.889 miles
  • 7752.031 kilometers
  • 4185.762 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
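A compact Python sketch of the same calculation, assuming the conventional mean Earth radius of 6371 km (the site's exact radius may differ slightly):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in statute miles on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h)) / 1.609344  # km to statute miles

print(haversine_miles(51.4322, 12.2414, 49.6106, -115.7819))  # ≈ 4817 miles
```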

How long does it take to fly from Leipzig to Cranbrook?

The estimated flight time from Leipzig/Halle Airport to Cranbrook/Canadian Rockies International Airport is 9 hours and 38 minutes.
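As a rough cross-check (not necessarily the site's exact method), dividing the straight-line distance by an assumed average ground speed of about 500 mph lands close to the quoted figure:

```python
distance_miles = 4831.668   # Vincenty distance from above
cruise_mph = 500            # assumed average ground speed, not a published parameter
hours = distance_miles / cruise_mph
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # -> 9 h 40 min, near the 9 h 38 min estimate
```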

Flight carbon footprint between Leipzig/Halle Airport (LEJ) and Cranbrook/Canadian Rockies International Airport (YXC)

On average, flying from Leipzig to Cranbrook generates about 562 kg of CO2 per passenger, which is equivalent to 1,239 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
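The pound figure follows directly from the kilogram estimate (1 kg ≈ 2.20462 lb):

```python
co2_kg = 562
print(round(co2_kg * 2.20462))  # -> 1239 lbs
```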

Map of flight path from Leipzig to Cranbrook

See the map of the shortest flight path between Leipzig/Halle Airport (LEJ) and Cranbrook/Canadian Rockies International Airport (YXC).

Airport information

Origin: Leipzig/Halle Airport
City: Leipzig
Country: Germany
IATA Code: LEJ
ICAO Code: EDDP
Coordinates: 51°25′56″N, 12°14′29″E
Destination: Cranbrook/Canadian Rockies International Airport
City: Cranbrook
Country: Canada
IATA Code: YXC
ICAO Code: CYXC
Coordinates: 49°36′38″N, 115°46′55″W
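The coordinates above are given in degrees, minutes, and seconds, while the distance sketches earlier expect decimal degrees. A small helper (hypothetical, not part of the site) for the conversion:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

lej = (dms_to_decimal(51, 25, 56, "N"), dms_to_decimal(12, 14, 29, "E"))    # ≈ (51.4322, 12.2414)
yxc = (dms_to_decimal(49, 36, 38, "N"), dms_to_decimal(115, 46, 55, "W"))   # ≈ (49.6106, -115.7819)
```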