How far is Cranbrook from Liverpool?
The distance between Liverpool (Liverpool John Lennon Airport) and Cranbrook (Cranbrook/Canadian Rockies International Airport) is 4339 miles / 6983 kilometers / 3770 nautical miles.
Liverpool John Lennon Airport – Cranbrook/Canadian Rockies International Airport
Distance from Liverpool to Cranbrook
There are several ways to calculate the distance from Liverpool to Cranbrook. Here are two standard methods:
Vincenty's formula (applied above)
- 4338.896 miles
- 6982.776 kilometers
- 3770.398 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
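A minimal sketch of an ellipsoidal distance calculation in Python, assuming the third-party geopy library is available. Its geodesic() function uses Karney's algorithm on the WGS-84 ellipsoid, which agrees with Vincenty's formula to well under a metre over a route of this length. The coordinates are taken from the airport information tables further down, converted to decimal degrees.

```python
from geopy.distance import geodesic

# Airport coordinates in decimal degrees (from the airport information below)
lpl = (53.3333, -2.8494)    # Liverpool John Lennon Airport (LPL)
yxc = (49.6106, -115.7819)  # Cranbrook/Canadian Rockies International Airport (YXC)

# Ellipsoidal (WGS-84) distance, comparable to the Vincenty figures above
d = geodesic(lpl, yxc)
print(f"{d.miles:.1f} mi / {d.km:.1f} km / {d.nautical:.1f} nmi")
# Roughly 4339 mi / 6983 km / 3770 nmi; tiny differences come from coordinate rounding
```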
Haversine formula
- 4325.343 miles
- 6960.966 kilometers
- 3758.621 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
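For comparison, here is a minimal sketch of the haversine calculation using only the Python standard library; the 6371 km mean Earth radius is a common convention, and the coordinates are the same rounded values as above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two latitude/longitude points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(53.3333, -2.8494, 49.6106, -115.7819)
print(f"{km:.0f} km / {km * 0.621371:.0f} mi / {km * 0.539957:.0f} nmi")
# Close to the haversine figures above (6961 km / 4325 mi / 3759 nmi);
# small differences come from coordinate rounding
```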
How long does it take to fly from Liverpool to Cranbrook?
The estimated flight time from Liverpool John Lennon Airport to Cranbrook/Canadian Rockies International Airport is 8 hours and 42 minutes.
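The page does not state how this estimate is derived, but such figures are commonly produced by dividing the distance by an assumed average speed of about 500 mph. The sketch below uses that assumption and lands within a minute or two of the quoted time; real schedules also account for taxi time, winds, and routing.

```python
# Rough flight-time estimate: distance divided by an assumed 500 mph average speed
distance_miles = 4338.896          # Vincenty distance from the section above
avg_speed_mph = 500                # illustrative assumption, not the site's stated method

hours = distance_miles / avg_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"Estimated flight time: {h} h {m} min")   # about 8 h 41 min
```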
What is the time difference between Liverpool and Cranbrook?
The time difference between Liverpool and Cranbrook is 7 hours. Cranbrook is 7 hours behind Liverpool.
Flight carbon footprint between Liverpool John Lennon Airport (LPL) and Cranbrook/Canadian Rockies International Airport (YXC)
On average, flying from Liverpool to Cranbrook generates about 499 kg of CO2 per passenger, which is roughly 1,100 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
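Per-passenger CO2 estimates like this are typically computed as distance multiplied by an emission factor. The sketch below is illustrative only: the factor of about 0.0715 kg CO2 per passenger-kilometre is an assumption chosen to reproduce the figure above, not a published coefficient for this route.

```python
# Illustrative CO2 estimate: distance times an assumed per-passenger emission factor
distance_km = 6982.776             # Vincenty distance from the section above
kg_co2_per_pax_km = 0.0715         # assumed average factor, chosen to match the quoted figure

kg_co2 = distance_km * kg_co2_per_pax_km
print(f"{kg_co2:.0f} kg CO2 per passenger ({kg_co2 * 2.20462:.0f} lbs)")
# about 499 kg (roughly 1,100 lbs)
```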
Map of flight path from Liverpool to Cranbrook
See the map of the shortest flight path between Liverpool John Lennon Airport (LPL) and Cranbrook/Canadian Rockies International Airport (YXC).
Airport information
| Origin | Liverpool John Lennon Airport |
| --- | --- |
| City | Liverpool |
| Country | United Kingdom |
| IATA Code | LPL |
| ICAO Code | EGGP |
| Coordinates | 53°20′0″N, 2°50′58″W |
| Destination | Cranbrook/Canadian Rockies International Airport |
| --- | --- |
| City | Cranbrook |
| Country | Canada |
| IATA Code | YXC |
| ICAO Code | CYXC |
| Coordinates | 49°36′38″N, 115°46′55″W |