How far is Cranbrook from Marsh Harbour?
The distance between Marsh Harbour (Marsh Harbour Airport) and Cranbrook (Cranbrook/Canadian Rockies International Airport) is 2599 miles / 4182 kilometers / 2258 nautical miles.
Marsh Harbour Airport – Cranbrook/Canadian Rockies International Airport
Distance from Marsh Harbour to Cranbrook
There are several ways to calculate the distance from Marsh Harbour to Cranbrook. Here are two standard methods:
Vincenty's formula (applied above)
- 2598.539 miles
- 4181.943 kilometers
- 2258.068 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
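As a cross-check, the same ellipsoidal distance can be reproduced with the geopy library. This is a minimal sketch rather than the calculation used on this page: geopy's geodesic() applies Karney's algorithm on the WGS-84 ellipsoid instead of Vincenty's iteration, but the two agree to well under a metre over a route of this length.

```python
# Ellipsoidal (WGS-84) distance between MHH and YXC.
# geopy's geodesic() uses Karney's algorithm rather than Vincenty's
# iteration, but both model the ellipsoid and give nearly identical results.
from geopy.distance import geodesic

mhh = (26.5114, -77.0833)    # 26°30′41″N, 77°5′0″W in decimal degrees
yxc = (49.6106, -115.7819)   # 49°36′38″N, 115°46′55″W

d = geodesic(mhh, yxc)
print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} nmi")
# Expected to land very close to 2598.5 mi / 4181.9 km / 2258.1 nmi
```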
Haversine formula
- 2596.416 miles
- 4178.527 kilometers
- 2256.224 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
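The haversine figure can be reproduced in a few lines of Python. A minimal sketch, assuming a mean Earth radius of 6,371 km (the page does not state which radius it uses, so the result may differ from the figure above by a few kilometers):

```python
# Great-circle (haversine) distance, assuming a spherical Earth with
# mean radius 6371.0 km.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(26.5114, -77.0833, 49.6106, -115.7819)
print(f"{km:.1f} km / {km * 0.621371:.1f} mi / {km / 1.852:.1f} nmi")
```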
How long does it take to fly from Marsh Harbour to Cranbrook?
The estimated flight time from Marsh Harbour Airport to Cranbrook/Canadian Rockies International Airport is 5 hours and 25 minutes.
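The page does not say how this estimate is derived. A simple distance-over-speed model with an assumed average block speed of about 480 mph happens to land very close to the quoted figure; the speed is an illustrative assumption, not the site's stated methodology.

```python
# Rough flight-time estimate: great-circle distance divided by an assumed
# average block speed. The 480 mph value is an assumption chosen to
# illustrate the model, not a figure taken from this page.
distance_mi = 2598.539
block_speed_mph = 480.0

hours = distance_mi / block_speed_mph
h = int(hours)
m = round((hours - h) * 60)
print(f"Estimated flight time: {h} h {m} min")   # about 5 h 25 min
```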
What is the time difference between Marsh Harbour and Cranbrook?
The time difference between Marsh Harbour and Cranbrook is 2 hours. Cranbrook is on Mountain Time, 2 hours behind Marsh Harbour, which is on Eastern Time.
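The offset can be checked programmatically with Python's zoneinfo module. The zone names here are assumptions made for illustration: America/Nassau covers Marsh Harbour (Eastern Time), and America/Edmonton is used as a stand-in for Cranbrook's Mountain Time.

```python
# Compare the two cities' UTC offsets at the same instant.
# Zone names are assumptions: America/Nassau for Marsh Harbour and
# America/Edmonton as a stand-in for Cranbrook's Mountain Time.
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

now = datetime.now(timezone.utc)
marsh = now.astimezone(ZoneInfo("America/Nassau"))
cranbrook = now.astimezone(ZoneInfo("America/Edmonton"))

diff = (marsh.utcoffset() - cranbrook.utcoffset()).total_seconds() / 3600
print(f"Marsh Harbour is {diff:+.0f} h relative to Cranbrook")  # +2 h
```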
Flight carbon footprint between Marsh Harbour Airport (MHH) and Cranbrook/Canadian Rockies International Airport (YXC)
On average, flying from Marsh Harbour to Cranbrook generates about 287 kg of CO2 per passenger; 287 kilograms is equal to about 633 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
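The page does not publish its emission model. Back-solving from its own numbers, 287 kg over roughly 4,182 km implies about 69 g of CO2 per passenger-kilometre; the sketch below simply applies that implied factor and is not the site's actual methodology, which would depend on aircraft type, load factor, and fuel burn profile.

```python
# Per-passenger CO2 as (distance in km) x (emission factor in kg/km).
# The factor here is back-solved from the page's own 287 kg figure and is
# only illustrative; real estimates vary with aircraft and load factor.
distance_km = 4181.943
kg_co2_per_pax_km = 287 / 4181.943    # about 0.069 kg CO2 per passenger-km

co2_kg = distance_km * kg_co2_per_pax_km
print(f"Estimated CO2 per passenger: {co2_kg:.0f} kg")
```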
Map of flight path from Marsh Harbour to Cranbrook
See the map of the shortest flight path between Marsh Harbour Airport (MHH) and Cranbrook/Canadian Rockies International Airport (YXC).
Airport information
| Origin | Marsh Harbour Airport |
| --- | --- |
| City: | Marsh Harbour |
| Country: | Bahamas |
| IATA Code: | MHH |
| ICAO Code: | MYAM |
| Coordinates: | 26°30′41″N, 77°5′0″W |
| Destination | Cranbrook/Canadian Rockies International Airport |
| --- | --- |
| City: | Cranbrook |
| Country: | Canada |
| IATA Code: | YXC |
| ICAO Code: | CYXC |
| Coordinates: | 49°36′38″N, 115°46′55″W |