
How far is Prince Albert from Havana?

The distance between Havana (José Martí International Airport) and Prince Albert (Prince Albert (Glass Field) Airport) is 2416 miles / 3888 kilometers / 2099 nautical miles.

José Martí International Airport – Prince Albert (Glass Field) Airport

2416 miles / 3888 kilometers / 2099 nautical miles


Distance from Havana to Prince Albert

There are several ways to calculate the distance from Havana to Prince Albert. Here are two standard methods:

Vincenty's formula (applied above)
  • 2416.059 miles
  • 3888.269 kilometers
  • 2099.497 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
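
As a concrete illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The coordinates are the DMS values from the "Airport information" section converted to decimal degrees; the site's exact implementation and rounding are not published, so the last decimal places may differ slightly.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    # WGS-84 ellipsoid parameters
    a = 6378137.0             # semi-major axis (m)
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate the longitude difference to convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344  # meters -> statute miles

# HAV and YPA in decimal degrees (converted from the DMS coordinates
# listed under "Airport information" below)
print(vincenty_miles(22.9892, -82.4089, 53.2142, -105.6728))  # ≈ 2416 miles
```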

Haversine formula
  • 2417.698 miles
  • 3890.908 kilometers
  • 2100.922 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
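
A matching sketch of the haversine formula, assuming a mean Earth radius of 3,958.8 statute miles (about 6,371 km):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    R = 3958.8  # mean Earth radius in statute miles (assumed value)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

print(haversine_miles(22.9892, -82.4089, 53.2142, -105.6728))  # ≈ 2418 miles
```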

How long does it take to fly from Havana to Prince Albert?

The estimated flight time from José Martí International Airport to Prince Albert (Glass Field) Airport is 5 hours and 4 minutes.
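
The page does not say how this estimate is derived. A common rule of thumb adds a fixed allowance for taxi, climb, and descent to cruise time at an average jet speed; the sketch below uses assumed values (500 mph cruise, a 30-minute overhead), so it approximates rather than reproduces the figure above.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    # Both parameters are illustrative assumptions, not the site's actual
    # model: an average cruise speed and a fixed taxi/climb/descent allowance.
    minutes = overhead_min + distance_miles / cruise_mph * 60
    hours, mins = divmod(round(minutes), 60)
    return f"{hours} hours and {mins} minutes"

print(estimated_flight_time(2416.059))  # "5 hours and 20 minutes" with these assumptions
```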

Flight carbon footprint between José Martí International Airport (HAV) and Prince Albert (Glass Field) Airport (YPA)

On average, flying from Havana to Prince Albert generates about 265 kg of CO2 per passenger; 265 kilograms is equal to 585 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
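
The pound is defined as exactly 0.45359237 kg, so the unit conversion behind these figures can be checked directly. The per-mile factor at the end is inferred from this page's own numbers, not a published emission factor.

```python
KG_PER_LB = 0.45359237      # exact definition of the avoirdupois pound

co2_kg = 265                # per-passenger estimate quoted above
print(co2_kg / KG_PER_LB)   # ≈ 584.2 lbs, which the page rounds to 585

# Implied emission factor for this route (inferred, not an official value):
print(co2_kg / 2416.059)    # ≈ 0.11 kg of CO2 per passenger-mile
```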

Map of flight path from Havana to Prince Albert

See the map of the shortest flight path between José Martí International Airport (HAV) and Prince Albert (Glass Field) Airport (YPA).

Airport information

Origin: José Martí International Airport
City: Havana
Country: Cuba
IATA Code: HAV
ICAO Code: MUHA
Coordinates: 22°59′21″N, 82°24′32″W
Destination: Prince Albert (Glass Field) Airport
City: Prince Albert
Country: Canada
IATA Code: YPA
ICAO Code: CYPA
Coordinates: 53°12′51″N, 105°40′22″W
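
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on the page take decimal degrees. A minimal conversion sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # Southern and western hemispheres map to negative decimal degrees.
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# The airport coordinates listed above, ready for the distance formulas:
print(dms_to_decimal(22, 59, 21, "N"), dms_to_decimal(82, 24, 32, "W"))   # HAV ≈ 22.9892, -82.4089
print(dms_to_decimal(53, 12, 51, "N"), dms_to_decimal(105, 40, 22, "W"))  # YPA ≈ 53.2142, -105.6728
```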