
How far is Port Hope Simpson from Blackpool?

The distance between Blackpool (Blackpool Airport) and Port Hope Simpson (Port Hope Simpson Airport) is 2163 miles / 3481 kilometers / 1880 nautical miles.

Blackpool Airport – Port Hope Simpson Airport

Distance: 2163 miles / 3481 kilometers / 1880 nautical miles
Flight time: 4 h 35 min
Time difference: 3 h 30 min
CO2 emission: 236 kg


Distance from Blackpool to Port Hope Simpson

There are several ways to calculate the distance from Blackpool to Port Hope Simpson. Here are two standard methods:

Vincenty's formula (applied above)
  • 2163.215 miles
  • 3481.357 kilometers
  • 1879.783 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
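The site does not publish its implementation, but Vincenty's inverse method on the WGS-84 ellipsoid is the standard way to get figures like those above. A minimal Python sketch, using the airport coordinates from the table below (decimal degrees, west longitudes negative):

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in kilometers."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)  # equatorial line special case
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma *
                                     (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

blk = (53.771667, -3.028333)   # Blackpool (BLK)
yha = (52.528056, -56.285833)  # Port Hope Simpson (YHA)
print(round(vincenty_km(*blk, *yha), 3))
```

With these coordinates the result is about 3481.4 km, matching the ellipsoidal figure quoted above.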

Haversine formula
  • 2156.174 miles
  • 3470.025 kilometers
  • 1873.664 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
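The haversine computation is much shorter. A sketch using the same airport coordinates and a mean Earth radius of 6371 km (the radius the site uses is an assumption):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

blk = (53.771667, -3.028333)   # Blackpool (BLK)
yha = (52.528056, -56.285833)  # Port Hope Simpson (YHA)
print(round(haversine_km(*blk, *yha), 3))  # ≈ 3470 km
```

The spherical result comes out about 11 km shorter than Vincenty's ellipsoidal one, which is the gap visible in the two lists above.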

How long does it take to fly from Blackpool to Port Hope Simpson?

The estimated flight time from Blackpool Airport to Port Hope Simpson Airport is 4 hours and 35 minutes.
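The site does not state how it derives this estimate. A common rule of thumb, a fixed allowance for takeoff and landing plus time at cruise speed, reproduces the quoted figure; the 30-minute overhead and roughly 530 mph cruise speed below are assumptions, not the calculator's actual parameters:

```python
def flight_time(distance_miles, cruise_mph=530, overhead_hours=0.5):
    """Rough block-time estimate: assumed fixed overhead plus cruise time."""
    total = overhead_hours + distance_miles / cruise_mph
    hours = int(total)
    minutes = round((total - hours) * 60)
    return hours, minutes

print(flight_time(2163))  # (4, 35) under these assumed parameters
```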

Flight carbon footprint between Blackpool Airport (BLK) and Port Hope Simpson Airport (YHA)

On average, flying from Blackpool to Port Hope Simpson generates about 236 kg of CO2 per passenger, roughly 521 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
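Note that exactly 236 kg converts to about 520.3 lb at 2.20462 lb/kg, so the quoted 521 lb suggests the underlying per-passenger figure is rounded from a value slightly above 236 kg. A simple conversion sketch:

```python
KG_TO_LB = 2.20462262  # pounds per kilogram

def kg_to_lb(kg):
    """Convert a mass in kilograms to pounds."""
    return kg * KG_TO_LB

print(round(kg_to_lb(236), 1))  # 520.3
```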

Map of flight path from Blackpool to Port Hope Simpson

See the map of the shortest flight path between Blackpool Airport (BLK) and Port Hope Simpson Airport (YHA).

Airport information

Origin Blackpool Airport
City: Blackpool
Country: United Kingdom
IATA Code: BLK
ICAO Code: EGNH
Coordinates: 53°46′18″N, 3°1′42″W
Destination Port Hope Simpson Airport
City: Port Hope Simpson
Country: Canada
IATA Code: YHA
ICAO Code: CCP4
Coordinates: 52°31′41″N, 56°17′9″W
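The coordinates above are given in degrees, minutes, and seconds, while the distance formulas need decimal degrees. A small helper, using the usual sign convention (negative for south and west):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Blackpool Airport: 53°46′18″N, 3°1′42″W
print(dms_to_decimal(53, 46, 18, "N"))  # ≈ 53.7717
print(dms_to_decimal(3, 1, 42, "W"))    # ≈ -3.0283
```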