
How far is Pisa from Humberside?

The distance between Humberside (Humberside Airport) and Pisa (Pisa International Airport) is 840 miles / 1352 kilometers / 730 nautical miles.

The driving distance from Humberside (HUY) to Pisa (PSA) is 1118 miles / 1799 kilometers, and travel time by car is about 19 hours 24 minutes.

Humberside Airport – Pisa International Airport

840 miles / 1352 kilometers / 730 nautical miles


Distance from Humberside to Pisa

There are several ways to calculate the distance from Humberside to Pisa. Here are two standard methods:

Vincenty's formula (applied above)
  • 840.161 miles
  • 1352.107 kilometers
  • 730.080 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
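
For reference, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are the decimal-degree equivalents of the DMS values listed under Airport information below; the site's own implementation and rounding may differ slightly, so treat the output as approximate.

import math

def vincenty_distance_m(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0                      # equatorial radius (m)
    f = 1 / 298.257223563              # flattening
    b = (1 - f) * a                    # polar radius (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0                 # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16.0 * cos2_alpha * (4.0 + f * (4.0 - 3.0 * cos2_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1.0 + 2.0 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# HUY 53°34′27″N 0°21′2″W and PSA 43°41′2″N 10°23′33″E in decimal degrees
metres = vincenty_distance_m(53.5742, -0.3506, 43.6839, 10.3925)
print(f"{metres / 1000:.1f} km")       # about 1352 km, in line with the figure above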

Haversine formula
  • 839.275 miles
  • 1350.683 kilometers
  • 729.310 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
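
A companion Python sketch of the haversine formula, assuming a mean Earth radius of 6371 km; small differences from the quoted figure come down to the radius and coordinate precision used.

import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere with the given mean Earth radius (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Same airport coordinates as above, in decimal degrees
print(f"{haversine_km(53.5742, -0.3506, 43.6839, 10.3925):.1f} km")
# roughly 1350.6 km with these inputs; the page quotes 1350.683 km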

How long does it take to fly from Humberside to Pisa?

The estimated flight time from Humberside Airport to Pisa International Airport is 2 hours and 5 minutes.
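
The page does not say how this estimate is derived. A common back-of-envelope approach is to divide the great-circle distance by an assumed average cruising speed and add a fixed allowance for taxi, climb, and descent; the constants below are illustrative assumptions, so the result is close to but not exactly the figure above.

MILES = 840            # great-circle distance quoted above
CRUISE_MPH = 500       # assumed average cruising speed
OVERHEAD_MIN = 30      # assumed allowance for taxi, climb and descent

total_min = MILES / CRUISE_MPH * 60 + OVERHEAD_MIN
print(f"{int(total_min // 60)} h {int(total_min % 60)} min")
# about 2 h 10 min with these assumptions; the page quotes 2 hours 5 minutes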

Flight carbon footprint between Humberside Airport (HUY) and Pisa International Airport (PSA)

On average, flying from Humberside to Pisa generates about 139 kg of CO2 per passenger, which is roughly 305 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
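
Both values are rounded, which is why 139 kg and 305 lbs do not convert exactly; the conversion and the implied per-kilometre intensity can be checked from the figures quoted on this page.

CO2_KG = 139                     # per-passenger estimate quoted above
DISTANCE_KM = 1352               # great-circle distance quoted above
LBS_PER_KG = 2.20462

print(f"{CO2_KG * LBS_PER_KG:.0f} lbs")              # about 306 lbs before rounding
print(f"{CO2_KG / DISTANCE_KM:.3f} kg CO2 per km")   # about 0.103 kg per passenger-km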

Map of flight path and driving directions from Humberside to Pisa

See the map of the shortest flight path between Humberside Airport (HUY) and Pisa International Airport (PSA).

Airport information

Origin: Humberside Airport
City: Humberside
Country: United Kingdom
IATA Code: HUY
ICAO Code: EGNJ
Coordinates: 53°34′27″N, 0°21′2″W
Destination: Pisa International Airport
City: Pisa
Country: Italy
IATA Code: PSA
ICAO Code: LIRP
Coordinates: 43°41′2″N, 10°23′33″E