
How far is Pisa from Ushuaia?

The distance between Ushuaia (Ushuaia – Malvinas Argentinas International Airport) and Pisa (Pisa International Airport) is 8199 miles / 13195 kilometers / 7125 nautical miles.

Ushuaia – Malvinas Argentinas International Airport – Pisa International Airport

8199 miles / 13195 kilometers / 7125 nautical miles


Distance from Ushuaia to Pisa

There are several ways to calculate the distance from Ushuaia to Pisa. Here are two standard methods:

Vincenty's formula (applied above)
  • 8199.291 miles
  • 13195.480 kilometers
  • 7124.989 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
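As a rough illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The ellipsoid constants and iteration tolerance are standard textbook choices rather than values stated on this page, and the coordinates are the airport coordinates listed in the airport information section below; the result should come out very close to the ellipsoidal figures quoted above.

```python
import math

# WGS-84 ellipsoid constants (standard values, assumed here)
A_AXIS = 6378137.0                 # semi-major axis in metres
F = 1 / 298.257223563              # flattening
B_AXIS = A_AXIS * (1 - F)          # semi-minor axis in metres

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance in kilometres via Vincenty's inverse formula."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - F) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); falls back to 0 for points on the equator
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return B_AXIS * A * (sigma - delta_sigma) / 1000.0   # metres -> kilometres

# USH -> PSA, coordinates from the airport information section below
print(vincenty_distance(-54.843056, -68.295556, 43.683889, 10.392500))
```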

Haversine formula
  • 8213.978 miles
  • 13219.116 kilometers
  • 7137.752 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
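A minimal sketch of the haversine calculation, assuming a mean Earth radius of 6371 km (the page does not state which radius it uses, so the result may differ slightly from the figures above):

```python
import math

EARTH_RADIUS_KM = 6371.0   # mean Earth radius (assumed; the page's value may differ)

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# USH -> PSA, coordinates from the airport information section below
print(haversine_km(-54.843056, -68.295556, 43.683889, 10.392500))
```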

How long does it take to fly from Ushuaia to Pisa?

The estimated flight time from Ushuaia – Malvinas Argentinas International Airport to Pisa International Airport is 16 hours and 1 minute.
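The page does not spell out how this estimate is derived. A common approximation is the distance divided by a typical airliner cruise speed plus a fixed allowance for taxi, climb and descent; the sketch below uses assumed values (500 mph cruise, 30 minutes overhead), so it will not exactly reproduce the 16 hours 1 minute figure above.

```python
# Hypothetical flight-time estimate: the cruise speed and overhead are
# assumptions for illustration, not the parameters used by this page.
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
    hours = distance_miles / cruise_mph + overhead_hours
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} hours and {m} minutes"

print(estimate_flight_time(8199.291))   # rough estimate for USH -> PSA
```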

Flight carbon footprint between Ushuaia – Malvinas Argentinas International Airport (USH) and Pisa International Airport (PSA)

On average, flying from Ushuaia to Pisa generates about 1,028 kg of CO2 per passenger, which is equivalent to 2,266 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
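For reference, the unit conversion and the per-kilometre intensity implied by these numbers can be checked in a couple of lines; the 2.20462 lb/kg factor is the standard conversion, while the per-passenger total and the distance are the values quoted above.

```python
KG_TO_LB = 2.20462          # pounds per kilogram (standard conversion factor)

co2_kg = 1028.0             # per-passenger estimate quoted above
distance_km = 13195.0       # ellipsoidal distance quoted above

print(f"{co2_kg * KG_TO_LB:.0f} lb")                                 # ~2266 lb
print(f"{co2_kg / distance_km * 1000:.0f} g CO2 per passenger-km")   # ~78 g/km
```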

Map of flight path from Ushuaia to Pisa

See the map of the shortest flight path between Ushuaia – Malvinas Argentinas International Airport (USH) and Pisa International Airport (PSA).

Airport information

Origin: Ushuaia – Malvinas Argentinas International Airport
City: Ushuaia
Country: Argentina
IATA Code: USH
ICAO Code: SAWH
Coordinates: 54°50′35″S, 68°17′44″W
Destination: Pisa International Airport
City: Pisa
Country: Italy
IATA Code: PSA
ICAO Code: LIRP
Coordinates: 43°41′2″N, 10°23′33″E