How far is Perugia from Ushuaia?

The distance between Ushuaia (Ushuaia – Malvinas Argentinas International Airport) and Perugia (Perugia San Francesco d'Assisi – Umbria International Airport) is 8238 miles / 13258 kilometers / 7159 nautical miles.

Distance from Ushuaia to Perugia

There are several ways to calculate the distance from Ushuaia to Perugia. Here are two standard methods:

Vincenty's formula (applied above)
  • 8237.896 miles
  • 13257.609 kilometers
  • 7158.536 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
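
As a concrete illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The airport coordinates come from the listings below (converted to decimal degrees); the iteration limit and convergence tolerance are implementation choices, so the result should land close to, but not necessarily exactly on, the 8237.896 miles quoted above.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2):
        a = 6378137.0              # WGS-84 semi-major axis (meters)
        f = 1 / 298.257223563      # WGS-84 flattening
        b = (1 - f) * a
        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)
        lam = L
        for _ in range(200):       # iterate lambda until it converges
            sinLam, cosLam = math.sin(lam), math.cos(lam)
            sinSigma = math.hypot(cosU2 * sinLam,
                                  cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
            cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
            sigma = math.atan2(sinSigma, cosSigma)
            sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
            cos2Alpha = 1 - sinAlpha ** 2
            cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
            C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
            lamPrev = lam
            lam = L + (1 - C) * f * sinAlpha * (sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
            if abs(lam - lamPrev) < 1e-12:
                break
        u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
            * (-3 + 4 * cos2SigmaM ** 2)))
        return b * A * (sigma - dSigma) / 1609.344   # meters -> miles

    # USH (54°50′35″S, 68°17′44″W) and PEG (43°5′45″N, 12°30′47″E)
    print(round(vincenty_miles(-54.8431, -68.2956, 43.0958, 12.5131), 3))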

Haversine formula
  • 8252.141 miles
  • 13280.534 kilometers
  • 7170.915 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
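
For comparison, here is a minimal haversine sketch in the same style. The mean Earth radius of 6371 km is the usual convention for this formula; treating the Earth as a sphere rather than an ellipsoid is why the figures above differ from Vincenty's by a few miles.

    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        R = 6371.0   # mean Earth radius in km (a common convention)
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * R * math.asin(math.sqrt(a))

    km = haversine_km(-54.8431, -68.2956, 43.0958, 12.5131)
    print(f"{km:.0f} km = {km / 1.609344:.0f} mi = {km / 1.852:.0f} nmi")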

How long does it take to fly from Ushuaia to Perugia?

The estimated flight time from Ushuaia – Malvinas Argentinas International Airport to Perugia San Francesco d'Assisi – Umbria International Airport is 16 hours and 5 minutes.
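
The site does not publish its timing formula, but a common back-of-the-envelope approach assumes a cruise speed of about 500 mph plus roughly 30 minutes for taxi, climb, and descent. The sketch below uses those assumed parameters, so it only approximates the 16 hours and 5 minutes quoted above.

    def flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        # cruise_mph and overhead_min are assumptions, not the site's formula
        total_min = distance_miles / cruise_mph * 60 + overhead_min
        return divmod(round(total_min), 60)

    hours, minutes = flight_time(8238)
    print(f"about {hours} h {minutes} min")   # ~16 h 59 min with these inputs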

Flight carbon footprint between Ushuaia – Malvinas Argentinas International Airport (USH) and Perugia San Francesco d'Assisi – Umbria International Airport (PEG)

On average, flying from Ushuaia to Perugia generates about 1033 kg of CO2 per passenger, which is equivalent to roughly 2278 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
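
The unit conversion behind those numbers is straightforward; the short sketch below reproduces it and, purely for illustration, derives the emission factor per passenger-kilometer that the estimate implies.

    KG_PER_LB = 0.45359237   # exact definition of the avoirdupois pound

    co2_kg = 1033            # per passenger, from the estimate above
    distance_km = 13258

    print(f"{co2_kg / KG_PER_LB:.0f} lbs")   # ~2277 lbs (rounding differs slightly)
    print(f"{co2_kg / distance_km * 1000:.0f} g CO2 per passenger-km")   # ~78 g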

Map of flight path from Ushuaia to Perugia

See the map of the shortest flight path between Ushuaia – Malvinas Argentinas International Airport (USH) and Perugia San Francesco d'Assisi – Umbria International Airport (PEG).

Airport information

Origin Ushuaia – Malvinas Argentinas International Airport
City: Ushuaia
Country: Argentina
IATA Code: USH
ICAO Code: SAWH
Coordinates: 54°50′35″S, 68°17′44″W
Destination Perugia San Francesco d'Assisi – Umbria International Airport
City: Perugia
Country: Italy
IATA Code: PEG
ICAO Code: LIRZ
Coordinates: 43°5′45″N, 12°30′47″E
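
The distance formulas above take decimal degrees, while the coordinates are listed in degrees, minutes, and seconds. A small helper (hypothetical, not part of the page) shows the conversion:

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        # south latitudes and west longitudes are negative
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (deg + minutes / 60 + seconds / 3600)

    print(dms_to_decimal(54, 50, 35, "S"))   # USH latitude  ~ -54.8431
    print(dms_to_decimal(68, 17, 44, "W"))   # USH longitude ~ -68.2956
    print(dms_to_decimal(43, 5, 45, "N"))    # PEG latitude  ~  43.0958
    print(dms_to_decimal(12, 30, 47, "E"))   # PEG longitude ~  12.5131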