
How far is Pamplona from Ushuaia?

The distance between Ushuaia (Ushuaia – Malvinas Argentinas International Airport) and Pamplona (Pamplona Airport) is 7778 miles / 12517 kilometers / 6759 nautical miles.

Ushuaia – Malvinas Argentinas International Airport – Pamplona Airport

  • 7778 miles
  • 12517 kilometers
  • 6759 nautical miles


Distance from Ushuaia to Pamplona

There are several ways to calculate the distance from Ushuaia to Pamplona. Here are two standard methods:

Vincenty's formula (applied above)
  • 7777.733 miles
  • 12517.047 kilometers
  • 6758.665 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
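The page doesn't show its implementation, but Vincenty's inverse method is well documented. Below is a minimal Python sketch of it on the WGS-84 ellipsoid, using the USH and PNA coordinates listed under "Airport information"; it is an illustration, not the calculator's actual code.

    import math

    def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        # WGS-84 ellipsoid parameters
        a = 6378137.0              # semi-major axis (metres)
        f = 1 / 298.257223563      # flattening
        b = (1 - f) * a            # semi-minor axis (metres)

        L = math.radians(lon2 - lon1)
        # reduced latitudes on the auxiliary sphere
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L  # first approximation of the longitude difference
        for _ in range(max_iter):  # may not converge for near-antipodal points
            sinLam, cosLam = math.sin(lam), math.cos(lam)
            sinSigma = math.hypot(cosU2 * sinLam,
                                  cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
            if sinSigma == 0:
                return 0.0  # coincident points
            cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
            sigma = math.atan2(sinSigma, cosSigma)
            sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
            cos2Alpha = 1 - sinAlpha ** 2
            cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                          if cos2Alpha else 0.0)  # equatorial-line case
            C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sinAlpha * (
                sigma + C * sinSigma * (
                    cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
            * (-3 + 4 * cos2SigmaM ** 2)))
        return b * A * (sigma - dSigma)  # geodesic distance in metres

    # USH and PNA in decimal degrees (converted from the DMS values below)
    ush = (-54.843056, -68.295556)
    pna = (42.770000, -1.646111)
    metres = vincenty_inverse(*ush, *pna)
    print(f"{metres / 1609.344:.3f} miles")  # ≈ 7777.7 miles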

Haversine formula
  • 7794.295 miles
  • 12543.702 kilometers
  • 6773.057 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface).
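For comparison, here is a minimal haversine sketch, assuming a spherical Earth with the commonly used mean radius of 3,958.8 miles (6,371 km); the result shifts slightly depending on the radius chosen.

    import math

    def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
        # great-circle distance on a sphere of the given mean Earth radius
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_mi * math.asin(math.sqrt(h))

    print(haversine_miles(-54.843056, -68.295556, 42.770000, -1.646111))
    # ≈ 7794 miles, about 16 miles more than the ellipsoidal result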

How long does it take to fly from Ushuaia to Pamplona?

The estimated flight time from Ushuaia – Malvinas Argentinas International Airport to Pamplona Airport is 15 hours and 13 minutes.
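The page doesn't publish the model behind this estimate. A common rule of thumb divides the distance by an assumed average block speed and adds a fixed overhead for taxi, climb, and descent; both parameters below are assumptions, so the result lands close to, but not exactly on, the figure above.

    def flight_time(distance_miles, avg_speed_mph=500, overhead_min=30):
        # assumed averages, not the calculator's published parameters
        total_min = overhead_min + distance_miles / avg_speed_mph * 60
        hours, minutes = divmod(round(total_min), 60)
        return f"{hours} hours and {minutes} minutes"

    print(flight_time(7778))  # "16 hours and 3 minutes" with these assumptions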

Flight carbon footprint between Ushuaia – Malvinas Argentinas International Airport (USH) and Pamplona Airport (PNA)

On average, flying from Ushuaia to Pamplona generates about 966 kg of CO2 per passenger, which equals roughly 2,130 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
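The per-passenger figure works out to roughly 0.124 kg of CO2 per mile flown (966 kg over 7778 miles); the factor below is back-derived from the page's own numbers, not an official emission factor.

    KG_PER_MILE = 0.1242   # implied by 966 kg over 7778 miles; an assumption
    KG_TO_LBS = 2.20462    # kilograms-to-pounds conversion factor

    distance_miles = 7778
    co2_kg = distance_miles * KG_PER_MILE
    print(f"{co2_kg:.0f} kg CO2 ≈ {co2_kg * KG_TO_LBS:,.0f} lbs")
    # 966 kg CO2 ≈ 2,130 lbs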

Map of flight path from Ushuaia to Pamplona

See the map of the shortest flight path between Ushuaia – Malvinas Argentinas International Airport (USH) and Pamplona Airport (PNA).

Airport information

Origin: Ushuaia – Malvinas Argentinas International Airport
City: Ushuaia
Country: Argentina
IATA Code: USH
ICAO Code: SAWH
Coordinates: 54°50′35″S, 68°17′44″W
Destination: Pamplona Airport
City: Pamplona
Country: Spain
IATA Code: PNA
ICAO Code: LEPP
Coordinates: 42°46′12″N, 1°38′46″W
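The coordinates above are listed in degrees, minutes, and seconds, while the distance formulas earlier on this page take decimal degrees. A small helper for the conversion (southern and western hemispheres become negative):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        # "S" and "W" map to negative decimal degrees
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    print(dms_to_decimal(54, 50, 35, "S"))  # ≈ -54.843056 (USH latitude)
    print(dms_to_decimal(68, 17, 44, "W"))  # ≈ -68.295556 (USH longitude)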