
How far is Wilkes-Barre, PA, from Ushuaia?

The distance between Ushuaia (Ushuaia – Malvinas Argentinas International Airport) and Wilkes-Barre (Wilkes-Barre/Scranton International Airport) is 6637 miles / 10681 kilometers / 5767 nautical miles.

Ushuaia – Malvinas Argentinas International Airport – Wilkes-Barre/Scranton International Airport

  • 6637 miles
  • 10681 kilometers
  • 5767 nautical miles


Distance from Ushuaia to Wilkes-Barre

There are several ways to calculate the distance from Ushuaia to Wilkes-Barre. Here are two standard methods:

Vincenty's formula (applied above)
  • 6637.093 miles
  • 10681.366 kilometers
  • 5767.476 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
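As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, applied to the two airports' coordinates from the table below. This is a generic implementation of the published algorithm, not the calculator's actual code; the WGS-84 constants are standard.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Geodesic distance in meters on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude on the auxiliary sphere converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.sqrt((cosU2 * sinLam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)  # meters

# USH and AVP in decimal degrees (converted from the airport table)
d_km = vincenty_distance(-54.843056, -68.295556, 41.338333, -75.723333) / 1000
print(f"{d_km:.3f} km")  # ≈ 10681 km, matching the figure above
```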

Haversine formula
  • 6659.978 miles
  • 10718.195 kilometers
  • 5787.362 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the sphere's surface.
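For comparison, a haversine sketch in Python. It assumes Earth's mean radius of 6371 km (the radius the site used is not stated, so results may differ by a few kilometers):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers, assuming a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# USH to AVP, using the decimal-degree coordinates from the airport table
d_km = haversine_distance(-54.843056, -68.295556, 41.338333, -75.723333)
print(f"{d_km:.1f} km")  # ≈ 10718 km, slightly longer than the ellipsoidal result
```

The ~37 km gap between the two results illustrates how much the spherical approximation differs from the ellipsoidal model over a route this long.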

How long does it take to fly from Ushuaia to Wilkes-Barre?

The estimated flight time from Ushuaia – Malvinas Argentinas International Airport to Wilkes-Barre/Scranton International Airport is 13 hours and 3 minutes.
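The site's exact timing model is not published. A common rule of thumb, used here purely as an illustration, is a fixed taxi/climb overhead (assumed 30 minutes) plus cruise at an assumed average speed of about 500 mph:

```python
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
    """Rough block-time estimate: fixed overhead plus cruise at constant speed.

    Both cruise_mph and overhead_hours are assumptions, not the site's model.
    """
    hours = overhead_hours + distance_miles / cruise_mph
    whole = int(hours)
    minutes = round((hours - whole) * 60)
    return hours, f"{whole} h {minutes:02d} min"

hours, label = estimate_flight_time(6637)
print(label)  # a rough estimate in the same ballpark as the 13 h 3 min above
```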

Flight carbon footprint between Ushuaia – Malvinas Argentinas International Airport (USH) and Wilkes-Barre/Scranton International Airport (AVP)

On average, flying from Ushuaia to Wilkes-Barre generates about 804 kg of CO2 per passenger, which is roughly 1,773 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
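A quick check of the unit conversion and the implied emission intensity (the 804 kg figure itself is the site's estimate; the per-mile rate derived from it is only an average over this particular route):

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 804                      # site's per-passenger estimate for this route
co2_lb = co2_kg / KG_PER_LB       # ≈ 1772.5 lb, i.e. ~1,773 lb as stated
per_mile = co2_kg / 6637          # ≈ 0.12 kg CO2 per passenger-mile on this route
print(f"{co2_lb:.1f} lb, {per_mile:.3f} kg/mi")
```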

Map of flight path from Ushuaia to Wilkes-Barre

See the map of the shortest flight path between Ushuaia – Malvinas Argentinas International Airport (USH) and Wilkes-Barre/Scranton International Airport (AVP).

Airport information

Origin Ushuaia – Malvinas Argentinas International Airport
City: Ushuaia
Country: Argentina
IATA Code: USH
ICAO Code: SAWH
Coordinates: 54°50′35″S, 68°17′44″W
Destination Wilkes-Barre/Scranton International Airport
City: Wilkes-Barre, PA
Country: United States
IATA Code: AVP
ICAO Code: KAVP
Coordinates: 41°20′18″N, 75°43′24″W
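The coordinates above are in degrees/minutes/seconds; the distance formulas need signed decimal degrees. A small conversion sketch (the function name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter (N/S/E/W)
    to signed decimal degrees; south and west are negative."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (degrees + minutes / 60.0 + seconds / 3600.0)

# Airport coordinates from the table above
ush = (dms_to_decimal(54, 50, 35, "S"), dms_to_decimal(68, 17, 44, "W"))
avp = (dms_to_decimal(41, 20, 18, "N"), dms_to_decimal(75, 43, 24, "W"))
print(ush)  # ≈ (-54.8431, -68.2956)
print(avp)  # ≈ (41.3383, -75.7233)
```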