
How far is Williston, ND, from Buenos Aires?

The distance between Buenos Aires (Aeroparque Jorge Newbery) and Williston (Williston Basin International Airport) is 6350 miles / 10219 kilometers / 5518 nautical miles.

Aeroparque Jorge Newbery – Williston Basin International Airport

  • 6350 miles
  • 10219 kilometers
  • 5518 nautical miles


Distance from Buenos Aires to Williston

There are several ways to calculate the distance from Buenos Aires to Williston. Here are two standard methods:

Vincenty's formula (applied above)
  • 6350.042 miles
  • 10219.402 kilometers
  • 5518.036 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
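
Below is a minimal Python sketch of the textbook inverse Vincenty method on the WGS-84 ellipsoid; it is not necessarily the calculator's exact implementation. The function name, convergence tolerance, and iteration cap are illustrative choices, and the sketch omits the rare antipodal cases where the iteration fails to converge, which a production geodesy library would handle.

import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in miles.

    Inputs are decimal degrees. Minimal sketch: does not handle the
    rare non-convergent near-antipodal cases.
    """
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); zero on the equator, where cos2_alpha == 0
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344   # metres -> statute miles

Fed the AEP and XWA coordinates listed at the bottom of this page, this returns roughly 6350 miles, in line with the figure above.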

Haversine formula
  • 6368.176 miles
  • 10248.586 kilometers
  • 5533.794 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface).
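
A minimal Python sketch of the haversine formula follows. The earth radius used here (3958.8 miles, roughly the mean radius) is an assumption; the calculator's exact radius isn't stated, so results may differ slightly in the last digits.

import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles, assuming a spherical earth."""
    R = 3958.8  # mean earth radius in miles (assumed value)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

With the AEP and XWA coordinates below, this gives about 6368 miles, matching the haversine figure above.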

How long does it take to fly from Buenos Aires to Williston?

The estimated flight time from Aeroparque Jorge Newbery to Williston Basin International Airport is 12 hours and 31 minutes.
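
The page doesn't document how this estimate is derived. One plausible reconstruction, sketched below, is a fixed half-hour allowance for taxi, takeoff, and landing plus cruise at roughly 850 km/h; both constants are assumptions that happen to reproduce the stated 12 h 31 min for this route.

def flight_time_hm(distance_km, cruise_kmh=850.0, overhead_h=0.5):
    """Rough block-time estimate: fixed taxi/climb allowance plus cruise time.

    The constants are assumptions, not the calculator's documented method.
    """
    hours = overhead_h + distance_km / cruise_kmh
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

print(flight_time_hm(10219.402))  # -> (12, 31)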

Flight carbon footprint between Aeroparque Jorge Newbery (AEP) and Williston Basin International Airport (XWA)

On average, flying from Buenos Aires to Williston generates about 764 kg of CO2 per passenger, which is equivalent to 1,685 pounds (lbs). This figure is an estimate and includes only the CO2 generated by burning jet fuel.
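
Per-passenger CO2 estimates of this kind typically scale linearly with distance. The stated 764 kg over 10219 km implies an effective factor of about 0.0748 kg CO2 per passenger-kilometre, which is the assumed constant in the sketch below; the calculator's actual methodology isn't published on this page.

KG_CO2_PER_PAX_KM = 0.0748   # effective factor implied by 764 kg / 10219 km (assumption)
KG_TO_LBS = 2.20462

def co2_per_passenger(distance_km, factor=KG_CO2_PER_PAX_KM):
    """Fuel-burn CO2 per passenger, assuming a flat per-km emission factor."""
    kg = distance_km * factor
    return kg, kg * KG_TO_LBS

kg, lbs = co2_per_passenger(10219.402)
print(f"{kg:.0f} kg CO2 (~{lbs:.0f} lbs)")  # -> about 764 kg (~1685 lbs)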

Map of flight path from Buenos Aires to Williston

See the map of the shortest flight path between Aeroparque Jorge Newbery (AEP) and Williston Basin International Airport (XWA).

Airport information

Origin: Aeroparque Jorge Newbery
City: Buenos Aires
Country: Argentina
IATA Code: AEP
ICAO Code: SABE
Coordinates: 34°33′33″S, 58°24′56″W
Destination: Williston Basin International Airport
City: Williston, ND
Country: United States
IATA Code: XWA
ICAO Code: KXWA
Coordinates: 48°15′30″N, 103°44′55″W
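
The coordinates above are given in degrees, minutes, and seconds; to feed them into either distance function they must first be converted to signed decimal degrees. A small helper, assuming the vincenty_miles and haversine_miles sketches from earlier are in scope:

def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus N/S/E/W to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

aep_lat = dms_to_decimal(34, 33, 33, "S")    # -34.559167
aep_lon = dms_to_decimal(58, 24, 56, "W")    # -58.415556
xwa_lat = dms_to_decimal(48, 15, 30, "N")    #  48.258333
xwa_lon = dms_to_decimal(103, 44, 55, "W")   # -103.748611

print(vincenty_miles(aep_lat, aep_lon, xwa_lat, xwa_lon))   # ~6350 miles
print(haversine_miles(aep_lat, aep_lon, xwa_lat, xwa_lon))  # ~6368 miles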