
How far is Williston, ND, from Napoli?

The distance between Napoli (Naples International Airport) and Williston (Williston Basin International Airport) is 5225 miles / 8408 kilometers / 4540 nautical miles.

Naples International Airport – Williston Basin International Airport

  • 5225 miles
  • 8408 kilometers
  • 4540 nautical miles


Distance from Napoli to Williston

There are several ways to calculate the distance from Napoli to Williston. Here are two standard methods:

Vincenty's formula (applied above)
  • 5224.717 miles
  • 8408.367 kilometers
  • 4540.155 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
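For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are the NAP and XWA positions listed under "Airport information", converted to decimal degrees; small differences from the 5224.717 miles quoted above can come from rounding of the coordinates.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2,
                   a=6378137.0, f=1 / 298.257223563, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    b = (1 - f) * a                                          # semi-minor axis (metres)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = math.sin(U1), math.cos(U1), math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):                                # iterate on longitude difference
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                                       # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    metres = b * A * (sigma - delta_sigma)
    return metres / 1609.344                                 # metres -> statute miles

# NAP (40°53′9″N, 14°17′26″E) to XWA (48°15′30″N, 103°44′55″W)
print(vincenty_miles(40.8858, 14.2906, 48.2583, -103.7486))  # ~5224.7 miles
```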

Haversine formula
  • 5210.776 miles
  • 8385.931 kilometers
  • 4528.040 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
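For comparison, here is a minimal Python sketch of the haversine formula using a mean earth radius of 6371 km (about 3958.8 miles); with the airport coordinates listed below it reproduces the roughly 5211-mile figure.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance on a spherical earth (mean radius ~6371 km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_miles * math.asin(math.sqrt(a))

# NAP (40°53′9″N, 14°17′26″E) to XWA (48°15′30″N, 103°44′55″W)
print(haversine_miles(40.8858, 14.2906, 48.2583, -103.7486))  # ~5210.8 miles
```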

How long does it take to fly from Napoli to Williston?

The estimated flight time from Naples International Airport to Williston Basin International Airport is 10 hours and 23 minutes.
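A simple way to approximate a figure like this is distance divided by an assumed average block speed. The sketch below assumes roughly 500 mph, which is not necessarily the assumption behind the estimate above, so it only comes close to 10 hours and 23 minutes.

```python
def flight_time(distance_miles, avg_speed_mph=500):
    """Rough flight-time estimate: distance over an assumed average block speed."""
    hours = distance_miles / avg_speed_mph
    h, m = int(hours), round((hours - int(hours)) * 60)
    return f"{h} h {m:02d} min"

print(flight_time(5224.717))  # ~10 h 27 min with the 500 mph assumption
```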

Flight carbon footprint between Naples International Airport (NAP) and Williston Basin International Airport (XWA)

On average, flying from Napoli to Williston generates about 613 kg of CO2 per passenger, which is roughly 1 352 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
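As a quick sanity check on the unit conversion, the sketch below converts kilograms to pounds (1 kg ≈ 2.20462 lb) and derives the implied per-mile emission rate; the small difference from the rounded figures above comes from rounding the kilogram value.

```python
KG_TO_LB = 2.20462       # pounds per kilogram

co2_kg = 613             # estimated CO2 per passenger for NAP-XWA (from above)
distance_miles = 5224.717

print(co2_kg * KG_TO_LB)        # ~1351 lb (the page rounds the unrounded kg value to 1 352 lb)
print(co2_kg / distance_miles)  # ~0.117 kg of CO2 per passenger-mile
```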

Map of flight path from Napoli to Williston

See the map of the shortest flight path between Naples International Airport (NAP) and Williston Basin International Airport (XWA).

Airport information

Origin: Naples International Airport
City: Napoli
Country: Italy
IATA Code: NAP
ICAO Code: LIRN
Coordinates: 40°53′9″N, 14°17′26″E

Destination: Williston Basin International Airport
City: Williston, ND
Country: United States
IATA Code: XWA
ICAO Code: KXWA
Coordinates: 48°15′30″N, 103°44′55″W
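The coordinates above are given in degrees, minutes and seconds; the small sketch below converts them to the decimal degrees used by the distance formulas earlier on this page (southern latitudes and western longitudes are negative).

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Naples International Airport (NAP): 40°53′9″N, 14°17′26″E
nap = (dms_to_decimal(40, 53, 9, "N"), dms_to_decimal(14, 17, 26, "E"))
# Williston Basin International Airport (XWA): 48°15′30″N, 103°44′55″W
xwa = (dms_to_decimal(48, 15, 30, "N"), dms_to_decimal(103, 44, 55, "W"))

print(nap)  # (40.8858..., 14.2905...)
print(xwa)  # (48.2583..., -103.7486...)
```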