
How far is Williston, ND, from Belgrade?

The distance between Belgrade (Belgrade Nikola Tesla Airport) and Williston (Williston Basin International Airport) is 5186 miles / 8346 kilometers / 4506 nautical miles.

Belgrade Nikola Tesla Airport – Williston Basin International Airport



Distance from Belgrade to Williston

There are several ways to calculate the distance from Belgrade to Williston. Here are two standard methods:

Vincenty's formula (applied above)
  • 5185.771 miles
  • 8345.690 kilometers
  • 4506.312 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
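As a sketch, the inverse Vincenty iteration can be implemented in a few dozen lines of Python. The coordinates are taken from the airport information below; the WGS-84 ellipsoid parameters are an assumption, since the page does not state which ellipsoid it uses:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty distance in km (assumed WGS-84 ellipsoid)."""
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); zero on equatorial lines where cos2_alpha == 0
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0   # metres -> km

# BEG (44°49'6"N, 20°18'32"E) to XWA (48°15'30"N, 103°44'55"W)
km = vincenty_km(44.81833, 20.30889, 48.25833, -103.74861)
print(f"{km:.3f} km / {km * 0.621371:.3f} mi")
```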

Haversine formula
  • 5171.198 miles
  • 8322.237 kilometers
  • 4493.649 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
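The haversine calculation is much shorter. Here is a minimal Python sketch using the same airport coordinates; the mean Earth radius of 6371 km is an assumption, as the page does not say which radius it uses:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere (assumed mean Earth radius 6371 km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# BEG to XWA: slightly shorter than the ellipsoidal (Vincenty) result
km = haversine_km(44.81833, 20.30889, 48.25833, -103.74861)
print(f"{km:.3f} km / {km * 0.621371:.3f} mi")
```

The two methods differ by roughly 15 miles on this route, which is typical: a sphere is only an approximation of the Earth's ellipsoidal shape.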

How long does it take to fly from Belgrade to Williston?

The estimated flight time from Belgrade Nikola Tesla Airport to Williston Basin International Airport is 10 hours and 19 minutes.
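The page does not publish its flight-time formula, but a common rule of thumb adds a fixed overhead for taxi, climb, and descent to the cruise time. The sketch below uses an assumed 30-minute overhead and an assumed average speed of 528 mph, which happens to reproduce the figure above; both parameters are guesses, not the site's documented method:

```python
def flight_time_hours(distance_miles, cruise_mph=528.0, overhead_hours=0.5):
    """Rough flight-time estimate: fixed overhead plus cruise at constant speed.

    cruise_mph and overhead_hours are assumptions, not published values.
    """
    return overhead_hours + distance_miles / cruise_mph

t = flight_time_hours(5185.771)
hours, minutes = int(t), round((t - int(t)) * 60)
print(f"about {hours} h {minutes} min")  # about 10 h 19 min
```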

Flight carbon footprint between Belgrade Nikola Tesla Airport (BEG) and Williston Basin International Airport (XWA)

On average, flying from Belgrade to Williston generates about 608 kg of CO2 per passenger; 608 kilograms equals 1,341 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
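The kilogram-to-pound conversion can be checked in one line; the exact definition of the avoirdupois pound is 0.45359237 kg. Converting the rounded 608 kg gives about 1,340 lb, so the page's 1,341 lb presumably comes from an unrounded kilogram figure:

```python
KG_PER_LB = 0.45359237   # exact definition of the avoirdupois pound

co2_kg = 608
co2_lb = co2_kg / KG_PER_LB
print(f"{co2_lb:.1f} lb")   # ~1340 lb from the rounded 608 kg
```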

Map of flight path from Belgrade to Williston

See the map of the shortest flight path between Belgrade Nikola Tesla Airport (BEG) and Williston Basin International Airport (XWA).

Airport information

Origin: Belgrade Nikola Tesla Airport
City: Belgrade
Country: Serbia
IATA Code: BEG
ICAO Code: LYBE
Coordinates: 44°49′6″N, 20°18′32″E
Destination: Williston Basin International Airport
City: Williston, ND
Country: United States
IATA Code: XWA
ICAO Code: KXWA
Coordinates: 48°15′30″N, 103°44′55″W