How far is Belgrade from Salt Lake City, UT?
The distance between Salt Lake City (Salt Lake City International Airport) and Belgrade (Belgrade Nikola Tesla Airport) is 5840 miles / 9399 kilometers / 5075 nautical miles.
Salt Lake City International Airport – Belgrade Nikola Tesla Airport
Distance from Salt Lake City to Belgrade
There are several ways to calculate the distance from Salt Lake City to Belgrade. Here are two standard methods:
Vincenty's formula (applied above)
- 5840.357 miles
- 9399.144 kilometers
- 5075.132 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
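A quick way to reproduce the ellipsoidal figure in Python is a library that solves the geodesic inverse problem on the WGS-84 ellipsoid. The sketch below uses the geopy package (an assumption, not something this page provides); geopy's `geodesic()` uses Karney's algorithm rather than Vincenty's, but the two agree to well under a metre on a route like this. The decimal-degree coordinates are converted from the airport tables further down the page.

```python
# Minimal sketch: ellipsoidal (WGS-84) distance between SLC and BEG.
from geopy.distance import geodesic

# Decimal-degree coordinates derived from the airport tables below.
slc = (40.7883, -111.9778)   # Salt Lake City International Airport (SLC)
beg = (44.8183, 20.3089)     # Belgrade Nikola Tesla Airport (BEG)

d = geodesic(slc, beg)
print(f"{d.miles:.3f} mi, {d.km:.3f} km, {d.nautical:.3f} NM")
# Expect roughly 5840 mi / 9399 km / 5075 NM, matching the figures above.
```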
Haversine formula
- 5825.442 miles
- 9375.139 kilometers
- 5062.170 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
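The haversine formula is compact enough to write out directly. The sketch below is a minimal, self-contained Python version; the 6371 km Earth radius is a common mean-radius convention, so the result lands close to, but not exactly on, the figures quoted above, which may assume a slightly different radius.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# SLC and BEG coordinates in decimal degrees (from the airport tables below).
km = haversine_km(40.7883, -111.9778, 44.8183, 20.3089)
print(f"{km:.1f} km ≈ {km / 1.609344:.1f} mi ≈ {km / 1.852:.1f} NM")
```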
How long does it take to fly from Salt Lake City to Belgrade?
The estimated flight time from Salt Lake City International Airport to Belgrade Nikola Tesla Airport is 11 hours and 33 minutes.
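The quoted time corresponds to an average block speed of roughly 505 mph over the great-circle distance. The back-of-the-envelope sketch below illustrates the arithmetic; the speed is an assumption chosen to roughly reproduce the figure above, not a published value, and real schedules vary with winds, routing and aircraft type.

```python
# Back-of-the-envelope flight-time estimate: distance / assumed average speed.
distance_miles = 5840.357
avg_speed_mph = 505          # assumed average gate-to-gate speed (illustrative)

hours = distance_miles / avg_speed_mph
h, m = int(hours), round((hours % 1) * 60)
print(f"Estimated flight time: {h} h {m} min")   # ~11 h 34 min
```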
What is the time difference between Salt Lake City and Belgrade?
Belgrade is 8 hours ahead of Salt Lake City for most of the year; the gap shifts briefly when the United States and Europe change their clocks on different dates.
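The offset can be checked with Python's standard-library zoneinfo module (Python 3.9+). Salt Lake City follows the America/Denver time zone and Belgrade follows Europe/Belgrade.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Compare the current UTC offsets of the two cities.
now = datetime.now(ZoneInfo("UTC"))
slc_offset = now.astimezone(ZoneInfo("America/Denver")).utcoffset()
beg_offset = now.astimezone(ZoneInfo("Europe/Belgrade")).utcoffset()

diff_hours = (beg_offset - slc_offset).total_seconds() / 3600
print(f"Belgrade is {diff_hours:+.0f} hours ahead of Salt Lake City")
# Usually +8; it can briefly differ around the US and EU DST change dates.
```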
Flight carbon footprint between Salt Lake City International Airport (SLC) and Belgrade Nikola Tesla Airport (BEG)
On average, flying from Salt Lake City to Belgrade generates about 695 kg of CO2 per passenger; 695 kilograms equals 1,532 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
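The kilogram-to-pound figure is a straight unit conversion, as the short sketch below shows.

```python
# Converting the per-passenger CO2 estimate from kilograms to pounds.
co2_kg = 695
co2_lbs = co2_kg * 2.20462               # 1 kg ≈ 2.20462 lb
print(f"{co2_kg} kg ≈ {co2_lbs:,.0f} lbs")   # ≈ 1,532 lbs
```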
Map of flight path from Salt Lake City to Belgrade
See the map of the shortest flight path between Salt Lake City International Airport (SLC) and Belgrade Nikola Tesla Airport (BEG).
Airport information
| Origin | Salt Lake City International Airport |
|---|---|
| City: | Salt Lake City, UT |
| Country: | United States |
| IATA Code: | SLC |
| ICAO Code: | KSLC |
| Coordinates: | 40°47′18″N, 111°58′40″W |
| Destination | Belgrade Nikola Tesla Airport |
|---|---|
| City: | Belgrade |
| Country: | Serbia |
| IATA Code: | BEG |
| ICAO Code: | LYBE |
| Coordinates: | 44°49′6″N, 20°18′32″E |
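The tables list coordinates in degrees, minutes and seconds. The hedged sketch below converts them to the decimal degrees used in the distance examples earlier on this page.

```python
# Convert degrees-minutes-seconds coordinates to signed decimal degrees.
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """'S' and 'W' hemispheres yield negative values."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

slc = (dms_to_decimal(40, 47, 18, "N"), dms_to_decimal(111, 58, 40, "W"))
beg = (dms_to_decimal(44, 49, 6, "N"), dms_to_decimal(20, 18, 32, "E"))
print(slc)   # (40.7883..., -111.9777...)
print(beg)   # (44.8183..., 20.3088...)
```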