How far is Belgrade from Ukhta?
The distance between Ukhta (Ukhta Airport) and Belgrade (Belgrade Nikola Tesla Airport) is 1839 miles / 2959 kilometers / 1598 nautical miles.
The driving distance from Ukhta (UCT) to Belgrade (BEG) is 2390 miles / 3847 kilometers, and travel time by car is about 51 hours 18 minutes.
Ukhta Airport – Belgrade Nikola Tesla Airport
Distance from Ukhta to Belgrade
There are several ways to calculate the distance from Ukhta to Belgrade. Here are two standard methods:
Vincenty's formula (applied above)
- 1838.766 miles
- 2959.207 kilometers
- 1597.844 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
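As an illustration, Vincenty's inverse method can be sketched in Python on the WGS-84 ellipsoid (the ellipsoid parameters and the iteration scheme below follow the standard published algorithm; the site's exact implementation is not stated):

```python
import math

def vincenty_m(lat1, lon1, lat2, lon2):
    """Vincenty inverse: distance in meters on the WGS-84 ellipsoid."""
    a = 6378137.0                 # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563        # WGS-84 flattening
    b = (1 - f) * a              # semi-minor axis
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):          # iterate until lambda converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break
    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
          * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# Airport coordinates from the tables below, in decimal degrees
dist_km_v = vincenty_m(63.566667, 53.804444, 44.818333, 20.308889) / 1000
```

With these coordinates the result comes out close to the 2959 km quoted above.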
Haversine formula
- 1834.838 miles
- 2952.885 kilometers
- 1594.430 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
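The haversine calculation is compact enough to sketch directly; here is a minimal Python version using the airport coordinates from the tables below and the conventional mean Earth radius of 6371 km (an assumption, since the site does not state which radius it uses):

```python
import math

R_KM = 6371.0  # conventional mean Earth radius (assumption)

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two lat/lon points in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * R_KM * math.asin(math.sqrt(a))

# UCT: 63°34′0″N 53°48′16″E   BEG: 44°49′6″N 20°18′32″E (decimal degrees)
dist_km = haversine_km(63.566667, 53.804444, 44.818333, 20.308889)
```

The result lands within a couple of kilometers of the 2952.885 km figure above; the small residual comes from rounding and the choice of Earth radius.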
How long does it take to fly from Ukhta to Belgrade?
The estimated flight time from Ukhta Airport to Belgrade Nikola Tesla Airport is 3 hours and 58 minutes.
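The site's exact flight-time model is not stated, but a common back-of-envelope estimate divides the great-circle distance by an assumed average cruise speed and adds a fixed allowance for taxi, climb, and descent. Both constants below are illustrative assumptions, so the result only approximates the 3 h 58 min quoted above:

```python
CRUISE_MPH = 500    # assumed average ground speed for a jet
OVERHEAD_MIN = 30   # assumed taxi/climb/descent allowance

distance_mi = 1839  # great-circle distance from above
total_min = OVERHEAD_MIN + distance_mi / CRUISE_MPH * 60
hours, minutes = divmod(round(total_min), 60)
# about 4 h 11 min with these assumed constants
```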
What is the time difference between Ukhta and Belgrade?
The time difference between Ukhta and Belgrade is 2 hours: Belgrade is 2 hours behind Ukhta.
Flight carbon footprint between Ukhta Airport (UCT) and Belgrade Nikola Tesla Airport (BEG)
On average, flying from Ukhta to Belgrade generates about 203 kg (448 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
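The kilogram-to-pound conversion behind that figure is a simple multiplication by the standard factor:

```python
CO2_KG = 203                 # per-passenger estimate from above
KG_TO_LB = 2.20462           # standard conversion factor
co2_lb = CO2_KG * KG_TO_LB   # rounds to 448 lb
```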
Map of flight path and driving directions from Ukhta to Belgrade
See the map of the shortest flight path between Ukhta Airport (UCT) and Belgrade Nikola Tesla Airport (BEG).
Airport information
| Origin | Ukhta Airport |
| --- | --- |
| City: | Ukhta |
| Country: | Russia |
| IATA Code: | UCT |
| ICAO Code: | UUYH |
| Coordinates: | 63°34′0″N, 53°48′16″E |
| Destination | Belgrade Nikola Tesla Airport |
| --- | --- |
| City: | Belgrade |
| Country: | Serbia |
| IATA Code: | BEG |
| ICAO Code: | LYBE |
| Coordinates: | 44°49′6″N, 20°18′32″E |