How far is Belgrade from St John's?
The distance between St John's (V. C. Bird International Airport) and Belgrade (Belgrade Nikola Tesla Airport) is 5015 miles / 8071 kilometers / 4358 nautical miles.
V. C. Bird International Airport – Belgrade Nikola Tesla Airport
Distance from St John's to Belgrade
There are several ways to calculate the distance from St John's to Belgrade. Here are two standard methods:
Vincenty's formula (applied above)
- 5015.115 miles
- 8071.046 kilometers
- 4358.016 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
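For illustration, here is a minimal Python sketch that computes an ellipsoidal distance with the geopy library. geopy's geodesic function uses an ellipsoidal Earth model (Karney's method, closely related to Vincenty's formula), so it gives essentially the same result as the figure above; the decimal-degree coordinates are taken from the airport tables further down.

```python
from geopy.distance import geodesic  # ellipsoidal model, comparable to Vincenty's formula

# Decimal-degree equivalents of the airport coordinates listed below
anu = (17.1367, -61.7925)  # V. C. Bird International Airport (ANU)
beg = (44.8184, 20.3089)   # Belgrade Nikola Tesla Airport (BEG)

d = geodesic(anu, beg)
print(f"{d.miles:.1f} mi / {d.km:.1f} km / {d.nautical:.1f} nmi")  # ≈ 5015 mi / 8071 km / 4358 nmi
```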
Haversine formula
- 5008.735 miles
- 8060.777 kilometers
- 4352.472 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
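As a rough sketch, the haversine calculation can be written in a few lines of Python; the small difference from the figure above comes down to the Earth-radius value chosen for the spherical model.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_miles * asin(sqrt(a))

print(round(haversine_miles(17.1367, -61.7925, 44.8184, 20.3089), 1))  # ≈ 5009 miles
```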
How long does it take to fly from St John's to Belgrade?
The estimated flight time from V. C. Bird International Airport to Belgrade Nikola Tesla Airport is 9 hours and 59 minutes.
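The exact parameters behind this estimate are not published; a common rule of thumb is cruise time at an assumed average speed plus a fixed allowance for taxi, takeoff and landing. The sketch below uses illustrative values (500 mph, 30 minutes), so its answer differs slightly from the 9 hours 59 minutes quoted above.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_hours=0.5):
    """Rough flight-time estimate: cruise time plus a fixed taxi/takeoff/landing allowance.
    Both parameters are illustrative assumptions, not the site's published values."""
    total_hours = overhead_hours + distance_miles / cruise_mph
    hours, minutes = divmod(round(total_hours * 60), 60)
    return f"{hours} h {minutes:02d} min"

print(estimate_flight_time(5015.1))  # ≈ 10 h 32 min with these assumed parameters
```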
What is the time difference between St John's and Belgrade?
The time difference between St John's and Belgrade is 5 hours. Belgrade is 5 hours ahead of St John's.
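For reference, the offset can be checked with Python's zoneinfo module and the IANA time zones for the two cities. Antigua stays on UTC-4 all year, while Belgrade observes daylight saving time, so the gap is 5 hours in winter and 6 hours during European summer time.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

now = datetime.now(ZoneInfo("UTC"))
antigua = now.astimezone(ZoneInfo("America/Antigua"))    # UTC-4 year-round
belgrade = now.astimezone(ZoneInfo("Europe/Belgrade"))   # UTC+1, or UTC+2 in summer

offset = (belgrade.utcoffset() - antigua.utcoffset()).total_seconds() / 3600
print(f"Belgrade is {offset:+.0f} hours ahead of St John's")  # +5 (CET) or +6 (CEST)
```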
Flight carbon footprint between V. C. Bird International Airport (ANU) and Belgrade Nikola Tesla Airport (BEG)
On average, flying from St John's to Belgrade generates about 586 kg of CO2 per passenger; 586 kilograms equals 1,291 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
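A simple way to reproduce a figure like this is to multiply the distance by an average per-passenger emission factor. The factor below is back-calculated from the numbers above (586 kg over 5015 miles) and is only an illustration, not the site's published methodology.

```python
KG_CO2_PER_PASSENGER_MILE = 0.1168  # back-calculated from the figures above; illustrative only

def co2_per_passenger_kg(distance_miles, factor=KG_CO2_PER_PASSENGER_MILE):
    """Per-passenger CO2 from burning jet fuel, as distance times an average emission factor."""
    return distance_miles * factor

kg = co2_per_passenger_kg(5015.1)
print(f"{kg:.0f} kg CO2 (~{kg * 2.20462:.0f} lbs)")  # ≈ 586 kg (~1291 lbs)
```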
Map of flight path from St John's to Belgrade
See the map of the shortest flight path between V. C. Bird International Airport (ANU) and Belgrade Nikola Tesla Airport (BEG).
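For a rough idea of how such a path can be drawn, the sketch below generates intermediate waypoints along the great circle between the two airports using the standard spherical interpolation formula; plotting those points on a map traces the shortest flight path.

```python
from math import radians, degrees, sin, cos, asin, atan2, sqrt

def great_circle_waypoints(lat1, lon1, lat2, lon2, n=50):
    """Evenly spaced waypoints along the great-circle path between two lat/lon points."""
    la1, lo1, la2, lo2 = map(radians, (lat1, lon1, lat2, lon2))
    # Angular distance between the endpoints (haversine form)
    d = 2 * asin(sqrt(sin((la2 - la1) / 2) ** 2 +
                      cos(la1) * cos(la2) * sin((lo2 - lo1) / 2) ** 2))
    points = []
    for i in range(n + 1):
        f = i / n
        a, b = sin((1 - f) * d) / sin(d), sin(f * d) / sin(d)
        x = a * cos(la1) * cos(lo1) + b * cos(la2) * cos(lo2)
        y = a * cos(la1) * sin(lo1) + b * cos(la2) * sin(lo2)
        z = a * sin(la1) + b * sin(la2)
        points.append((degrees(atan2(z, sqrt(x * x + y * y))), degrees(atan2(y, x))))
    return points

waypoints = great_circle_waypoints(17.1367, -61.7925, 44.8184, 20.3089)
```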
Airport information
| Origin | V. C. Bird International Airport |
| --- | --- |
| City: | St John's |
| Country: | Antigua and Barbuda |
| IATA Code: | ANU |
| ICAO Code: | TAPA |
| Coordinates: | 17°8′12″N, 61°47′33″W |
| Destination | Belgrade Nikola Tesla Airport |
| --- | --- |
| City: | Belgrade |
| Country: | Serbia |
| IATA Code: | BEG |
| ICAO Code: | LYBE |
| Coordinates: | 44°49′6″N, 20°18′32″E |
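The code sketches above use decimal degrees; a small helper like the one below converts the degree/minute/second coordinates from these tables into that form.

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees, minutes, seconds plus a hemisphere letter into signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Coordinates from the tables above
anu = (dms_to_decimal(17, 8, 12, "N"), dms_to_decimal(61, 47, 33, "W"))  # ≈ (17.1367, -61.7925)
beg = (dms_to_decimal(44, 49, 6, "N"), dms_to_decimal(20, 18, 32, "E"))  # ≈ (44.8183, 20.3089)
```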