
How far is Belgrade from San Juan?

The distance between San Juan (San Juan Luis Muñoz Marín International Airport) and Belgrade (Belgrade Nikola Tesla Airport) is 5159 miles / 8302 kilometers / 4483 nautical miles.

San Juan Luis Muñoz Marín International Airport – Belgrade Nikola Tesla Airport

  • 5159 miles
  • 8302 kilometers
  • 4483 nautical miles


Distance from San Juan to Belgrade

There are several ways to calculate the distance from San Juan to Belgrade. Here are two standard methods:

Vincenty's formula (applied above)
  • 5158.786 miles
  • 8302.261 kilometers
  • 4482.862 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
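If you want to reproduce the ellipsoidal figure yourself, here is a minimal Python sketch using the geographiclib package. One hedge: geographiclib implements Karney's algorithm rather than Vincenty's, but both solve the inverse geodesic problem on the WGS-84 ellipsoid and agree to well under a millimetre at this range. The decimal coordinates are converted from the DMS values listed under "Airport information" below.

```python
from geographiclib.geodesic import Geodesic

sju = (18.43917, -66.00167)  # SJU, decimal degrees
beg = (44.81833, 20.30889)   # BEG, decimal degrees

# Inverse() solves the geodesic inverse problem on the WGS-84 ellipsoid;
# the "s12" entry of the result is the distance in meters.
meters = Geodesic.WGS84.Inverse(*sju, *beg)["s12"]
print(meters / 1609.344)  # ≈ 5158.8 statute miles
print(meters / 1000)      # ≈ 8302.3 kilometers
print(meters / 1852)      # ≈ 4482.9 nautical miles
```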

Haversine formula
  • 5151.534 miles
  • 8290.590 kilometers
  • 4476.560 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
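As a concrete illustration, here is a short Python implementation of the haversine formula. The decimal coordinates again come from the DMS values under "Airport information" below; the mean earth radius of 6,371 km is the conventional choice for a spherical model and is what makes the result line up with the figure above.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in statute miles, assuming a spherical earth."""
    r = 3958.761  # mean earth radius in miles (6,371.0 km)
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * r * asin(sqrt(a))

# SJU (18°26′21″N, 66°0′6″W) to BEG (44°49′6″N, 20°18′32″E)
print(haversine_miles(18.43917, -66.00167, 44.81833, 20.30889))  # ≈ 5151.5
```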

How long does it take to fly from San Juan to Belgrade?

The estimated flight time from San Juan Luis Muñoz Marín International Airport to Belgrade Nikola Tesla Airport is 10 hours and 16 minutes.
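The timing model behind this estimate is not published, but a common rule of thumb, a fixed 30-minute allowance for taxi, climb, and descent plus cruise at roughly 850 km/h, reproduces the figure. The sketch below treats both numbers as assumptions.

```python
# Assumed rule of thumb: 30 min overhead + great-circle distance at cruise speed.
distance_km = 8302
cruise_kmh = 850    # assumed typical jet cruise speed
overhead_min = 30   # assumed fixed allowance for takeoff and landing

total_min = overhead_min + distance_km / cruise_kmh * 60
print(f"{int(total_min // 60)} h {round(total_min % 60)} min")  # 10 h 16 min
```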

Flight carbon footprint between San Juan Luis Muñoz Marín International Airport (SJU) and Belgrade Nikola Tesla Airport (BEG)

On average, flying from San Juan to Belgrade generates about 605 kg of CO2 per passenger; 605 kilograms equals 1,333 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
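The emission model is not published either. As a rough reconstruction, an assumed long-haul factor of about 73 g of CO2 per passenger-kilometre (in line with fuel-burn-only estimates) yields approximately the same total; small rounding differences against the figures above are expected.

```python
# Assumed per-passenger emission factor, fuel burn only.
distance_km = 8302
grams_per_pax_km = 72.9  # assumption chosen to match the published total

co2_kg = distance_km * grams_per_pax_km / 1000
print(round(co2_kg))            # ≈ 605 kg
print(round(co2_kg * 2.20462))  # kg → lbs conversion, ≈ 1,334
```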

Map of flight path from San Juan to Belgrade

See the map of the shortest flight path between San Juan Luis Muñoz Marín International Airport (SJU) and Belgrade Nikola Tesla Airport (BEG).

Airport information

Origin: San Juan Luis Muñoz Marín International Airport
City: San Juan
Country: Puerto Rico
IATA Code: SJU
ICAO Code: TJSJ
Coordinates: 18°26′21″N, 66°0′6″W
Destination: Belgrade Nikola Tesla Airport
City: Belgrade
Country: Serbia
IATA Code: BEG
ICAO Code: LYBE
Coordinates: 44°49′6″N, 20°18′32″E
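For completeness, here is a small helper showing how the DMS coordinates above convert to the decimal degrees used in the distance examples earlier on this page (southern latitudes and western longitudes are negative).

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(18, 26, 21, "N"))  # SJU latitude  ≈  18.43917
print(dms_to_decimal(66, 0, 6, "W"))    # SJU longitude ≈ -66.00167
print(dms_to_decimal(44, 49, 6, "N"))   # BEG latitude  ≈  44.81833
print(dms_to_decimal(20, 18, 32, "E"))  # BEG longitude ≈  20.30889
```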