
How far is Philadelphia, PA, from Belgrade?

The distance between Belgrade (Belgrade Nikola Tesla Airport) and Philadelphia (Philadelphia International Airport) is 4605 miles / 7410 kilometers / 4001 nautical miles.

Belgrade Nikola Tesla Airport – Philadelphia International Airport

  • 4605 miles
  • 7410 kilometers
  • 4001 nautical miles


Distance from Belgrade to Philadelphia

There are several ways to calculate the distance from Belgrade to Philadelphia. Here are two standard methods:

Vincenty's formula (applied above)
  • 4604.598 miles
  • 7410.382 kilometers
  • 4001.286 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
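As a sketch of how that figure can be reproduced, the Python snippet below implements Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates are converted from the DMS values in the airport information section; the function name, tolerance, and iteration cap are illustrative, not this site's actual implementation.

```python
import math

# WGS-84 ellipsoid parameters
A_AXIS = 6378137.0              # semi-major axis (metres)
F = 1 / 298.257223563           # flattening
B_AXIS = (1 - F) * A_AXIS       # semi-minor axis (metres)

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance in metres between two points given in
    decimal degrees, via Vincenty's inverse formula."""
    u1 = math.atan((1 - F) * math.tan(math.radians(lat1)))
    u2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    big_l = math.radians(lon2 - lon1)
    sin_u1, cos_u1 = math.sin(u1), math.cos(u1)
    sin_u2, cos_u2 = math.sin(u2), math.cos(u2)

    lam = big_l
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cos_u2 * sin_lam,
                               cos_u1 * sin_u2 - sin_u1 * cos_u2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                     # coincident points
        cos_sigma = sin_u1 * sin_u2 + cos_u1 * cos_u2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cos_u1 * cos_u2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero when both points sit on the equator
        cos_2sm = (cos_sigma - 2 * sin_u1 * sin_u2 / cos_sq_alpha
                   if cos_sq_alpha else 0.0)
        c = F / 16 * cos_sq_alpha * (4 + F * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = big_l + (1 - c) * F * sin_alpha * (
            sigma + c * sin_sigma * (
                cos_2sm + c * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    a = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    b = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = b * sin_sigma * (cos_2sm + b / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - b / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                           * (-3 + 4 * cos_2sm ** 2)))
    return B_AXIS * a * (sigma - delta_sigma)  # metres

# BEG (44°49′6″N, 20°18′32″E) to PHL (39°52′18″N, 75°14′27″W)
metres = vincenty_inverse(44.8183, 20.3089, 39.8717, -75.2408)
print(f"{metres / 1609.344:.1f} miles")        # ≈ 4605 miles
```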

Haversine formula
  • 4592.704 miles
  • 7391.240 kilometers
  • 3990.950 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
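A minimal Python sketch of the haversine formula, assuming a mean Earth radius of 3,958.8 statute miles (the exact radius this site uses is not stated):

```python
import math

EARTH_RADIUS_MI = 3958.8   # mean Earth radius in statute miles (assumed)

def haversine(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two points given in
    decimal degrees, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_MI * math.asin(math.sqrt(a))

# BEG to PHL in decimal degrees (west longitude is negative)
print(f"{haversine(44.8183, 20.3089, 39.8717, -75.2408):.1f} miles")  # ≈ 4593 miles
```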

How long does it take to fly from Belgrade to Philadelphia?

The estimated flight time from Belgrade Nikola Tesla Airport to Philadelphia International Airport is 9 hours and 13 minutes.
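The site does not publish its timing model, but the figure is consistent with dividing the distance by an average speed of roughly 500 mph. The sketch below shows that arithmetic; the 500 mph cruise speed is an assumption, not a stated parameter.

```python
def flight_time(distance_miles, cruise_mph=500):
    """Rough flight-time estimate: distance over an assumed average
    speed (cruise_mph is an assumption, not the site's figure)."""
    hours = distance_miles / cruise_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} hours and {m} minutes"

print(flight_time(4605))   # -> 9 hours and 13 minutes
```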

Flight carbon footprint between Belgrade Nikola Tesla Airport (BEG) and Philadelphia International Airport (PHL)

On average, flying from Belgrade to Philadelphia generates about 533 kg of CO2 per passenger; 533 kilograms equals 1,175 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
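Working backwards from the stated figures (this only reproduces the arithmetic; the site's actual emission model is not published), 533 kg over 7,410 km implies roughly 72 g of CO2 per passenger-kilometre, and the pound conversion uses the standard 0.45359237 kg/lb factor:

```python
KG_PER_LB = 0.45359237   # exact kilogram-to-pound definition

co2_kg = 533             # site's per-passenger estimate
distance_km = 7410       # distance from above

print(f"{co2_kg / KG_PER_LB:,.0f} lb")                             # 1,175 lb
print(f"{co2_kg / distance_km * 1e3:.0f} g CO2 per passenger-km")  # ~72 g
```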

Map of flight path from Belgrade to Philadelphia

See the map of the shortest flight path between Belgrade Nikola Tesla Airport (BEG) and Philadelphia International Airport (PHL).

Airport information

Origin: Belgrade Nikola Tesla Airport
City: Belgrade
Country: Serbia
IATA Code: BEG
ICAO Code: LYBE
Coordinates: 44°49′6″N, 20°18′32″E
Destination: Philadelphia International Airport
City: Philadelphia, PA
Country: United States
IATA Code: PHL
ICAO Code: KPHL
Coordinates: 39°52′18″N, 75°14′27″W
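The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small conversion helper (hypothetical, for illustration):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to
    signed decimal degrees (S and W are negative)."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# BEG: 44°49′6″N, 20°18′32″E   PHL: 39°52′18″N, 75°14′27″W
print(dms_to_decimal(44, 49, 6, "N"))    # 44.8183...
print(dms_to_decimal(75, 14, 27, "W"))   # -75.2408...
```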