
How far is Belgrade from Shanghai?

The distance between Shanghai (Shanghai Pudong International Airport) and Belgrade (Belgrade Nikola Tesla Airport) is 5256 miles / 8459 kilometers / 4567 nautical miles.

Shanghai Pudong International Airport – Belgrade Nikola Tesla Airport

Distance: 5256 miles / 8459 kilometers / 4567 nautical miles


Distance from Shanghai to Belgrade

There are several ways to calculate the distance from Shanghai to Belgrade. Here are two standard methods:

Vincenty's formula (applied above)
  • 5256.008 miles
  • 8458.726 kilometers
  • 4567.346 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
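
For reference, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The ellipsoid constants, tolerance, and iteration cap are standard textbook choices, not parameters published by this page, and the example coordinates are the airport coordinates listed below converted to decimal degrees.

    import math

    # WGS-84 ellipsoid (standard constants; assumed, not published by this page)
    A = 6378137.0            # semi-major axis in metres
    F = 1 / 298.257223563    # flattening
    B = (1 - F) * A          # semi-minor axis in metres

    def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Vincenty's inverse solution: ellipsoidal distance in kilometres.
        May fail to converge for nearly antipodal points."""
        U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))  # reduced latitude
        U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L  # longitude difference on the auxiliary sphere, iterated below
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0.0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1.0 - sin_alpha ** 2
            cos_2sm = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha
                       if cos2_alpha else 0.0)  # equatorial-line special case
            C = F / 16.0 * cos2_alpha * (4.0 + F * (4.0 - 3.0 * cos2_alpha))
            lam_prev = lam
            lam = L + (1.0 - C) * F * sin_alpha * (sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break  # converged

        u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
        big_a = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        big_b = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4.0 * (
            cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)
            - big_b / 6.0 * cos_2sm * (-3.0 + 4.0 * sin_sigma ** 2)
            * (-3.0 + 4.0 * cos_2sm ** 2)))
        return B * big_a * (sigma - d_sigma) / 1000.0

    # PVG (31°8′36″N, 121°48′18″E) to BEG (44°49′6″N, 20°18′32″E)
    print(round(vincenty_km(31.1433, 121.8050, 44.8183, 20.3089), 1))  # ≈ 8458.7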

Haversine formula
  • 5244.472 miles
  • 8440.160 kilometers
  • 4557.322 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
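
A minimal haversine sketch in the same style; the 6371 km mean Earth radius is a common convention for the spherical model (assumed here, not stated on the page):

    import math

    EARTH_RADIUS_KM = 6371.0  # mean Earth radius; a common spherical-model convention

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance in kilometres between two latitude/longitude points."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2.0 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

    # PVG (31°8′36″N, 121°48′18″E) to BEG (44°49′6″N, 20°18′32″E)
    print(round(haversine_km(31.1433, 121.8050, 44.8183, 20.3089), 1))  # ≈ 8440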

How long does it take to fly from Shanghai to Belgrade?

The estimated flight time from Shanghai Pudong International Airport to Belgrade Nikola Tesla Airport is 10 hours and 27 minutes.
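
The page does not explain how this estimate is derived. Dividing the quoted 5256 miles by the quoted 10 hours 27 minutes implies an average block speed of roughly 503 mph, so a back-solved sketch looks like this (the speed is inferred from the page's own figures, not a published parameter):

    def estimated_flight_time(distance_miles, avg_speed_mph=503.0):
        """Rough block-time estimate. The 503 mph default is back-solved from
        this page's own figures (5256 miles in 10 h 27 min), not a published value."""
        hours, minutes = divmod(round(distance_miles / avg_speed_mph * 60.0), 60)
        return f"{hours} hours and {minutes} minutes"

    print(estimated_flight_time(5256))  # -> 10 hours and 27 minutes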

Flight carbon footprint between Shanghai Pudong International Airport (PVG) and Belgrade Nikola Tesla Airport (BEG)

On average, flying from Shanghai to Belgrade generates about 617 kg of CO2 per passenger; 617 kilograms equals 1,361 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
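
The page's figures imply a per-passenger factor of roughly 73 g of CO2 per kilometre on this route (617 kg over 8459 km). A sketch using that back-solved factor, together with the kilogram-to-pound conversion:

    KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

    def co2_kg(distance_km, kg_per_km=617.0 / 8459.0):
        """Per-passenger CO2 estimate using the factor implied by this page's
        figures (about 0.073 kg/km); actual emissions vary with aircraft type,
        load factor, and routing."""
        return distance_km * kg_per_km

    kg = co2_kg(8459)
    print(round(kg), round(kg / KG_PER_LB))  # -> 617 kg and ≈ 1360 lb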

Map of flight path from Shanghai to Belgrade

See the map of the shortest flight path between Shanghai Pudong International Airport (PVG) and Belgrade Nikola Tesla Airport (BEG).

Airport information

Origin: Shanghai Pudong International Airport
City: Shanghai
Country: China
IATA Code: PVG
ICAO Code: ZSPD
Coordinates: 31°8′36″N, 121°48′18″E

Destination: Belgrade Nikola Tesla Airport
City: Belgrade
Country: Serbia
IATA Code: BEG
ICAO Code: LYBE
Coordinates: 44°49′6″N, 20°18′32″E
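
The coordinates above are given in degrees, minutes, and seconds; the distance formulas earlier expect decimal degrees. A minimal conversion sketch:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
        to signed decimal degrees."""
        sign = -1.0 if hemisphere in ("S", "W") else 1.0
        return sign * (degrees + minutes / 60.0 + seconds / 3600.0)

    # PVG: 31°8′36″N, 121°48′18″E -> ≈ (31.1433, 121.8050)
    print(dms_to_decimal(31, 8, 36, "N"), dms_to_decimal(121, 48, 18, "E"))
    # BEG: 44°49′6″N, 20°18′32″E -> ≈ (44.8183, 20.3089)
    print(dms_to_decimal(44, 49, 6, "N"), dms_to_decimal(20, 18, 32, "E"))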