
How far is Belgrade from Norwich?

The distance between Norwich (Norwich Airport) and Belgrade (Belgrade Nikola Tesla Airport) is 1020 miles / 1642 kilometers / 887 nautical miles.

The driving distance from Norwich (NWI) to Belgrade (BEG) is 1365 miles / 2197 kilometers, and travel time by car is about 22 hours 52 minutes.

Norwich Airport – Belgrade Nikola Tesla Airport: 1020 miles / 1642 kilometers / 887 nautical miles


Distance from Norwich to Belgrade

There are several ways to calculate the distance from Norwich to Belgrade. Here are two standard methods:

Vincenty's formula (applied above)
  • 1020.274 miles
  • 1641.972 kilometers
  • 886.594 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
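For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed in the airport information below. The function name, convergence tolerance, and iteration limit are illustrative choices, not part of the calculator itself; run against NWI and BEG it should return roughly 1642 km, in line with the figure above.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0             # semi-major axis (m)
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sm = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    d_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2) -
        B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    return b * A * (sigma - d_sigma)

# NWI and BEG coordinates (decimal degrees, from the airport information below)
metres = vincenty_distance(52.6756, 1.2828, 44.8183, 20.3089)
print(f"{metres / 1609.344:.1f} mi / {metres / 1000:.1f} km / {metres / 1852:.1f} NM")
```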

Haversine formula
  • 1018.054 miles
  • 1638.400 kilometers
  • 884.665 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
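As a sketch, the haversine calculation can be reproduced in a few lines of Python. The mean Earth radius of 6371 km is an assumed value; choosing a slightly different radius shifts the result by a few kilometers.

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere with the given mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine(52.6756, 1.2828, 44.8183, 20.3089)   # NWI -> BEG
print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} NM")   # about 1638 km
```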

How long does it take to fly from Norwich to Belgrade?

The estimated flight time from Norwich Airport to Belgrade Nikola Tesla Airport is 2 hours and 25 minutes.
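The calculator's exact method is not stated. A common rule of thumb, used here purely as an assumption, is to fly the great-circle distance at an average cruise speed of about 500 mph and add roughly 30 minutes for taxi, climb and descent; that lands in the same ballpark as the figure above, though not exactly on it.

```python
distance_miles = 1020     # great-circle distance from above
cruise_mph = 500          # assumed average cruise speed
overhead_min = 30         # assumed allowance for taxi, climb and descent
total_min = distance_miles / cruise_mph * 60 + overhead_min
print(f"{int(total_min // 60)} h {int(total_min % 60)} min")   # about 2 h 32 min
```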

Flight carbon footprint between Norwich Airport (NWI) and Belgrade Nikola Tesla Airport (BEG)

On average, flying from Norwich to Belgrade generates about 152 kg of CO2 per passenger, which is equivalent to 335 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
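The pound figure follows directly from the kilogram figure; a quick check of the conversion, using the standard factor of about 2.20462 lbs per kg:

```python
co2_kg = 152
co2_lbs = co2_kg * 2.20462      # standard kg -> lb conversion factor
print(f"{co2_kg} kg is about {co2_lbs:.0f} lbs")   # 152 kg is about 335 lbs
```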

Map of flight path and driving directions from Norwich to Belgrade

See the map of the shortest flight path between Norwich Airport (NWI) and Belgrade Nikola Tesla Airport (BEG).

Airport information

Origin: Norwich Airport
City: Norwich
Country: United Kingdom
IATA Code: NWI
ICAO Code: EGSH
Coordinates: 52°40′32″N, 1°16′58″E

Destination: Belgrade Nikola Tesla Airport
City: Belgrade
Country: Serbia
IATA Code: BEG
ICAO Code: LYBE
Coordinates: 44°49′6″N, 20°18′32″E
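The coordinates above are given in degrees, minutes and seconds, while the distance formulas earlier expect decimal degrees. A small helper for the conversion, illustrative rather than part of the calculator:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Coordinates from the airport information above
nwi_lat = dms_to_decimal(52, 40, 32, "N")   # about 52.6756
nwi_lon = dms_to_decimal(1, 16, 58, "E")    # about 1.2828
beg_lat = dms_to_decimal(44, 49, 6, "N")    # about 44.8183
beg_lon = dms_to_decimal(20, 18, 32, "E")   # about 20.3089
```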