
How far is Presque Isle, ME, from Belgrade?

The distance between Belgrade (Belgrade Nikola Tesla Airport) and Presque Isle (Presque Isle International Airport) is 4033 miles / 6490 kilometers / 3505 nautical miles.

Belgrade Nikola Tesla Airport – Presque Isle International Airport: 4033 miles / 6490 kilometers / 3505 nautical miles


Distance from Belgrade to Presque Isle

There are several ways to calculate the distance from Belgrade to Presque Isle. Here are two standard methods:

Vincenty's formula (applied above)
  • 4033.005 miles
  • 6490.493 kilometers
  • 3504.586 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
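As a rough cross-check, the ellipsoidal figure can be reproduced with the pyproj library, which solves the geodesic on the WGS84 ellipsoid (Karney's algorithm rather than Vincenty's iteration, though the two agree to well under a mile over this route). The decimal coordinates below are converted from the airport information section; the library choice is an assumption, not the calculator's stated implementation.

  from pyproj import Geod  # pip install pyproj

  # WGS84 ellipsoid; Geod.inv solves the inverse geodesic problem
  geod = Geod(ellps="WGS84")

  # Coordinates from the airport information section, in decimal degrees
  beg_lat, beg_lon = 44.8183, 20.3089    # Belgrade Nikola Tesla Airport (BEG)
  pqi_lat, pqi_lon = 46.6889, -68.0447   # Presque Isle International Airport (PQI)

  # Geod.inv takes longitude/latitude order and returns azimuths plus distance in metres
  _, _, meters = geod.inv(beg_lon, beg_lat, pqi_lon, pqi_lat)

  print(round(meters / 1609.344))  # miles, roughly 4033
  print(round(meters / 1000))      # kilometers, roughly 6490
  print(round(meters / 1852))      # nautical miles, roughly 3505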

Haversine formula
  • 4021.664 miles
  • 6472.241 kilometers
  • 3494.731 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
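The spherical figure is easy to reproduce by hand. Below is a minimal Python sketch of the haversine formula, assuming a mean earth radius of 6371 km and the same decimal coordinates as above.

  import math

  def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
      # Great-circle distance between two latitude/longitude points on a spherical earth
      phi1, phi2 = math.radians(lat1), math.radians(lat2)
      dphi = math.radians(lat2 - lat1)
      dlmb = math.radians(lon2 - lon1)
      a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
      return 2 * radius_km * math.asin(math.sqrt(a))

  km = haversine_km(44.8183, 20.3089, 46.6889, -68.0447)  # BEG -> PQI
  print(round(km), round(km / 1.609344))  # roughly 6472 km and 4022 miles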

How long does it take to fly from Belgrade to Presque Isle?

The estimated flight time from Belgrade Nikola Tesla Airport to Presque Isle International Airport is 8 hours and 8 minutes.
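That estimate is consistent with a simple back-of-the-envelope check: dividing the distance by a typical jet cruising speed of roughly 500 mph (the speed is an assumption here, not the calculator's published formula) lands close to the quoted time.

  distance_miles = 4033       # ellipsoidal distance from the section above
  cruise_speed_mph = 500      # assumed average speed; not a published figure

  hours = distance_miles / cruise_speed_mph
  h, m = int(hours), round((hours - int(hours)) * 60)
  print(f"about {h} h {m:02d} min in the air")  # about 8 h 04 min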

Flight carbon footprint between Belgrade Nikola Tesla Airport (BEG) and Presque Isle International Airport (PQI)

On average, flying from Belgrade to Presque Isle generates about 460 kg of CO2 per passenger; 460 kilograms is equal to about 1,015 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
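The unit conversion behind that sentence, and the implied per-mile rate, can be checked in a couple of lines; this sketch only converts the quoted 460 kg figure and does not reproduce the underlying emissions model.

  co2_kg = 460                 # quoted per-passenger estimate
  kg_per_lb = 0.45359237       # exact kilogram-to-pound factor

  print(round(co2_kg / kg_per_lb))     # about 1014 lbs; the page's 1,015 presumably rounds an unrounded kg figure
  print(round(co2_kg / 4033 * 1000))   # about 114 g of CO2 per passenger-mile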

Map of flight path from Belgrade to Presque Isle

See the map of the shortest flight path between Belgrade Nikola Tesla Airport (BEG) and Presque Isle International Airport (PQI).

Airport information

Origin: Belgrade Nikola Tesla Airport
City: Belgrade
Country: Serbia
IATA Code: BEG
ICAO Code: LYBE
Coordinates: 44°49′6″N, 20°18′32″E
Destination: Presque Isle International Airport
City: Presque Isle, ME
Country: United States
IATA Code: PQI
ICAO Code: KPQI
Coordinates: 46°41′20″N, 68°2′41″W