
How far is Panama City Beach, FL, from Sarajevo?

The distance between Sarajevo (Sarajevo International Airport) and Panama City Beach (Northwest Florida Beaches International Airport) is 5442 miles / 8758 kilometers / 4729 nautical miles.


Distance from Sarajevo to Panama City Beach

There are several ways to calculate the distance from Sarajevo to Panama City Beach. Here are two standard methods:

Vincenty's formula (applied above)
  • 5441.825 miles
  • 8757.768 kilometers
  • 4728.816 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
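As a rough sketch (not this site's exact implementation), a comparable ellipsoidal distance can be computed with the geopy library, whose geodesic solver (Karney's algorithm on the WGS-84 ellipsoid) agrees very closely with Vincenty's formula for a route of this length. The coordinates are taken from the airport information section below.

```python
# Ellipsoidal (geodesic) distance sketch -- assumes geopy is installed.
from geopy.distance import geodesic

sjj = (43.8244, 18.3314)    # Sarajevo International Airport (43°49′28″N, 18°19′53″E)
ecp = (30.3417, -85.7972)   # Northwest Florida Beaches Intl Airport (30°20′30″N, 85°47′50″W)

d = geodesic(sjj, ecp)
print(f"{d.miles:.3f} miles / {d.km:.3f} km / {d.nautical:.3f} NM")
# Expected output is close to the Vincenty figures above (~5441.8 mi / ~8757.8 km).
```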

Haversine formula
  • 5430.134 miles
  • 8738.954 kilometers
  • 4718.658 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
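For comparison, here is a minimal, self-contained haversine sketch assuming a spherical earth with mean radius 6371 km; using the same coordinates as above, the result lands near the haversine figures listed.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(43.8244, 18.3314, 30.3417, -85.7972)  # SJJ -> ECP
print(f"{km:.1f} km / {km * 0.621371:.1f} mi / {km / 1.852:.1f} NM")
# Should come out close to the ~8739 km haversine figure above.
```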

How long does it take to fly from Sarajevo to Panama City Beach?

The estimated flight time from Sarajevo International Airport to Northwest Florida Beaches International Airport is 10 hours and 48 minutes.
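The exact formula behind this estimate is not published; a common back-of-the-envelope approach simply divides the flight distance by an assumed average block speed. The speed constant below is an assumption for illustration only, which is why the result does not exactly match the figure quoted above.

```python
def estimate_flight_time(distance_miles, avg_speed_mph=500.0):
    """Rough flight-time estimate from distance and an assumed average speed.
    The 500 mph constant is illustrative, not the site's actual parameter."""
    total_min = round(distance_miles / avg_speed_mph * 60)
    return divmod(total_min, 60)

hours, minutes = estimate_flight_time(5442)
print(f"{hours} h {minutes} min")  # -> 10 h 53 min; the page quotes 10 h 48 min,
                                   # so its internal constants evidently differ slightly.
```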

Flight carbon footprint between Sarajevo International Airport (SJJ) and Northwest Florida Beaches International Airport (ECP)

On average, flying from Sarajevo to Panama City Beach generates about 642 kg of CO2 per passenger, which is roughly 1,415 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
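As a hedged sketch, a per-passenger figure like this can be reproduced by multiplying the flight distance by an average emission intensity. The intensity below (~0.073 kg CO2 per passenger-km) is simply back-calculated from the 642 kg / 8758 km figures above, not an official emission factor.

```python
KG_PER_LB = 0.45359237

def co2_estimate_kg(distance_km, kg_co2_per_pax_km=0.0733):
    """Per-passenger CO2 from jet-fuel burn, using an assumed average
    emission intensity (back-calculated from the figures above)."""
    return distance_km * kg_co2_per_pax_km

kg = co2_estimate_kg(8758)
print(f"{kg:.0f} kg CO2 ~ {kg / KG_PER_LB:.0f} lbs")  # ~642 kg ~ ~1,415 lbs
```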

Map of flight path from Sarajevo to Panama City Beach

See the map of the shortest flight path between Sarajevo International Airport (SJJ) and Northwest Florida Beaches International Airport (ECP).

Airport information

Origin: Sarajevo International Airport
City: Sarajevo
Country: Bosnia and Herzegovina
IATA Code: SJJ
ICAO Code: LQSA
Coordinates: 43°49′28″N, 18°19′53″E
Destination: Northwest Florida Beaches International Airport
City: Panama City Beach, FL
Country: United States
IATA Code: ECP
ICAO Code: KECP
Coordinates: 30°20′30″N, 85°47′50″W