How far is Sarajevo from West Palm Beach, FL?
The distance between West Palm Beach (Palm Beach International Airport) and Sarajevo (Sarajevo International Airport) is 5365 miles / 8634 kilometers / 4662 nautical miles.
Palm Beach International Airport – Sarajevo International Airport
Distance from West Palm Beach to Sarajevo
There are several ways to calculate the distance from West Palm Beach to Sarajevo. Here are two standard methods:
Vincenty's formula (applied above)
- 5365.101 miles
- 8634.294 kilometers
- 4662.146 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
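As an illustration, the ellipsoidal calculation can be sketched in Python. This is a generic implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, fed with the airport coordinates listed below; it is not the site's own code, and the iteration is known to struggle for nearly antipodal points.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid (a sketch)."""
    a = 6378137.0            # WGS-84 semi-major axis (metres)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344  # metres to statute miles

# PBI (26°40′59″N, 80°5′44″W) to SJJ (43°49′28″N, 18°19′53″E)
print(round(vincenty_miles(26.6831, -80.0956, 43.8244, 18.3314), 1))
```

Run with the two airports' coordinates in decimal degrees, this reproduces the roughly 5,365-mile figure quoted above.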
Haversine formula
- 5354.577 miles
- 8617.356 kilometers
- 4653.000 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
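The great-circle calculation is much simpler. The sketch below uses a mean Earth radius of 3,958.8 statute miles (about 6,371 km, a standard choice, though the site's exact radius is not stated):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in statute miles, assuming a spherical Earth."""
    R = 3958.8  # mean Earth radius in statute miles (assumed)
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# PBI to SJJ, coordinates in decimal degrees
print(round(haversine_miles(26.6831, -80.0956, 43.8244, 18.3314), 1))
```

The spherical result comes out about 10 miles shorter than the ellipsoidal one, matching the two figures listed above.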
How long does it take to fly from West Palm Beach to Sarajevo?
The estimated flight time from Palm Beach International Airport to Sarajevo International Airport is 10 hours and 39 minutes.
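Estimates like this are typically derived by dividing the great-circle distance by an assumed average cruise speed and adding a fixed allowance for takeoff and landing. The sketch below uses 500 mph and a 30-minute buffer; these are common rule-of-thumb values, not the site's actual parameters, so it will not reproduce the 10 h 39 min figure exactly.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, buffer_min=30):
    """Rough flight-time estimate: cruise time plus a fixed buffer.

    cruise_mph and buffer_min are illustrative assumptions."""
    total_min = distance_miles / cruise_mph * 60 + buffer_min
    h, m = divmod(round(total_min), 60)
    return f"{h} h {m} min"

print(estimated_flight_time(5365.1))  # → "11 h 14 min"
```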
What is the time difference between West Palm Beach and Sarajevo?
The time difference between West Palm Beach and Sarajevo is 6 hours: Sarajevo (Central European Time) is 6 hours ahead of West Palm Beach (Eastern Time), as both locations observe daylight saving time.
Flight carbon footprint between Palm Beach International Airport (PBI) and Sarajevo International Airport (SJJ)
On average, flying from West Palm Beach to Sarajevo generates about 632 kg of CO2 per passenger, which equals about 1,393 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
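Per-passenger figures like this are generally computed from estimated fuel burn per seat: burning 1 kg of jet fuel releases about 3.16 kg of CO2 (the standard ICAO factor). The sketch below works backwards from the 632 kg figure; the exact fuel-burn model behind the page's number is not stated.

```python
KG_CO2_PER_KG_FUEL = 3.16   # ICAO emission factor for jet fuel combustion
LB_PER_KG = 2.20462          # kilograms to pounds

def co2_per_passenger_kg(fuel_kg_per_passenger):
    """CO2 emitted per passenger, given fuel burned per passenger."""
    return fuel_kg_per_passenger * KG_CO2_PER_KG_FUEL

# Working backwards from the page's 632 kg figure:
fuel_kg = 632 / KG_CO2_PER_KG_FUEL   # implied jet fuel burned per seat
print(round(fuel_kg))                # → 200 (kg of fuel)
print(round(632 * LB_PER_KG))        # → 1393 (lbs of CO2)
```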
Map of flight path from West Palm Beach to Sarajevo
See the map of the shortest flight path between Palm Beach International Airport (PBI) and Sarajevo International Airport (SJJ).
Airport information
| Origin | Palm Beach International Airport |
| --- | --- |
| City: | West Palm Beach, FL |
| Country: | United States |
| IATA Code: | PBI |
| ICAO Code: | KPBI |
| Coordinates: | 26°40′59″N, 80°5′44″W |
| Destination | Sarajevo International Airport |
| --- | --- |
| City: | Sarajevo |
| Country: | Bosnia and Herzegovina |
| IATA Code: | SJJ |
| ICAO Code: | LQSA |
| Coordinates: | 43°49′28″N, 18°19′53″E |
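The coordinates above are in degrees-minutes-seconds form, while distance formulas need decimal degrees. A small conversion helper (the function name and regex are illustrative, not from this page):

```python
import re

def dms_to_decimal(dms):
    """Convert a DMS string like 26°40′59″N into signed decimal degrees."""
    m = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    deg, minutes, seconds, hemi = (int(m.group(1)), int(m.group(2)),
                                   int(m.group(3)), m.group(4))
    dec = deg + minutes / 60 + seconds / 3600
    return -dec if hemi in "SW" else dec  # south/west are negative

print(round(dms_to_decimal("26°40′59″N"), 4))   # → 26.6831
print(round(dms_to_decimal("80°5′44″W"), 4))    # → -80.0956
```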