
How far is Bole from Panama City?

The distance between Panama City (Tocumen International Airport) and Bole (Alashankou Bole (Bortala) Airport) is 8546 miles / 13753 kilometers / 7426 nautical miles.

Tocumen International Airport – Alashankou Bole (Bortala) Airport

Distance: 8546 miles / 13753 kilometers / 7426 nautical miles
Flight time: 16 h 40 min
CO2 emission: 1 079 kg


Distance from Panama City to Bole

There are several ways to calculate the distance from Panama City to Bole. Here are two standard methods:

Vincenty's formula (applied above)
  • 8545.680 miles
  • 13752.939 kilometers
  • 7425.993 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
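As a sketch of how such an ellipsoidal calculation works, here is a minimal implementation of Vincenty's inverse solution on the WGS-84 ellipsoid, applied to the two airports' published coordinates (converted to decimal degrees). The function name and convergence tolerance are illustrative choices, not taken from this site:

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Vincenty inverse solution on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate until the auxiliary longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m
                + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                               * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# PTY (9°4′16″N, 79°23′0″W) to BPL (44°53′42″N, 82°18′0″E)
meters = vincenty_inverse(9.071111, -79.383333, 44.895, 82.3)
print(round(meters / 1000, 1), "km")  # should land near the 13753 km figure above
```

The iteration is needed because, on an ellipsoid, the geodesic's longitude difference depends on the path itself; it converges in a handful of steps for points that are not near-antipodal.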

Haversine formula
  • 8537.221 miles
  • 13739.325 kilometers
  • 7418.642 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
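The haversine calculation is short enough to sketch directly. Assuming a mean Earth radius of 6371 km and the airports' coordinates in decimal degrees:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere (mean Earth radius assumed)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# PTY (9°4′16″N, 79°23′0″W) to BPL (44°53′42″N, 82°18′0″E)
km = haversine_km(9.071111, -79.383333, 44.895, 82.3)
print(round(km, 1))  # close to the 13739 km haversine figure above
```

The small gap between this result and Vincenty's comes from treating the Earth as a perfect sphere rather than an ellipsoid.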

How long does it take to fly from Panama City to Bole?

The estimated flight time from Tocumen International Airport to Alashankou Bole (Bortala) Airport is 16 hours and 40 minutes.
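As a quick sanity check on the published figures (assuming the 8546-mile distance and the 16 h 40 min duration are exact), the implied average ground speed works out as follows:

```python
# Implied average speed from the published figures above.
distance_miles = 8546
duration_hours = 16 + 40 / 60          # 16 h 40 min
avg_speed_mph = distance_miles / duration_hours
print(round(avg_speed_mph))            # 513 mph average ground speed
```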

Flight carbon footprint between Tocumen International Airport (PTY) and Alashankou Bole (Bortala) Airport (BPL)

On average, flying from Panama City to Bole generates about 1 079 kg of CO2 per passenger; 1 079 kilograms equals 2 379 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
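The kilogram-to-pound conversion can be verified with the standard definition of the avoirdupois pound:

```python
KG_PER_LB = 0.45359237        # exact definition of the avoirdupois pound
co2_kg = 1079
co2_lbs = co2_kg / KG_PER_LB
print(round(co2_lbs))         # 2379 lbs, matching the text
```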

Map of flight path from Panama City to Bole

See the map of the shortest flight path between Tocumen International Airport (PTY) and Alashankou Bole (Bortala) Airport (BPL).

Airport information

Origin: Tocumen International Airport
City: Panama City
Country: Panama
IATA Code: PTY
ICAO Code: MPTO
Coordinates: 9°4′16″N, 79°23′0″W
Destination: Alashankou Bole (Bortala) Airport
City: Bole
Country: China
IATA Code: BPL
ICAO Code: ZWBL
Coordinates: 44°53′42″N, 82°18′0″E
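The coordinates above are given in degrees, minutes, and seconds; the distance formulas earlier on the page need them in decimal degrees. A small helper (the function name is an illustrative choice) shows the conversion:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# PTY: 9°4′16″N, 79°23′0″W
print(round(dms_to_decimal(9, 4, 16, "N"), 6),
      round(dms_to_decimal(79, 23, 0, "W"), 6))  # 9.071111 -79.383333
```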