How far is Gillam from Panama City?

The distance between Panama City (Tocumen International Airport) and Gillam (Gillam Airport) is 3363 miles / 5413 kilometers / 2923 nautical miles.

The driving distance from Panama City (PTY) to Gillam (YGX) is 4930 miles / 7934 kilometers, and travel time by car is about 102 hours 1 minute.

Distance from Panama City to Gillam

There are several ways to calculate the distance from Panama City to Gillam. Here are two standard methods:

Vincenty's formula (applied above)
  • 3363.497 miles
  • 5413.024 kilometers
  • 2922.799 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
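For readers who want to reproduce the ellipsoidal figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It is a standard textbook transcription, not the calculator's own code; the coordinates are the airport coordinates listed under "Airport information" below, converted to decimal degrees.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty: distance in meters on the WGS-84 ellipsoid."""
    a = 6378137.0           # semi-major axis (m)
    f = 1 / 298.257223563   # flattening
    b = (1 - f) * a         # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                      # iterate lambda to convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma
                                     * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# PTY (9°4′16″N, 79°23′0″W) to YGX (56°21′26″N, 94°42′38″W) in decimal degrees
meters = vincenty_distance(9.0711, -79.3833, 56.3572, -94.7106)
print(meters / 1000, meters / 1609.344)   # ≈ 5413 km, ≈ 3363 miles
```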

Haversine formula
  • 3370.761 miles
  • 5424.713 kilometers
  • 2929.111 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points along the sphere's surface).
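The haversine formula is short enough to compute directly. A minimal Python sketch, using the conventional mean Earth radius of 6371 km (small differences from the figures above come down to which radius an implementation uses):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

print(haversine_distance(9.0711, -79.3833, 56.3572, -94.7106))  # ≈ 5425 km
```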

How long does it take to fly from Panama City to Gillam?

The estimated flight time from Tocumen International Airport to Gillam Airport is 6 hours and 52 minutes.
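The calculator's timing model isn't published. As an illustration only, the quoted time is consistent with dividing the distance by a hypothetical average block speed of about 490 mph:

```python
# Hypothetical average block speed; chosen here only to show the arithmetic.
distance_miles = 3363
avg_speed_mph = 490
total_minutes = round(distance_miles / avg_speed_mph * 60)
h, m = divmod(total_minutes, 60)
print(f"{h} hours and {m} minutes")   # 6 hours and 52 minutes
```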

Flight carbon footprint between Tocumen International Airport (PTY) and Gillam Airport (YGX)

On average, flying from Panama City to Gillam generates about 378 kg of CO2 per passenger, which is equivalent to 833 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
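The kilograms-to-pounds conversion behind the quoted figures:

```python
co2_kg = 378                  # estimated CO2 per passenger on this route
co2_lbs = co2_kg * 2.20462    # 1 kg = 2.20462 lb
print(round(co2_lbs))         # 833
```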

Map of flight path and driving directions from Panama City to Gillam

See the map of the shortest flight path between Tocumen International Airport (PTY) and Gillam Airport (YGX).

Airport information

Origin: Tocumen International Airport
City: Panama City
Country: Panama
IATA Code: PTY
ICAO Code: MPTO
Coordinates: 9°4′16″N, 79°23′0″W
Destination: Gillam Airport
City: Gillam
Country: Canada
IATA Code: YGX
ICAO Code: CYGX
Coordinates: 56°21′26″N, 94°42′38″W
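The distance formulas above take decimal degrees, while the coordinates here are listed in degrees/minutes/seconds. A small hypothetical helper for the conversion (west longitudes and south latitudes take a negative sign):

```python
def dms_to_decimal(deg, minutes, seconds, west_or_south=False):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    value = deg + minutes / 60 + seconds / 3600
    return -value if west_or_south else value

pty = (dms_to_decimal(9, 4, 16),
       dms_to_decimal(79, 23, 0, west_or_south=True))
ygx = (dms_to_decimal(56, 21, 26),
       dms_to_decimal(94, 42, 38, west_or_south=True))
print(pty)  # ≈ (9.0711, -79.3833)
print(ygx)  # ≈ (56.3572, -94.7106)
```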