
How far is Garzón from Pittsburgh, PA?

The distance between Pittsburgh (Pittsburgh International Airport) and Garzón (Garzón Airport) is 2653 miles / 4269 kilometers / 2305 nautical miles.

Pittsburgh International Airport – Garzón Airport: 2653 miles / 4269 kilometers / 2305 nautical miles


Distance from Pittsburgh to Garzón

There are several ways to calculate the distance from Pittsburgh to Garzón. Here are two standard methods:

Vincenty's formula (applied above)
  • 2652.898 miles
  • 4269.425 kilometers
  • 2305.305 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
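
For readers who want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name and the decimal coordinates (converted from the airport information below) are illustrative; the calculator's exact implementation isn't published.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance in metres between two points on the WGS-84 ellipsoid."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):  # iterate until the auxiliary-sphere longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# PIT (40°29′29″N, 80°13′58″W) to GLJ (2°10′0″N, 75°40′0″W)
metres = vincenty_distance(40.4914, -80.2328, 2.1667, -75.6667)
print(metres / 1609.344, metres / 1000, metres / 1852)  # ≈ 2653 mi, 4269 km, 2305 nm
```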

Haversine formula
  • 2663.354 miles
  • 4286.254 kilometers
  • 2314.392 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
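
A haversine sketch is much shorter, since it only needs a spherical model. The mean Earth radius of 6371 km used here is a common convention, not a value the site states; a slightly different radius shifts the result a little.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_distance(40.4914, -80.2328, 2.1667, -75.6667)
print(km, km / 1.609344, km / 1.852)  # ≈ 4286 km, 2663 mi, 2314 nm
```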

How long does it take to fly from Pittsburgh to Garzón?

The estimated flight time from Pittsburgh International Airport to Garzón Airport is 5 hours and 31 minutes.
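
The calculator doesn't publish its assumptions, but the figure is consistent with dividing the distance by an average block speed of roughly 480 mph. A hypothetical sketch under that assumption:

```python
def estimated_flight_time(distance_miles, avg_speed_mph=480.0):
    """Rough estimate: distance over an assumed average block speed."""
    hours = distance_miles / avg_speed_mph
    return int(hours), round((hours % 1) * 60)

h, m = estimated_flight_time(2653)
print(f"{h} h {m} min")  # ≈ 5 h 32 min under these assumptions
```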

What is the time difference between Pittsburgh and Garzón?

Garzón observes Colombia Time (UTC-5) year-round, while Pittsburgh is on Eastern Time (UTC-5, or UTC-4 during daylight saving time). There is therefore no time difference during standard time; when Pittsburgh observes daylight saving time, it is one hour ahead of Garzón.
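
A quick way to check this is with Python's standard zoneinfo module (Python 3.9+), comparing the UTC offsets of the two cities' IANA time zones:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

for when in (datetime(2024, 1, 15, 12, 0), datetime(2024, 7, 15, 12, 0)):
    pit = when.replace(tzinfo=ZoneInfo("America/New_York"))  # Pittsburgh
    glj = when.replace(tzinfo=ZoneInfo("America/Bogota"))    # Garzón
    print(when.date(), pit.utcoffset() - glj.utcoffset())
# 2024-01-15 0:00:00   (standard time: no difference)
# 2024-07-15 1:00:00   (Pittsburgh DST: one hour ahead)
```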

Flight carbon footprint between Pittsburgh International Airport (PIT) and Garzón Airport (GLJ)

On average, flying from Pittsburgh to Garzón generates about 293 kg of CO2 per passenger, equivalent to 646 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
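
The methodology behind the figure isn't published; back-solving from the numbers above gives roughly 0.11 kg of CO2 per passenger-mile, which the hypothetical sketch below uses together with the exact kilogram-to-pound conversion:

```python
KG_PER_LB = 0.45359237  # exact definition of the pound in kilograms

def co2_estimate_kg(distance_miles, kg_per_passenger_mile=0.1104):
    # Linear model; the per-mile factor is back-solved from 293 kg / 2653 mi,
    # not a published emission factor.
    return distance_miles * kg_per_passenger_mile

kg = co2_estimate_kg(2653)
print(round(kg), "kg ≈", round(kg / KG_PER_LB), "lbs")  # 293 kg ≈ 646 lbs
```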

Map of flight path from Pittsburgh to Garzón

See the map of the shortest flight path between Pittsburgh International Airport (PIT) and Garzón Airport (GLJ).
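
The shortest flight path shown on such a map is the great circle between the two airports. If you want to draw it yourself, one approach (a sketch, not the site's code) is spherical linear interpolation between the two position vectors:

```python
import math

def great_circle_waypoints(lat1, lon1, lat2, lon2, n=50):
    """n+1 (lat, lon) points along the great circle, via slerp of unit vectors."""
    def to_vec(lat, lon):
        la, lo = math.radians(lat), math.radians(lon)
        return (math.cos(la) * math.cos(lo),
                math.cos(la) * math.sin(lo),
                math.sin(la))
    p, q = to_vec(lat1, lon1), to_vec(lat2, lon2)
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(p, q))))
    omega = math.acos(dot)  # angular separation between the two points
    points = []
    for i in range(n + 1):
        t = i / n
        s1 = math.sin((1 - t) * omega) / math.sin(omega)
        s2 = math.sin(t * omega) / math.sin(omega)
        x, y, z = (s1 * a + s2 * b for a, b in zip(p, q))
        points.append((math.degrees(math.asin(z)), math.degrees(math.atan2(y, x))))
    return points

path = great_circle_waypoints(40.4914, -80.2328, 2.1667, -75.6667)
```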

Airport information

Origin: Pittsburgh International Airport
City: Pittsburgh, PA
Country: United States
IATA Code: PIT
ICAO Code: KPIT
Coordinates: 40°29′29″N, 80°13′58″W

Destination: Garzón Airport
City: Garzón
Country: Colombia
IATA Code: GLJ
ICAO Code: SKGZ
Coordinates: 2°10′0″N, 75°40′0″W
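
The coordinates above are in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small hypothetical helper for the conversion:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus N/S/E/W to signed decimal degrees."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (degrees + minutes / 60 + seconds / 3600)

pit = (dms_to_decimal(40, 29, 29, "N"), dms_to_decimal(80, 13, 58, "W"))
glj = (dms_to_decimal(2, 10, 0, "N"), dms_to_decimal(75, 40, 0, "W"))
print(pit)  # (40.4913..., -80.2327...)
print(glj)  # (2.1666..., -75.6666...)
```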