
How far is Pittsburgh, PA, from Oaxaca?

The distance between Oaxaca (Oaxaca International Airport) and Pittsburgh (Pittsburgh International Airport) is 1895 miles / 3050 kilometers / 1647 nautical miles.

The driving distance from Oaxaca (OAX) to Pittsburgh (PIT) is 2519 miles / 4054 kilometers, and travel time by car is about 48 hours 15 minutes.


Distance from Oaxaca to Pittsburgh

There are several ways to calculate the distance from Oaxaca to Pittsburgh. Here are two standard methods:

Vincenty's formula (applied above)
  • 1895.058 miles
  • 3049.801 kilometers
  • 1646.761 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
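
As a sketch (not the site's own implementation), a comparable ellipsoidal figure can be reproduced with geopy, which uses the same WGS-84 ellipsoid but Karney's algorithm rather than Vincenty's iteration, so it should agree with the value above to within a few metres. The decimal coordinates are converted from the DMS values in the airport information section below.

```python
# Ellipsoidal-earth cross-check using geopy (assumed installed: pip install geopy)
from geopy.distance import geodesic

oax = (16.9997, -96.7264)   # Oaxaca International Airport, decimal degrees (approx.)
pit = (40.4914, -80.2328)   # Pittsburgh International Airport, decimal degrees (approx.)

d = geodesic(oax, pit)
print(f"{d.miles:.1f} mi / {d.km:.1f} km / {d.nautical:.1f} NM")   # ≈ 1895 mi / 3050 km / 1647 NM
```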

Haversine formula
  • 1898.515 miles
  • 3055.364 kilometers
  • 1649.765 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
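
A minimal, self-contained haversine sketch in Python, using the airport coordinates listed under Airport information below and a mean earth radius of 3,958.8 miles (the exact radius the site uses is an assumption on my part):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance between two points, assuming a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_miles * math.asin(math.sqrt(a))

# OAX -> PIT with the coordinates from the airport table below (decimal degrees)
print(round(haversine_miles(16.9997, -96.7264, 40.4914, -80.2328), 1))   # ≈ 1898–1899 miles
```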

How long does it take to fly from Oaxaca to Pittsburgh?

The estimated flight time from Oaxaca International Airport to Pittsburgh International Airport is 4 hours and 5 minutes.
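
The page does not state how the flight time is modelled; as a rough sanity check, a 4 hour 5 minute estimate over the 1,895-mile great-circle distance implies an average block speed of about 464 mph:

```python
distance_miles = 1895
block_time_h = 4 + 5 / 60                      # 4 hours 5 minutes
print(round(distance_miles / block_time_h))    # ≈ 464 mph implied average speed
```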

Flight carbon footprint between Oaxaca International Airport (OAX) and Pittsburgh International Airport (PIT)

On average, flying from Oaxaca to Pittsburgh generates about 208 kg of CO2 per passenger, which is roughly 458 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
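
The kilograms-to-pounds figure checks out with the standard conversion factor (1 kg ≈ 2.20462 lb):

```python
co2_kg = 208
print(round(co2_kg * 2.20462, 1))   # 458.6 lb, quoted above as 458 lbs
```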

Map of flight path and driving directions from Oaxaca to Pittsburgh

See the map of the shortest flight path between Oaxaca International Airport (OAX) and Pittsburgh International Airport (PIT).

Airport information

Origin: Oaxaca International Airport
City: Oaxaca
Country: Mexico
IATA Code: OAX
ICAO Code: MMOX
Coordinates: 16°59′59″N, 96°43′35″W

Destination: Pittsburgh International Airport
City: Pittsburgh, PA
Country: United States
IATA Code: PIT
ICAO Code: KPIT
Coordinates: 40°29′29″N, 80°13′58″W
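
The coordinates above are given in degrees, minutes and seconds, while the distance formulas earlier expect decimal degrees. A small conversion sketch (the helper name is my own):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = deg + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# OAX: 16°59′59″N, 96°43′35″W  ->  approx. 16.9997, -96.7264
print(dms_to_decimal(16, 59, 59, "N"), dms_to_decimal(96, 43, 35, "W"))
# PIT: 40°29′29″N, 80°13′58″W  ->  approx. 40.4914, -80.2328
print(dms_to_decimal(40, 29, 29, "N"), dms_to_decimal(80, 13, 58, "W"))
```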