
How far is Philadelphia, PA, from Oaxaca?

The distance between Oaxaca (Oaxaca International Airport) and Philadelphia (Wings Field) is 2048 miles / 3295 kilometers / 1779 nautical miles.

The driving distance from Oaxaca (OAX) to Philadelphia (BBX) is 2691 miles / 4331 kilometers, and travel time by car is about 52 hours 10 minutes.

Oaxaca International Airport – Wings Field


Distance from Oaxaca to Philadelphia

There are several ways to calculate the distance from Oaxaca to Philadelphia. Here are two standard methods:

Vincenty's formula (applied above)
  • 2047.659 miles
  • 3295.387 kilometers
  • 1779.367 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
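The figure above can be reproduced with a short implementation of Vincenty's inverse method on the WGS-84 ellipsoid. Below is a minimal sketch in Python (not the calculator's actual code), using the airport coordinates listed at the bottom of this page converted to decimal degrees (OAX at 16.9997° N, 96.7264° W; BBX at 40.1375° N, 75.2650° W):

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in kilometers."""
    a = 6378137.0                 # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L                        # first guess for longitude on auxiliary sphere
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero for equatorial lines (cos2_alpha == 0)
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                              * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0
```

For this airport pair the sketch returns roughly 3295 km, consistent with the figure quoted above.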

Haversine formula
  • 2050.149 miles
  • 3299.396 kilometers
  • 1781.531 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
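Because the haversine formula treats the earth as a sphere, it is much shorter to implement. A sketch in Python using the mean earth radius of 6371 km (the radius the calculator uses is not stated, so results may differ slightly in the last digits):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)    # difference in latitude
    dlam = math.radians(lon2 - lon1)    # difference in longitude
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))
```

With the OAX and BBX coordinates in decimal degrees (16.9997, -96.7264 and 40.1375, -75.2650), this comes out near 3299 km, a few kilometers more than the ellipsoidal Vincenty result above.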

How long does it take to fly from Oaxaca to Philadelphia?

The estimated flight time from Oaxaca International Airport to Wings Field is 4 hours and 22 minutes.
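The site does not publish the model behind this estimate. A simple sketch divides distance by an assumed average block speed; an assumed speed of about 469 mph happens to reproduce the quoted 4 hours 22 minutes for 2048 miles, but that figure is a hypothetical fit, not a published parameter:

```python
def flight_time(distance_miles, avg_speed_mph=469):
    """Rough flight-time estimate: distance over an assumed average speed.

    avg_speed_mph is a hypothetical illustration value, not a figure
    published by the source.
    """
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    if m == 60:                 # handle rounding up to the next full hour
        h, m = h + 1, 0
    return f"{h} h {m} min"
```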

Flight carbon footprint between Oaxaca International Airport (OAX) and Wings Field (BBX)

On average, flying from Oaxaca to Philadelphia generates about 223 kg of CO2 per passenger, or roughly 491 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Oaxaca to Philadelphia

See the map of the shortest flight path between Oaxaca International Airport (OAX) and Wings Field (BBX).

Airport information

Origin Oaxaca International Airport
City: Oaxaca
Country: Mexico
IATA Code: OAX
ICAO Code: MMOX
Coordinates: 16°59′59″N, 96°43′35″W
Destination Wings Field
City: Philadelphia, PA
Country: United States
IATA Code: BBX
ICAO Code: KLOM
Coordinates: 40°8′15″N, 75°15′54″W