
How far is Wekweètì from Puebla?

The distance between Puebla (Puebla International Airport) and Wekweètì (Wekweètì Airport) is 3194 miles / 5140 kilometers / 2775 nautical miles.

The driving distance from Puebla (PBC) to Wekweètì (YFJ) is 4061 miles / 6536 kilometers, and travel time by car is about 81 hours 47 minutes.

Puebla International Airport – Wekweètì Airport

3194 miles / 5140 kilometers / 2775 nautical miles


Distance from Puebla to Wekweètì

There are several ways to calculate the distance from Puebla to Wekweètì. Here are two standard methods:

Vincenty's formula (applied above)
  • 3193.584 miles
  • 5139.575 kilometers
  • 2775.148 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
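For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name, tolerance, and iteration cap are illustrative choices, not the calculator's actual implementation.

```python
import math

def vincenty_distance_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid (distance in km)."""
    a = 6378137.0             # WGS-84 semi-major axis, metres
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a           # semi-minor axis, metres

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos2_alpha is zero only for equatorial paths; guard the division
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# PBC and YFJ coordinates, converted from the DMS values in the airport table below
print(vincenty_distance_km(19.158056, -98.371389, 64.190556, -114.076944))
# should come out near 5139.6 km, in line with the figure quoted above
```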

Haversine formula
  • 3196.471 miles
  • 5144.222 kilometers
  • 2777.658 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
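A matching sketch of the haversine formula, using a mean Earth radius of 6371 km (a common convention; the radius this site uses is not stated):

```python
import math

def haversine_distance_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    h = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

print(haversine_distance_km(19.158056, -98.371389, 64.190556, -114.076944))
# should come out near 5144 km, close to the figure quoted above
```

The small gap between the two results (about 5 km over this route) reflects the spherical versus ellipsoidal Earth models.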

How long does it take to fly from Puebla to Wekweètì?

The estimated flight time from Puebla International Airport to Wekweètì Airport is 6 hours and 32 minutes.
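The calculator does not publish its block-time model. A common rough estimate divides the flight distance by an average speed and adds a fixed allowance for taxi, climb, and descent. The sketch below uses assumed parameters (500 mph cruise, 30-minute allowance); these give about 6 hours 53 minutes, so the site's 6 hours 32 minutes evidently rests on somewhat different values.

```python
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
    # Block time = cruise time + fixed taxi/climb/descent allowance.
    # Both parameters are assumptions, not the calculator's published values.
    hours = distance_miles / cruise_mph + overhead_hours
    h, m = int(hours), round((hours - int(hours)) * 60)
    return f"{h} hours {m} minutes"

print(estimate_flight_time(3194))  # "6 hours 53 minutes" under these assumptions
```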

Flight carbon footprint between Puebla International Airport (PBC) and Wekweètì Airport (YFJ)

On average, flying from Puebla to Wekweètì generates about 357 kg of CO2 per passenger, equivalent to roughly 788 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
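The kilogram-to-pound conversion is simple arithmetic; note that 357 kg rounds to 787 lb, so the quoted 788 lb likely comes from an unrounded kilogram figure. A one-line check:

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 357
print(f"{co2_kg} kg ≈ {co2_kg * KG_TO_LB:.0f} lb")  # prints "357 kg ≈ 787 lb"
```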

Map of flight path and driving directions from Puebla to Wekweètì

See the map of the shortest flight path between Puebla International Airport (PBC) and Wekweètì Airport (YFJ).

Airport information

Origin: Puebla International Airport
City: Puebla
Country: Mexico
IATA Code: PBC
ICAO Code: MMPB
Coordinates: 19°9′29″N, 98°22′17″W

Destination: Wekweètì Airport
City: Wekweètì
Country: Canada
IATA Code: YFJ
ICAO Code: CYWE
Coordinates: 64°11′26″N, 114°4′37″W
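The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small converter (a hypothetical helper, not part of the site):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # Southern and western hemispheres are negative in decimal notation.
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Airport coordinates from the table above
pbc = (dms_to_decimal(19, 9, 29, "N"), dms_to_decimal(98, 22, 17, "W"))
yfj = (dms_to_decimal(64, 11, 26, "N"), dms_to_decimal(114, 4, 37, "W"))
print(pbc)  # ≈ (19.158056, -98.371389)
print(yfj)  # ≈ (64.190556, -114.076944)
```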