
How far is Wekweètì from Tampico?

The distance between Tampico (Tampico International Airport) and Wekweètì (Wekweètì Airport) is 2986 miles / 4806 kilometers / 2595 nautical miles.

The driving distance from Tampico (TAM) to Wekweètì (YFJ) is 3772 miles / 6070 kilometers, and travel time by car is about 76 hours 18 minutes.

Tampico International Airport – Wekweètì Airport

  • 2986 miles
  • 4806 kilometers
  • 2595 nautical miles


Distance from Tampico to Wekweètì

There are several ways to calculate the distance from Tampico to Wekweètì. Here are two standard methods:

Vincenty's formula (applied above)
  • 2986.449 miles
  • 4806.223 kilometers
  • 2595.153 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
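As a rough illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed further down the page converted to decimal degrees. The constants, iteration limit, and convergence tolerance are standard textbook choices, not the calculator's published implementation.

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Geodesic distance on the WGS-84 ellipsoid (Vincenty's inverse formula)."""
    a = 6378137.0             # semi-major axis, metres
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis, metres

    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)

    lam = L
    for _ in range(200):                      # iterate longitude difference to convergence
        sinLam, cosLam = sin(lam), cos(lam)
        sinSigma = sqrt((cosU2 * sinLam) ** 2 +
                        (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        if sinSigma == 0:
            return 0.0                        # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM +
                                    C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma) / 1609.344   # metres to statute miles

# Coordinates from the airport information below (decimal degrees)
tam = (22.2964, -97.8658)    # Tampico International Airport (TAM)
yfj = (64.1906, -114.0769)   # Wekweètì Airport (YFJ)
print(round(vincenty_miles(tam[0], tam[1], yfj[0], yfj[1])))   # ≈ 2986 miles
```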

Haversine formula
  • 2988.352 miles
  • 4809.286 kilometers
  • 2596.807 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
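For comparison, a short haversine sketch in Python reproduces the figure above to within rounding. The mean Earth radius of 3,958.8 miles is an assumption of this example.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance between two points on a spherical Earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    h = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_miles * asin(sqrt(h))

# TAM and YFJ coordinates in decimal degrees
print(round(haversine_miles(22.2964, -97.8658, 64.1906, -114.0769)))  # ≈ 2988 miles
```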

How long does it take to fly from Tampico to Wekweètì?

The estimated flight time from Tampico International Airport to Wekweètì Airport is 6 hours and 9 minutes.
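The calculator does not publish its assumptions, but estimates of this kind are typically the great-circle distance divided by an average block speed plus a fixed allowance for taxi, climb, and descent. The 500 mph speed and 15-minute allowance below are illustrative assumptions only, which is why the result differs slightly from the figure above.

```python
distance_miles = 2986        # great-circle distance from above
avg_speed_mph = 500          # assumed average block speed
ground_allowance_h = 0.25    # assumed 15 min for taxi, climb and descent

hours = distance_miles / avg_speed_mph + ground_allowance_h
print(f"about {int(hours)} h {round(hours % 1 * 60)} min")   # about 6 h 13 min
```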

Flight carbon footprint between Tampico International Airport (TAM) and Wekweètì Airport (YFJ)

On average, flying from Tampico to Wekweètì generates about 333 kg (733 lb) of CO2 per passenger. This figure is an estimate and includes only the CO2 produced by burning jet fuel.

Map of flight path and driving directions from Tampico to Wekweètì

See the map of the shortest flight path between Tampico International Airport (TAM) and Wekweètì Airport (YFJ).

Airport information

Origin: Tampico International Airport
City: Tampico
Country: Mexico
IATA Code: TAM
ICAO Code: MMTM
Coordinates: 22°17′47″N, 97°51′57″W

Destination: Wekweètì Airport
City: Wekweètì
Country: Canada
IATA Code: YFJ
ICAO Code: CYWE
Coordinates: 64°11′26″N, 114°4′37″W