How far is Wekweètì from Iquitos?

The distance between Iquitos (Coronel FAP Francisco Secada Vignetta International Airport) and Wekweètì (Wekweètì Airport) is 5129 miles / 8254 kilometers / 4457 nautical miles.

Distance from Iquitos to Wekweètì

There are several ways to calculate the distance from Iquitos to Wekweètì. Here are two standard methods:

Vincenty's formula (applied above)
  • 5128.659 miles
  • 8253.776 kilometers
  • 4456.683 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
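
To reproduce the ellipsoidal figure, here is a minimal Python sketch using the geopy library. Note that geopy's geodesic routine implements Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's formula itself; both are ellipsoidal methods and agree closely on a route like this.

```python
from geopy.distance import geodesic

# Airport coordinates in decimal degrees, converted from the DMS values
# in the airport information section below.
iqt = (-3.78472, -73.30861)   # Iquitos (IQT): 3°47′5″S, 73°18′31″W
yfj = (64.19056, -114.07694)  # Wekweètì (YFJ): 64°11′26″N, 114°4′37″W

d = geodesic(iqt, yfj)  # Karney's method on the WGS-84 ellipsoid
print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} nmi")
# Expected to land very close to the figures above:
# ~5128.7 mi / ~8253.8 km / ~4456.7 nmi
```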

Haversine formula
  • 5137.787 miles
  • 8268.466 kilometers
  • 4464.615 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the earth's surface.
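
The haversine calculation is simple enough to write out directly. Below is a self-contained Python sketch assuming a mean earth radius of 6371 km; small differences from the figure above come down to the exact radius chosen.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km, assuming a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(-3.78472, -73.30861, 64.19056, -114.07694)
print(f"{km / 1.609344:.3f} mi / {km:.3f} km / {km / 1.852:.3f} nmi")
# ≈ 8268 km, matching the haversine figure above
```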

How long does it take to fly from Iquitos to Wekweètì?

The estimated flight time from Coronel FAP Francisco Secada Vignetta International Airport to Wekweètì Airport is 10 hours and 12 minutes.
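
The page does not state how the estimate is derived. A common rule of thumb is distance divided by an assumed average block speed; the sketch below assumes roughly 500 mph, which lands within a few minutes of the quoted figure.

```python
def flight_time(distance_miles, avg_speed_mph=500):
    """Rough flight-time estimate; the 500 mph average is an assumption."""
    hours = distance_miles / avg_speed_mph
    h, m = int(hours), round((hours - int(hours)) * 60)
    return f"{h} hours and {m} minutes"

print(flight_time(5129))  # "10 hours and 15 minutes" -- close to the 10 h 12 min above
```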

Flight carbon footprint between Coronel FAP Francisco Secada Vignetta International Airport (IQT) and Wekweètì Airport (YFJ)

On average, flying from Iquitos to Wekweètì generates about 601 kg of CO2 per passenger, which is equivalent to roughly 1,324 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
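
As a quick check on the unit conversion (1 kg ≈ 2.20462 lb), here is a short sketch; it also derives the per-kilometer rate this figure implies, purely as an illustration, since real emission estimators are not linear in distance.

```python
co2_kg = 601
print(f"{co2_kg * 2.20462:,.0f} lbs")  # ≈ 1,325 lbs; the quoted 1,324 presumably
                                       # reflects rounding of the underlying estimate

# Implied per-distance rate for this route (illustration only):
print(f"{co2_kg / 8254:.4f} kg CO2 per km per passenger")  # ≈ 0.0728
```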

Map of flight path from Iquitos to Wekweètì

See the map of the shortest flight path between Coronel FAP Francisco Secada Vignetta International Airport (IQT) and Wekweètì Airport (YFJ).

Airport information

Origin Coronel FAP Francisco Secada Vignetta International Airport
City: Iquitos
Country: Perú
IATA Code: IQT
ICAO Code: SPQT
Coordinates: 3°47′5″S, 73°18′31″W
Destination Wekweètì Airport
City: Wekweètì
Country: Canada
IATA Code: YFJ
ICAO Code: CYWE
Coordinates: 64°11′26″N, 114°4′37″W
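
The distance formulas above work in decimal degrees, so the DMS coordinates listed here need converting first. A small hypothetical helper:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to decimal degrees.
    South and West hemispheres are negative."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(3, 47, 5, "S"))    # IQT latitude  ≈ -3.78472
print(dms_to_decimal(114, 4, 37, "W"))  # YFJ longitude ≈ -114.07694
```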