
How far is Red Lake from Iquitos?

The distance between Iquitos (Coronel FAP Francisco Secada Vignetta International Airport) and Red Lake (Red Lake Airport) is 3967 miles / 6384 kilometers / 3447 nautical miles.

Coronel FAP Francisco Secada Vignetta International Airport – Red Lake Airport
  • 3967 miles
  • 6384 kilometers
  • 3447 nautical miles
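The three figures are the same distance in different units. As a sanity check, using the exact definitions 1 mile = 1.609344 km and 1 nautical mile = 1.852 km, and the unrounded mileage quoted further down the page:

```python
miles = 3966.843                 # unrounded Vincenty distance from the section below
km = miles * 1.609344            # 1 mile = 1.609344 km (exact)
nm = km / 1.852                  # 1 nautical mile = 1.852 km (exact)
print(f"{km:.3f} km, {nm:.3f} nm")  # ≈ 6384.015 km, 3447.092 nm
```

This matches the rounded 3967 mi / 6384 km / 3447 nmi figures above.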


Distance from Iquitos to Red Lake

There are several ways to calculate the distance from Iquitos to Red Lake. Here are two standard methods:

Vincenty's formula (applied above)
  • 3966.843 miles
  • 6384.016 kilometers
  • 3447.093 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
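Vincenty's inverse method can be sketched in Python. The following is a minimal implementation of the standard iteration on the WGS-84 ellipsoid (the exact ellipsoid parameters the site uses are an assumption); it reproduces the figure above to within a few metres:

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2,
                     a=6378137.0, f=1 / 298.257223563,
                     tol=1e-12, max_iter=200):
    """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty inverse)."""
    b = (1 - f) * a
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# IQT (3°47′5″S, 73°18′31″W) to YRL (51°4′0″N, 93°47′35″W)
km = vincenty_inverse(-(3 + 47/60 + 5/3600), -(73 + 18/60 + 31/3600),
                      51 + 4/60, -(93 + 47/60 + 35/3600)) / 1000
print(f"{km:.3f} km")  # ≈ 6384 km
```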

Haversine formula
  • 3978.750 miles
  • 6403.178 kilometers
  • 3457.440 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
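A minimal haversine sketch in Python, assuming a mean Earth radius of 6371 km (an assumption, but one that reproduces the figure above):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, R=6371.0):
    """Great-circle distance in km, assuming a spherical earth of radius R."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# IQT (3°47′5″S, 73°18′31″W) to YRL (51°4′0″N, 93°47′35″W)
km = haversine_km(-(3 + 47/60 + 5/3600), -(73 + 18/60 + 31/3600),
                  51 + 4/60, -(93 + 47/60 + 35/3600))
print(f"{km:.3f} km")  # ≈ 6403 km
```

The spherical result is about 19 km (0.3%) longer than the ellipsoidal Vincenty figure, which is typical for routes with a large north–south component.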

How long does it take to fly from Iquitos to Red Lake?

The estimated flight time from Coronel FAP Francisco Secada Vignetta International Airport to Red Lake Airport is 8 hours and 0 minutes.
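The site does not state how the estimate is derived; a back-of-the-envelope check, assuming an average ground speed of about 500 mph (an assumption, not the site's method), lands close to the quoted 8 hours:

```python
miles = 3967                 # great-circle distance from above
avg_speed_mph = 500          # assumed typical jet average ground speed
hours = miles / avg_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} h {m} min")      # ≈ 7 h 56 min
```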

Flight carbon footprint between Coronel FAP Francisco Secada Vignetta International Airport (IQT) and Red Lake Airport (YRL)

On average, flying from Iquitos to Red Lake generates about 452 kg of CO2 per passenger; 452 kilograms is equal to about 997 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
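The kilograms-to-pounds conversion can be checked directly (the quoted 997 lb presumably comes from the unrounded per-passenger figure; 452 kg exactly gives ≈ 996.5 lb):

```python
kg = 452
lbs = kg * 2.20462   # 1 kg ≈ 2.20462 lb
print(f"{lbs:.1f} lb")  # ≈ 996.5 lb, rounding to the quoted ~997 lb
```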

Map of flight path from Iquitos to Red Lake

See the map of the shortest flight path between Coronel FAP Francisco Secada Vignetta International Airport (IQT) and Red Lake Airport (YRL).

Airport information

Origin: Coronel FAP Francisco Secada Vignetta International Airport
City: Iquitos
Country: Perú
IATA Code: IQT
ICAO Code: SPQT
Coordinates: 3°47′5″S, 73°18′31″W
Destination: Red Lake Airport
City: Red Lake
Country: Canada
IATA Code: YRL
ICAO Code: CYRL
Coordinates: 51°4′0″N, 93°47′35″W