
How far is Nanaimo from Iquitos?

The distance between Iquitos (Coronel FAP Francisco Secada Vignetta International Airport) and Nanaimo (Nanaimo Airport) is 4730 miles / 7612 kilometers / 4110 nautical miles.

Coronel FAP Francisco Secada Vignetta International Airport – Nanaimo Airport

  • 4730 miles
  • 7612 kilometers
  • 4110 nautical miles


Distance from Iquitos to Nanaimo

There are several ways to calculate the distance from Iquitos to Nanaimo. Here are two standard methods:

Vincenty's formula (applied above)
  • 4729.971 miles
  • 7612.150 kilometers
  • 4110.232 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
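As a sketch of how such a result can be reproduced, here is a standard implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name and the decimal coordinates (converted from the DMS values in the airport table) are illustrative; the ellipsoid constants are the published WGS-84 values.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty's inverse formula)."""
    a = 6378137.0          # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563  # WGS-84 flattening
    b = (1 - f) * a        # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude on the auxiliary sphere converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)  # distance in metres

# Iquitos (IQT) to Nanaimo (YCD)
print(vincenty_inverse(-3.784722, -73.308611, 49.052222, -123.87) / 1000, "km")
```

Run on these coordinates, the result lands within a few kilometres of the 7612 km quoted above; small differences come from coordinate rounding.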

Haversine formula
  • 4736.896 miles
  • 7623.296 kilometers
  • 4116.250 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
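The haversine calculation is compact enough to show in full. This sketch assumes a mean Earth radius of 6371 km; the decimal coordinates are converted from the DMS values in the airport table below.

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)   # difference in latitude
    dlmb = math.radians(lon2 - lon1)   # difference in longitude
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Iquitos (IQT) to Nanaimo (YCD)
print(haversine(-3.784722, -73.308611, 49.052222, -123.87), "km")
```

The spherical result differs from the ellipsoidal Vincenty figure by roughly 11 km here, which is typical of the two models' disagreement at this range.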

How long does it take to fly from Iquitos to Nanaimo?

The estimated flight time from Coronel FAP Francisco Secada Vignetta International Airport to Nanaimo Airport is 9 hours and 27 minutes.
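The site does not state its model, but the figure is consistent with dividing the distance by an assumed 500 mph average speed and truncating to whole minutes. A sketch under that assumption:

```python
def flight_time(distance_miles, avg_speed_mph=500):
    """Rough flight-time estimate at an assumed average speed (hypothetical model)."""
    total_minutes = int(distance_miles / avg_speed_mph * 60)  # truncate to whole minutes
    return divmod(total_minutes, 60)

hours, minutes = flight_time(4729.971)
print(f"{hours} h {minutes} min")  # 9 h 27 min
```

Real block times also depend on winds, routing, and taxi time, so this is only a back-of-the-envelope check.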

Flight carbon footprint between Coronel FAP Francisco Secada Vignetta International Airport (IQT) and Nanaimo Airport (YCD)

On average, flying from Iquitos to Nanaimo generates about 549 kg of CO2 per passenger, which is equivalent to roughly 1,210 pounds (lb). This figure is an estimate and includes only the CO2 generated by burning jet fuel.
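The kilograms-to-pounds conversion is a one-liner using the exact definition of the international pound; the per-mile rate is simply the quoted total divided by the quoted distance:

```python
KG_PER_LB = 0.45359237  # kilograms per international pound, exact by definition

co2_kg = 549
co2_lb = co2_kg / KG_PER_LB      # about 1,210 lb
per_mile = co2_kg / 4730         # about 0.116 kg of CO2 per mile flown
print(round(co2_lb), round(per_mile, 3))
```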

Map of flight path from Iquitos to Nanaimo

See the map of the shortest flight path between Coronel FAP Francisco Secada Vignetta International Airport (IQT) and Nanaimo Airport (YCD).

Airport information

Origin Coronel FAP Francisco Secada Vignetta International Airport
City: Iquitos
Country: Perú
IATA Code: IQT
ICAO Code: SPQT
Coordinates: 3°47′5″S, 73°18′31″W
Destination Nanaimo Airport
City: Nanaimo
Country: Canada
IATA Code: YCD
ICAO Code: CYCD
Coordinates: 49°3′8″N, 123°52′12″W
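The coordinates above are given in degrees, minutes, and seconds; the distance formulas need signed decimal degrees. The conversion (southern and western hemispheres become negative) can be sketched as:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# IQT: 3°47′5″S, 73°18′31″W   →  (-3.7847, -73.3086)
# YCD: 49°3′8″N, 123°52′12″W  →  (49.0522, -123.8700)
iqt = (dms_to_decimal(3, 47, 5, "S"), dms_to_decimal(73, 18, 31, "W"))
ycd = (dms_to_decimal(49, 3, 8, "N"), dms_to_decimal(123, 52, 12, "W"))
```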