How far is London from Iquitos?
The distance between Iquitos (Coronel FAP Francisco Secada Vignetta International Airport) and London, Ontario, Canada (London International Airport) is 3259 miles / 5245 kilometers / 2832 nautical miles.
Coronel FAP Francisco Secada Vignetta International Airport – London International Airport
Distance from Iquitos to London
There are several ways to calculate the distance from Iquitos to London. Here are two standard methods:
Vincenty's formula (applied above)
- 3259.081 miles
- 5244.982 kilometers
- 2832.064 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
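As an illustration, the figure above can be reproduced with a plain-Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid (this is an independent implementation, not this site's own code; the coordinates are taken from the airport table below):

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2,
                     a=6378137.0, f=1 / 298.257223563):
    """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty, 1975)."""
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):  # iterate lambda until convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.sqrt((cosU2 * sinLam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break
    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# IQT (3°47′5″S, 73°18′31″W) to YXU (43°2′8″N, 81°9′14″W)
km = vincenty_inverse(-3.784722, -73.308611, 43.035556, -81.153889) / 1000.0
print(f"{km:.3f} km")  # ≈ 5245 km
```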
Haversine formula
- 3271.874 miles
- 5265.571 kilometers
- 2843.181 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
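The haversine figure above can be checked with a few lines of Python; a mean Earth radius of 6371 km is assumed here, which is a common convention:

```python
import math

def haversine(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance in km on a sphere of radius r_km."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r_km * math.asin(math.sqrt(a))

# IQT (3°47′5″S, 73°18′31″W) to YXU (43°2′8″N, 81°9′14″W)
km = haversine(-3.784722, -73.308611, 43.035556, -81.153889)
miles = km / 1.609344
print(f"{km:.1f} km / {miles:.1f} mi")
```

The spherical result comes out about 20 km longer than Vincenty's ellipsoidal one, which is the expected order of discrepancy for a route at these latitudes.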
How long does it take to fly from Iquitos to London?
The estimated flight time from Coronel FAP Francisco Secada Vignetta International Airport to London International Airport is 6 hours and 40 minutes.
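The flight-time figure is consistent with dividing the straight-line distance by an assumed average block speed. The speed used below (about 489 mph) is back-calculated to match the stated 6 h 40 min and is an assumption, not a parameter published by this site:

```python
distance_mi = 3259.081   # Vincenty distance from above
avg_speed_mph = 489      # assumed effective average speed (back-calculated)

total_min = round(distance_mi / avg_speed_mph * 60)
hours, minutes = divmod(total_min, 60)
print(f"{hours} h {minutes} min")  # 6 h 40 min
```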
What is the time difference between Iquitos and London?
Iquitos is on Peru Time (UTC−5) year-round, while London, Ontario observes Eastern Time (UTC−5 standard, UTC−4 daylight), so the two cities share the same clock time in winter and London is one hour ahead while daylight saving time is in effect.
Flight carbon footprint between Coronel FAP Francisco Secada Vignetta International Airport (IQT) and London International Airport (YXU)
On average, flying from Iquitos to London generates about 365 kg of CO2 per passenger, equivalent to roughly 805 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
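The kilogram-to-pound conversion above is a simple arithmetic check, using the exact definition of the pound:

```python
KG_PER_LB = 0.45359237   # exact by international definition

co2_kg = 365
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))  # 805
```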
Map of flight path from Iquitos to London
See the map of the shortest flight path between Coronel FAP Francisco Secada Vignetta International Airport (IQT) and London International Airport (YXU).
Airport information
| Origin | Coronel FAP Francisco Secada Vignetta International Airport |
| --- | --- |
| City: | Iquitos |
| Country: | Peru |
| IATA Code: | IQT |
| ICAO Code: | SPQT |
| Coordinates: | 3°47′5″S, 73°18′31″W |
| Destination | London International Airport |
| --- | --- |
| City: | London |
| Country: | Canada |
| IATA Code: | YXU |
| ICAO Code: | CYXU |
| Coordinates: | 43°2′8″N, 81°9′14″W |