How far is Brochet from Iquitos?

The distance between Iquitos (Coronel FAP Francisco Secada Vignetta International Airport) and Brochet (Brochet Airport) is 4532 miles / 7293 kilometers / 3938 nautical miles.

Coronel FAP Francisco Secada Vignetta International Airport – Brochet Airport

Distance from Iquitos to Brochet

There are several ways to calculate the distance from Iquitos to Brochet. Here are two standard methods:

Vincenty's formula (applied above)
  • 4531.733 miles
  • 7293.118 kilometers
  • 3937.969 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
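As a sketch (not necessarily the exact implementation used above), the inverse Vincenty iteration on the WGS-84 ellipsoid can be written in a few dozen lines; the airport coordinates below are taken from the table at the end of this page, converted to decimal degrees:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty formula: distance on the WGS-84 ellipsoid, in km.

    Note: the iteration can fail to converge for nearly antipodal points;
    that is not an issue for this airport pair.
    """
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a                       # semi-minor axis (m)
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# IQT: 3°47′5″S, 73°18′31″W   YBT: 57°53′21″N, 101°40′44″W
km = vincenty_km(-3.784722, -73.308611, 57.889167, -101.678889)
print(f"{km:.3f} km")  # close to the 7293.118 km figure above
```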

Haversine formula
  • 4542.615 miles
  • 7310.630 kilometers
  • 3947.424 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
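A minimal haversine implementation, again using the airport coordinates from the table below (decimal degrees) and a mean Earth radius of 6371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

IQT = (-3.784722, -73.308611)   # 3°47′5″S, 73°18′31″W
YBT = (57.889167, -101.678889)  # 57°53′21″N, 101°40′44″W

km = haversine_km(*IQT, *YBT)
print(f"{km:.1f} km")  # close to the 7310.630 km figure above
```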

How long does it take to fly from Iquitos to Brochet?

The estimated flight time from Coronel FAP Francisco Secada Vignetta International Airport to Brochet Airport is 9 hours and 4 minutes.
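The numbers above are consistent with a simple estimate of distance divided by an average block speed of about 500 mph; that speed is an assumption inferred from the figures, not an official parameter:

```python
distance_miles = 4532   # great-circle distance from above
avg_speed_mph = 500     # assumed average block speed (inferred, not official)

hours_float = distance_miles / avg_speed_mph  # 9.064 hours
hours = int(hours_float)
minutes = round((hours_float - hours) * 60)
print(f"{hours} hours and {minutes} minutes")  # → 9 hours and 4 minutes
```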

Flight carbon footprint between Coronel FAP Francisco Secada Vignetta International Airport (IQT) and Brochet Airport (YBT)

On average, flying from Iquitos to Brochet generates about 523 kg (1,154 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
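The pound figure is just a unit conversion of the kilogram estimate. Converting the already-rounded 523 kg gives 1,153 lb; the 1,154 lb quoted above presumably comes from converting the unrounded kilogram value:

```python
co2_kg = 523            # rounded per-passenger estimate from the text
KG_TO_LB = 2.20462      # pounds per kilogram

co2_lb = co2_kg * KG_TO_LB
print(round(co2_lb))    # 1153 from the rounded kg value; the page shows 1,154
```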

Map of flight path from Iquitos to Brochet

See the map of the shortest flight path between Coronel FAP Francisco Secada Vignetta International Airport (IQT) and Brochet Airport (YBT).

Airport information

Origin Coronel FAP Francisco Secada Vignetta International Airport
City: Iquitos
Country: Perú
IATA Code: IQT
ICAO Code: SPQT
Coordinates: 3°47′5″S, 73°18′31″W
Destination Brochet Airport
City: Brochet
Country: Canada
IATA Code: YBT
ICAO Code: CYBT
Coordinates: 57°53′21″N, 101°40′44″W