
How far is Omitama from Guayaquil?

The distance between Guayaquil (José Joaquín de Olmedo International Airport) and Omitama (Ibaraki Airport) is 8961 miles / 14421 kilometers / 7787 nautical miles.

José Joaquín de Olmedo International Airport – Ibaraki Airport

Distance: 8961 miles / 14421 kilometers / 7787 nautical miles
Flight time: 17 h 27 min
CO2 emission: 1 141 kg


Distance from Guayaquil to Omitama

There are several ways to calculate the distance from Guayaquil to Omitama. Here are two standard methods:

Vincenty's formula (applied above)
  • 8960.839 miles
  • 14421.073 kilometers
  • 7786.756 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
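For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It is an independent reconstruction, not the calculator's own code; the coordinates are the GYE and IBR values from the airport information below, converted to decimal degrees.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance in meters between two points on the WGS-84 ellipsoid,
    via Vincenty's inverse formula."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); set to 0 for equatorial geodesics (cos2_alpha == 0)
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m
            * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# GYE (2°9′26″S, 79°53′0″W) and IBR (36°10′51″N, 140°24′53″E) in decimal degrees
gye = (-(2 + 9/60 + 26/3600), -(79 + 53/60))
ibr = (36 + 10/60 + 51/3600, 140 + 24/60 + 53/3600)
print(f"{vincenty_inverse(*gye, *ibr) / 1000:.3f} km")  # ≈ 14421 km, as above
```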

Haversine formula
  • 8954.340 miles
  • 14410.614 kilometers
  • 7781.109 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
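A haversine implementation is much shorter. The sketch below assumes a mean Earth radius of 6371 km, which reproduces the figure above to within rounding; the calculator's exact radius constant is not stated.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance, assuming a spherical earth of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# GYE -> IBR, coordinates from the airport information below
print(haversine_km(-2.157222, -79.883333, 36.180833, 140.414722))  # ≈ 14410.6 km
```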

How long does it take to fly from Guayaquil to Omitama?

The estimated flight time from José Joaquín de Olmedo International Airport to Ibaraki Airport is 17 hours and 27 minutes.
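The page does not state the speed assumption behind this estimate. As a rough check, dividing the quoted distance by the quoted duration implies an average block speed of about 826 km/h (roughly 514 mph), consistent with typical jet cruise speeds:

```python
distance_km = 14421
duration_h = 17 + 27 / 60                    # 17 h 27 min
speed = distance_km / duration_h
print(f"{speed:.0f} km/h ({speed / 1.60934:.0f} mph)")  # ≈ 826 km/h (≈ 514 mph)
```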

Flight carbon footprint between José Joaquín de Olmedo International Airport (GYE) and Ibaraki Airport (IBR)

On average, flying from Guayaquil to Omitama generates about 1 141 kg of CO2 per passenger, which is equivalent to 2 516 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
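Both conversions can be checked directly from the quoted figures; the snippet below also derives the implied emission intensity of roughly 79 g of CO2 per passenger-kilometer:

```python
co2_kg = 1141
distance_km = 14421
print(f"{co2_kg * 2.20462:.0f} lbs")        # ≈ 2515 lbs; the article rounds to 2 516
print(f"{co2_kg / distance_km * 1000:.0f} g CO2 per passenger-km")  # ≈ 79 g/km
```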

Map of flight path from Guayaquil to Omitama

See the map of the shortest flight path between José Joaquín de Olmedo International Airport (GYE) and Ibaraki Airport (IBR).
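A map like this is typically drawn by sampling intermediate points along the great circle. The sketch below uses spherical linear interpolation between the two airports; it illustrates the general technique, not the site's rendering code, and assumes the endpoints are neither coincident nor antipodal.

```python
import math

def gc_waypoints(lat1, lon1, lat2, lon2, n=10):
    """Sample n+1 evenly spaced points along the great circle (slerp on a unit sphere)."""
    def to_xyz(lat, lon):
        lat, lon = math.radians(lat), math.radians(lon)
        return (math.cos(lat) * math.cos(lon),
                math.cos(lat) * math.sin(lon),
                math.sin(lat))

    p, q = to_xyz(lat1, lon1), to_xyz(lat2, lon2)
    # Central angle between the two points (clamped for numerical safety)
    omega = math.acos(max(-1.0, min(1.0, sum(a * b for a, b in zip(p, q)))))
    pts = []
    for i in range(n + 1):
        t = i / n
        s1 = math.sin((1 - t) * omega) / math.sin(omega)
        s2 = math.sin(t * omega) / math.sin(omega)
        x, y, z = (s1 * a + s2 * b for a, b in zip(p, q))
        pts.append((math.degrees(math.asin(z)), math.degrees(math.atan2(y, x))))
    return pts

for lat, lon in gc_waypoints(-2.157222, -79.883333, 36.180833, 140.414722, n=8):
    print(f"{lat:8.3f}, {lon:9.3f}")
```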

Airport information

Origin: José Joaquín de Olmedo International Airport
City: Guayaquil
Country: Ecuador
IATA Code: GYE
ICAO Code: SEGU
Coordinates: 2°9′26″S, 79°53′0″W
Destination: Ibaraki Airport
City: Omitama
Country: Japan
IATA Code: IBR
ICAO Code: RJAH
Coordinates: 36°10′51″N, 140°24′53″E
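The code examples earlier in this article use these degree-minute-second values converted to decimal degrees. A small helper for that conversion, assuming the exact format shown above:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 2°9′26″S or 140°24′53″E to decimal degrees."""
    d, m, s, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(d) + int(m) / 60 + int(s) / 3600
    return -value if hemi in "SW" else value   # south/west are negative

print(dms_to_decimal("2°9′26″S"))     # ≈ -2.157222
print(dms_to_decimal("140°24′53″E"))  # ≈ 140.414722
```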