How far is Texada from Lapu-Lapu City?

The distance between Lapu-Lapu City (Mactan–Cebu International Airport) and Texada (Texada/Gillies Bay Airport) is 6609 miles / 10637 kilometers / 5743 nautical miles.

Mactan–Cebu International Airport – Texada/Gillies Bay Airport

6609 miles / 10637 kilometers / 5743 nautical miles
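
The three figures above are the same distance expressed in different units. A quick sketch of the conversions, using the exact definitions 1 mile = 1.609344 km and 1 nautical mile = 1.852 km:

```python
# Convert the quoted distance between its three units.
# Conversion factors are exact by definition: 1 mi = 1.609344 km, 1 NM = 1.852 km.
MILES_TO_KM = 1.609344
NM_TO_KM = 1.852

distance_miles = 6609.224              # Vincenty figure quoted on this page
distance_km = distance_miles * MILES_TO_KM
distance_nm = distance_km / NM_TO_KM

print(f"{distance_miles:.0f} mi = {distance_km:.0f} km = {distance_nm:.0f} NM")
# Expected: roughly 6609 mi = 10637 km = 5743 NM
```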

Distance from Lapu-Lapu City to Texada

There are several ways to calculate the distance from Lapu-Lapu City to Texada. Here are two standard methods:

Vincenty's formula (applied above)
  • 6609.224 miles
  • 10636.515 kilometers
  • 5743.259 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
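For illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It is not the calculator's own code; the airport coordinates are taken from the airport information section below, and the constants are the standard WGS-84 semi-major axis and flattening.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a, f = 6378137.0, 1 / 298.257223563     # WGS-84 semi-major axis (m) and flattening
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                   * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344                  # meters -> statute miles

# CEB and YGB coordinates from the airport information section below.
print(vincenty_miles(10.307222, 123.978889, 49.694167, -124.517778))
# Expected: roughly 6609 miles
```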

Haversine formula
  • 6602.333 miles
  • 10625.425 kilometers
  • 5737.271 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
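A corresponding sketch of the haversine formula, assuming a mean Earth radius of 6371 km (the radius the calculator itself uses is not stated, so the last decimal places may differ from the figures above):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance via the haversine formula; returns statute miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    km = 2 * radius_km * math.asin(math.sqrt(a))
    return km / 1.609344

print(haversine_miles(10.307222, 123.978889, 49.694167, -124.517778))
# Expected: close to the 6602-mile figure quoted above
```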

How long does it take to fly from Lapu-Lapu City to Texada?

The estimated flight time from Mactan–Cebu International Airport to Texada/Gillies Bay Airport is 13 hours and 0 minutes.
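The calculator's exact assumptions are not published. A rough sketch of the usual approach (distance divided by an assumed average cruise speed, plus a fixed allowance for taxi, climb, and descent) looks like this; cruise_mph and overhead_minutes are assumptions, not the site's parameters:

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_minutes=30):
    """Rough block-time estimate: distance / assumed cruise speed + fixed allowance."""
    total_minutes = distance_miles / cruise_mph * 60 + overhead_minutes
    hours, minutes = divmod(round(total_minutes), 60)
    return f"{hours} hours and {minutes} minutes"

print(estimate_flight_time(6609.224))
# Roughly 13-14 hours with these assumed parameters; the page quotes 13 hours and 0 minutes.
```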

Flight carbon footprint between Mactan–Cebu International Airport (CEB) and Texada/Gillies Bay Airport (YGB)

On average, flying from Lapu-Lapu City to Texada generates about 800 kg of CO2 per passenger; 800 kilograms equals 1,764 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
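Dividing the quoted 800 kg by the 6609-mile distance implies an average of roughly 0.12 kg of CO2 per passenger-mile. The sketch below back-calculates that factor from the page's own figures (it is not the calculator's published methodology) and converts kilograms to pounds:

```python
KG_PER_LB = 0.45359237              # exact definition of the pound

co2_kg = 800                        # per-passenger estimate quoted above
distance_miles = 6609.224

co2_lbs = co2_kg / KG_PER_LB
kg_per_mile = co2_kg / distance_miles

print(f"{co2_kg} kg = {co2_lbs:.0f} lbs")                 # ~1764 lbs
print(f"~{kg_per_mile:.3f} kg CO2 per passenger-mile")    # ~0.121, back-calculated
```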

Map of flight path from Lapu-Lapu City to Texada

See the map of the shortest flight path between Mactan–Cebu International Airport (CEB) and Texada/Gillies Bay Airport (YGB).

Airport information

Origin: Mactan–Cebu International Airport
City: Lapu-Lapu City
Country: Philippines
IATA Code: CEB
ICAO Code: RPVM
Coordinates: 10°18′26″N, 123°58′44″E
Destination: Texada/Gillies Bay Airport
City: Texada
Country: Canada
IATA Code: YGB
ICAO Code: CYGB
Coordinates: 49°41′39″N, 124°31′4″W