How far is Comox from Punta Gorda, FL?

The distance between Punta Gorda (Punta Gorda Airport (Florida)) and Comox (CFB Comox) is 2756 miles / 4436 kilometers / 2395 nautical miles.

The driving distance from Punta Gorda (PGD) to Comox (YQQ) is 3472 miles / 5588 kilometers, and travel time by car is about 63 hours 34 minutes.

Punta Gorda Airport (Florida) – CFB Comox

2756 miles / 4436 kilometers / 2395 nautical miles

Distance from Punta Gorda to Comox

There are several ways to calculate the distance from Punta Gorda to Comox. Here are two standard methods:

Vincenty's formula (applied above)
  • 2756.153 miles
  • 4435.599 kilometers
  • 2395.032 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
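As a rough check on that figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name, iteration cap and convergence tolerance are illustrative choices rather than anything published by this site; run with the PGD and YQQ coordinates listed under "Airport information" (converted to decimal degrees), it lands near the 2756-mile figure.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2):
        """Ellipsoidal distance (statute miles) via Vincenty's inverse formula on WGS-84."""
        a = 6378137.0              # semi-major axis in meters
        f = 1 / 298.257223563      # flattening
        b = (1 - f) * a            # semi-minor axis

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):                      # iterate until lambda converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0                        # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos_sq_alpha = 1 - sin_alpha ** 2
            cos2sm = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha if cos_sq_alpha else 0.0
            C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u_sq = cos_sq_alpha * (a * a - b * b) / (b * b)
        A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
        B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
        delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos2sm ** 2) -
            B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
        meters = b * A * (sigma - delta_sigma)
        return meters / 1609.344                  # meters -> statute miles

    # PGD and YQQ coordinates from the airport information below, in decimal degrees
    print(vincenty_miles(26.9200, -81.9903, 49.7106, -124.8869))   # ~2756 miles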

Haversine formula
  • 2753.266 miles
  • 4430.953 kilometers
  • 2392.523 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
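The great-circle figure is even easier to reproduce. Below is a short Python sketch of the haversine formula, assuming the commonly used mean Earth radius of 3958.8 statute miles; the site's exact radius isn't stated, so the last decimals may differ slightly.

    import math

    def haversine_miles(lat1, lon1, lat2, lon2):
        """Great-circle distance (statute miles) on a spherical Earth."""
        R = 3958.8                                # assumed mean Earth radius in miles
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * R * math.asin(math.sqrt(h))

    # PGD (26°55′12″N, 81°59′25″W) and YQQ (49°42′38″N, 124°53′13″W) in decimal degrees
    print(haversine_miles(26.9200, -81.9903, 49.7106, -124.8869))  # ~2753 miles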

How long does it take to fly from Punta Gorda to Comox?

The estimated flight time from Punta Gorda Airport (Florida) to CFB Comox is 5 hours and 43 minutes.
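The assumptions behind that estimate aren't published on this page, so the sketch below simply back-solves an average block speed from the figures above (2756 miles in 5 hours 43 minutes works out to roughly 482 mph) and shows how such an estimate is computed; the 482 mph value is inferred here, not quoted by the site.

    def flight_time(distance_miles, block_speed_mph=482.0):
        """Rough gate-to-gate time; block_speed_mph is an assumed average over the whole flight."""
        hours = distance_miles / block_speed_mph
        return int(hours), round((hours - int(hours)) * 60)

    h, m = flight_time(2756)
    print(f"{h} h {m:02d} min")   # ~5 h 43 min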

Flight carbon footprint between Punta Gorda Airport (Florida) (PGD) and CFB Comox (YQQ)

On average, flying from Punta Gorda to Comox generates about 305 kg of CO2 per passenger, which is equivalent to about 673 pounds (lbs). This figure is an estimate and includes only the CO2 generated by burning jet fuel.
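The unit conversion, and a per-mile breakdown of the same number, can be checked in a couple of lines of Python; the per-passenger-mile figure is derived here purely for illustration and isn't quoted by the site.

    KG_TO_LB = 2.20462

    co2_kg = 305                           # per-passenger estimate quoted above
    print(round(co2_kg * KG_TO_LB, 1))     # ~672.4 lb; the page rounds to 673, which suggests
                                           # the unrounded kg estimate is slightly above 305
    print(round(co2_kg / 2756, 3))         # ~0.111 kg of CO2 per passenger-mile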

Map of flight path and driving directions from Punta Gorda to Comox

See the map of the shortest flight path between Punta Gorda Airport (Florida) (PGD) and CFB Comox (YQQ).

Airport information

Origin: Punta Gorda Airport (Florida)
City: Punta Gorda, FL
Country: United States
IATA Code: PGD
ICAO Code: KPGD
Coordinates: 26°55′12″N, 81°59′25″W
Destination: CFB Comox
City: Comox
Country: Canada
IATA Code: YQQ
ICAO Code: CYQQ
Coordinates: 49°42′38″N, 124°53′13″W
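The coordinates above are given in degrees, minutes and seconds, while both distance formulas expect decimal degrees with west longitudes negative. A small helper (written here for illustration, not taken from the site) handles the conversion:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
        sign = -1.0 if hemisphere in ("S", "W") else 1.0
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # PGD: 26°55′12″N, 81°59′25″W  ->  approximately (26.9200, -81.9903)
    print(dms_to_decimal(26, 55, 12, "N"), dms_to_decimal(81, 59, 25, "W"))
    # YQQ: 49°42′38″N, 124°53′13″W ->  approximately (49.7106, -124.8869)
    print(dms_to_decimal(49, 42, 38, "N"), dms_to_decimal(124, 53, 13, "W"))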