How far is Gillam from Barcelona?
The distance between Barcelona (General José Antonio Anzoátegui International Airport) and Gillam (Gillam Airport) is 3574 miles / 5752 kilometers / 3106 nautical miles.
General José Antonio Anzoátegui International Airport – Gillam Airport
Distance from Barcelona to Gillam
There are several ways to calculate the distance from Barcelona to Gillam. Here are two standard methods:
Vincenty's formula (applied above)
- 3574.186 miles
- 5752.095 kilometers
- 3105.883 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
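The site does not publish its implementation, but the standard Vincenty inverse method on the WGS-84 ellipsoid reproduces the figure above closely. A minimal sketch (coordinates converted from the DMS values in the airport table below; the iteration count and convergence threshold are conventional choices, not taken from the page):

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty inverse solution on the WGS-84 ellipsoid; returns miles."""
    a = 6378137.0                 # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):          # iterate lambda until convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam, cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break
    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma) / 1609.344

# BLA: 10°6′25″N, 64°41′21″W   YGX: 56°21′26″N, 94°42′38″W
miles = vincenty_miles(10.106944, -64.689167, 56.357222, -94.710556)
```

With these inputs the result lands within a couple of miles of the 3574.186 figure quoted above (small differences come from rounding the coordinates to whole arc-seconds).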
Haversine formula
- 3579.410 miles
- 5760.502 kilometers
- 3110.422 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points along the surface of a sphere).
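The haversine calculation is compact enough to sketch in full. The snippet below uses a mean Earth radius of 6371 km (the page does not state which radius it uses) and the coordinates from the airport table below, converted from DMS to decimal degrees:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# BLA: 10°6′25″N, 64°41′21″W   YGX: 56°21′26″N, 94°42′38″W
km = haversine_km(10.106944, -64.689167, 56.357222, -94.710556)
```

This comes out within a few kilometers of the 5760.502 km quoted above; the residual depends on the exact Earth radius chosen and on the arc-second rounding of the coordinates.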
How long does it take to fly from Barcelona to Gillam?
The estimated flight time from General José Antonio Anzoátegui International Airport to Gillam Airport is 7 hours and 16 minutes.
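The page does not state how it estimates flight time, but dividing the quoted distance by the quoted time implies an assumed average speed of roughly 492 mph. A small helper (the speed is our inference, not a documented parameter of the site):

```python
def flight_time(distance_miles, avg_speed_mph):
    """Format distance / speed as 'H h M min'."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    if m == 60:          # handle rounding up to the next hour
        h, m = h + 1, 0
    return f"{h} h {m} min"

# e.g. at a round-number 500 mph average, 3574.186 mi gives "7 h 9 min"
estimate = flight_time(3574.186, 500)
```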
What is the time difference between Barcelona and Gillam?
The time difference between Barcelona and Gillam is 1 hour. Gillam is 1 hour behind Barcelona.
Flight carbon footprint between General José Antonio Anzoátegui International Airport (BLA) and Gillam Airport (YGX)
On average, flying from Barcelona to Gillam generates about 404 kg of CO2 per passenger; 404 kilograms is equal to 890 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
Map of flight path from Barcelona to Gillam
See the map of the shortest flight path between General José Antonio Anzoátegui International Airport (BLA) and Gillam Airport (YGX).
Airport information
Origin | General José Antonio Anzoátegui International Airport
---|---
City: | Barcelona
Country: | Venezuela
IATA Code: | BLA
ICAO Code: | SVBC
Coordinates: | 10°6′25″N, 64°41′21″W
Destination | Gillam Airport
---|---
City: | Gillam
Country: | Canada
IATA Code: | YGX
ICAO Code: | CYGX
Coordinates: | 56°21′26″N, 94°42′38″W