How far is Uranium City from Barcelona?
The distance between Barcelona, Venezuela (General José Antonio Anzoátegui International Airport) and Uranium City, Canada (Uranium City Airport) is 4090 miles / 6583 kilometers / 3554 nautical miles.
General José Antonio Anzoátegui International Airport – Uranium City Airport
Distance from Barcelona to Uranium City
There are several ways to calculate the distance from Barcelona to Uranium City. Here are two standard methods:
Vincenty's formula (applied above)
- 4090.265 miles
- 6582.644 kilometers
- 3554.343 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
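As an illustrative sketch (not the exact calculator used for the figures above), the iterative Vincenty inverse method on the WGS-84 ellipsoid can be written in plain Python; the coordinates are the airports' positions converted to decimal degrees:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Iterative Vincenty inverse formula on the WGS-84 ellipsoid.
    Returns the geodesic distance in kilometers."""
    a = 6378137.0                 # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = a * (1 - f)               # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000  # metres -> kilometres

# BLA (10°6′25″N, 64°41′21″W) to YBE (59°33′41″N, 108°28′51″W)
km = vincenty_distance(10.10694, -64.68917, 59.56139, -108.48083)
```

The result agrees with the 6582.644 km figure above to well under a kilometre.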
Haversine formula
- 4093.611 miles
- 6588.028 kilometers
- 3557.251 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
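The haversine version is much shorter. A minimal sketch using a mean Earth radius of 6371 km (the exact radius chosen shifts the result by a few kilometres):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)
    a = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
    return 2 * radius_km * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# BLA to YBE, decimal degrees converted from the airport tables below
km = haversine_distance(10.10694, -64.68917, 59.56139, -108.48083)
miles = km / 1.609344
```

This reproduces the ~6588 km spherical figure above; the small gap to the Vincenty result reflects the Earth's flattening.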
How long does it take to fly from Barcelona to Uranium City?
The estimated flight time from General José Antonio Anzoátegui International Airport to Uranium City Airport is 8 hours and 14 minutes.
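That estimate is consistent with a simple rule of thumb: divide the great-circle distance by a typical average ground speed of about 800 km/h (an assumption for illustration; real block times also include taxi, climb, descent, and winds):

```python
distance_km = 6582.644            # Vincenty distance from above
cruise_kmh = 800.0                # assumed average cruise ground speed

hours_float = distance_km / cruise_kmh
hours = int(hours_float)
minutes = round((hours_float - hours) * 60)
print(f"{hours} h {minutes} min")  # → 8 h 14 min
```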
What is the time difference between Barcelona and Uranium City?
The time difference between Barcelona and Uranium City is 2 hours: Barcelona (UTC−4) is 2 hours ahead of Uranium City (UTC−6). Neither Venezuela nor Saskatchewan observes daylight saving time, so the offset holds year-round.
Flight carbon footprint between General José Antonio Anzoátegui International Airport (BLA) and Uranium City Airport (YBE)
On average, flying from Barcelona to Uranium City generates about 468 kg of CO2 per passenger; 468 kilograms equals 1,031 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
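The page's own figures imply an emission factor of roughly 71 g of CO2 per passenger-kilometre. A quick check in Python (the factor is derived from the numbers above, not an independent estimate):

```python
co2_kg = 468.0        # per-passenger CO2 estimate from above
distance_km = 6583.0  # Vincenty distance, rounded

factor_g_per_km = co2_kg / distance_km * 1000   # grams of CO2 per pax-km
co2_lbs = co2_kg * 2.20462                      # kilograms -> pounds
```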
Map of flight path from Barcelona to Uranium City
See the map of the shortest flight path between General José Antonio Anzoátegui International Airport (BLA) and Uranium City Airport (YBE).
Airport information
| Origin | General José Antonio Anzoátegui International Airport |
|---|---|
| City: | Barcelona |
| Country: | Venezuela |
| IATA Code: | BLA |
| ICAO Code: | SVBC |
| Coordinates: | 10°6′25″N, 64°41′21″W |
| Destination | Uranium City Airport |
|---|---|
| City: | Uranium City |
| Country: | Canada |
| IATA Code: | YBE |
| ICAO Code: | CYBE |
| Coordinates: | 59°33′41″N, 108°28′51″W |
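The coordinates above are given in degrees-minutes-seconds. A small helper converts them to the signed decimal degrees used by the distance formulas (the parser assumes the °/′/″ notation shown in the tables):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert e.g. '10°6′25″N' to signed decimal degrees."""
    deg, minute, sec, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    return -value if hemi in "SW" else value  # south/west are negative

bla_lat = dms_to_decimal("10°6′25″N")    # ≈ 10.1069
ybe_lon = dms_to_decimal("108°28′51″W")  # ≈ -108.4808
```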