How far is Belfast from Tucson, AZ?
The distance between Tucson (Tucson International Airport) and Belfast (George Best Belfast City Airport) is 4999 miles / 8045 kilometers / 4344 nautical miles.
Tucson International Airport – George Best Belfast City Airport
Distance from Tucson to Belfast
There are several ways to calculate the distance from Tucson to Belfast. Here are two standard methods:
Vincenty's formula (applied above)
- 4998.837 miles
- 8044.848 kilometers
- 4343.870 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
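As a sketch, the iterative inverse Vincenty solution can be implemented directly on the WGS-84 ellipsoid; the coordinates below are the decimal-degree equivalents of the airport table further down, and the convergence tolerance and iteration cap are arbitrary choices:

```python
import math

# WGS-84 ellipsoid parameters
A = 6378137.0                  # semi-major axis (m)
F = 1 / 298.257223563          # flattening
B = (1 - F) * A                # semi-minor axis (m)

def vincenty_m(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty: ellipsoidal distance in meters between two points (degrees)."""
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    big_a = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    big_b = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = big_b * sin_sigma * (
        cos_2sigma_m + big_b / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - big_b / 6 * cos_2sigma_m
            * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return B * big_a * (sigma - delta_sigma)

# TUS and BHD coordinates, converted from the airport table below
meters = vincenty_m(32.115833, -110.940833, 54.618056, -5.872222)
print(f"{meters / 1609.344:.3f} mi")  # ≈ 4998.8 mi, matching the figure above
```

The iteration refines the longitude difference on the auxiliary sphere until it converges, which is why Vincenty can resolve the ellipsoidal distance to millimeter precision.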
Haversine formula
- 4987.471 miles
- 8026.557 kilometers
- 4333.994 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
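The haversine calculation is short enough to sketch in full; the 6,371 km mean Earth radius is the conventional spherical-model choice, and the coordinates are again the decimal-degree equivalents of the airport table below:

```python
import math

EARTH_RADIUS_KM = 6371.0  # conventional mean Earth radius

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two lat/lon points given in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# TUS → BHD, coordinates from the airport table below
km = haversine_km(32.115833, -110.940833, 54.618056, -5.872222)
print(f"{km:.1f} km / {km / 1.609344:.1f} mi")  # matches the haversine figures above
```

The ~11 mile gap between this result and Vincenty's comes entirely from treating the Earth as a sphere rather than an ellipsoid.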
How long does it take to fly from Tucson to Belfast?
The estimated flight time from Tucson International Airport to George Best Belfast City Airport is 9 hours and 57 minutes.
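That estimate is consistent with a simple distance-over-speed model; the ~500 mph average ground speed below is an assumed round figure for a long-haul airliner, not data from this page:

```python
DISTANCE_MI = 4999       # great-circle distance from above
AVG_SPEED_MPH = 500      # assumed average ground speed for a long-haul airliner

# Work in whole minutes to avoid "9 h 60 min" rounding artifacts
total_min = round(DISTANCE_MI / AVG_SPEED_MPH * 60)
h, m = divmod(total_min, 60)
print(f"{h} h {m} min")  # 10 h 0 min — close to the 9 h 57 min quoted above
```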
What is the time difference between Tucson and Belfast?
The time difference between Tucson and Belfast is 7 hours, with Belfast ahead of Tucson. Arizona does not observe daylight saving time, so the gap grows to 8 hours while the UK is on British Summer Time.
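Because Arizona keeps a fixed UTC−7 offset year-round while the UK shifts its clocks, the offset between the two cities depends on the date. A sketch using Python's standard-library zoneinfo, assuming the IANA zones America/Phoenix and Europe/London:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # stdlib since Python 3.9; needs IANA tz data

def belfast_lead_hours(when_utc: datetime) -> float:
    """Hours by which Belfast is ahead of Tucson at a given UTC instant."""
    tucson = when_utc.astimezone(ZoneInfo("America/Phoenix"))  # Arizona: no DST
    belfast = when_utc.astimezone(ZoneInfo("Europe/London"))   # covers Northern Ireland
    return (belfast.utcoffset() - tucson.utcoffset()).total_seconds() / 3600

winter = datetime(2024, 1, 15, 12, tzinfo=ZoneInfo("UTC"))
summer = datetime(2024, 7, 15, 12, tzinfo=ZoneInfo("UTC"))
print(belfast_lead_hours(winter), belfast_lead_hours(summer))  # 7.0 8.0
```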
Flight carbon footprint between Tucson International Airport (TUS) and George Best Belfast City Airport (BHD)
On average, flying from Tucson to Belfast generates about 584 kg (1,287 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
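The quoted figures imply a per-distance emissions factor, which a quick check recovers; the constants below are taken from the numbers above, and the resulting factor is an implied estimate, not a published coefficient:

```python
CO2_KG = 584              # per-passenger estimate quoted above
DISTANCE_KM = 8045        # great-circle distance from above
LB_PER_KG = 2.20462262    # pounds per kilogram

factor = CO2_KG / DISTANCE_KM
print(f"{factor:.4f} kg CO2 per passenger-km")  # ≈ 0.0726
print(f"{CO2_KG * LB_PER_KG:.0f} lb")
```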
Map of flight path from Tucson to Belfast
See the map of the shortest flight path between Tucson International Airport (TUS) and George Best Belfast City Airport (BHD).
Airport information
| Origin | Tucson International Airport |
|---|---|
| City | Tucson, AZ |
| Country | United States |
| IATA Code | TUS |
| ICAO Code | KTUS |
| Coordinates | 32°6′57″N, 110°56′27″W |
| Destination | George Best Belfast City Airport |
|---|---|
| City | Belfast |
| Country | United Kingdom |
| IATA Code | BHD |
| ICAO Code | EGAC |
| Coordinates | 54°37′5″N, 5°52′20″W |