How far is Granada from Tampa, FL?
The distance between Tampa (Tampa International Airport) and Granada (Federico García Lorca Granada Airport) is 4508 miles / 7255 kilometers / 3918 nautical miles.
Tampa International Airport – Federico García Lorca Granada Airport
Distance from Tampa to Granada
There are several ways to calculate the distance from Tampa to Granada. Here are two standard methods:
Vincenty's formula (applied above)
- 4508.227 miles
- 7255.288 kilometers
- 3917.542 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
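As a sketch of how such an ellipsoidal calculation works, here is a minimal pure-Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, applied to the TPA and GRX coordinates from the airport tables below. The iteration count and convergence tolerance are implementation choices, and the page's own calculator may differ in minor details:

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in km between two points (Vincenty inverse, WGS-84)."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate until the auxiliary longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000  # metres -> km

# TPA (27°58′31″N, 82°31′59″W) to GRX (37°11′19″N, 3°46′38″W)
tpa = (27 + 58/60 + 31/3600, -(82 + 31/60 + 59/3600))
grx = (37 + 11/60 + 19/3600, -(3 + 46/60 + 38/3600))
print(f"{vincenty_inverse(*tpa, *grx):.3f} km")
```

The result should land close to the 7255.288 km figure quoted above.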
Haversine formula
- 4499.328 miles
- 7240.966 kilometers
- 3909.809 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
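The haversine calculation is compact enough to sketch in a few lines. This version uses a mean Earth radius of 6371 km (an assumption; the page's calculator may use a slightly different radius, so the result will agree only approximately with the figures above):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km, assuming a spherical Earth (R = 6371 km)."""
    R = 6371.0  # mean Earth radius in km (assumed)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# TPA (27°58′31″N, 82°31′59″W) to GRX (37°11′19″N, 3°46′38″W)
tpa = (27 + 58/60 + 31/3600, -(82 + 31/60 + 59/3600))
grx = (37 + 11/60 + 19/3600, -(3 + 46/60 + 38/3600))
km = haversine_km(*tpa, *grx)
print(f"{km:.3f} km / {km / 1.609344:.3f} mi / {km / 1.852:.3f} nm")
```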
How long does it take to fly from Tampa to Granada?
The estimated flight time from Tampa International Airport to Federico García Lorca Granada Airport is 9 hours and 2 minutes.
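The page does not state how it derives this estimate. A common rule of thumb is to divide the distance by an assumed average speed of about 500 mph, which lands within a minute or two of the quoted figure:

```python
# Rough flight-time estimate: distance divided by an assumed average speed.
# Both the 500 mph figure and the omission of taxi/climb time are assumptions;
# the page's exact method is not stated.
distance_miles = 4508.227      # Vincenty distance from above
avg_speed_mph = 500            # assumed average speed

hours_float = distance_miles / avg_speed_mph
hours = int(hours_float)
minutes = round((hours_float - hours) * 60)
print(f"about {hours} h {minutes} min")  # close to the quoted 9 h 2 min
```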
What is the time difference between Tampa and Granada?
The time difference between Tampa and Granada is 6 hours: Granada is 6 hours ahead of Tampa.
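This can be checked with Python's `zoneinfo` module, using the IANA zones America/New_York (Tampa) and Europe/Madrid (Granada). The sample date is arbitrary; note that for a few weeks each year the gap briefly shrinks to 5 hours, because the US and EU switch daylight saving time on different dates:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# Compare UTC offsets on a sample summer date (EDT = UTC-4, CEST = UTC+2).
when = datetime(2024, 7, 1, 12, 0)
tampa = when.replace(tzinfo=ZoneInfo("America/New_York"))
granada = when.replace(tzinfo=ZoneInfo("Europe/Madrid"))

diff_hours = (granada.utcoffset() - tampa.utcoffset()).total_seconds() / 3600
print(f"Granada is {diff_hours:.0f} hours ahead of Tampa")
```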
Flight carbon footprint between Tampa International Airport (TPA) and Federico García Lorca Granada Airport (GRX)
On average, flying from Tampa to Granada generates about 520 kg of CO2 per passenger; 520 kilograms equals 1,147 pounds (lb). These figures are estimates and include only the CO2 generated by burning jet fuel.
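The unit conversion above is a single multiplication. The snippet below yields 1,146 lb rather than the quoted 1,147 lb, presumably because the page converts an unrounded kilogram figure:

```python
# Convert the per-passenger CO2 estimate from kilograms to pounds.
KG_TO_LB = 2.20462  # pounds per kilogram
co2_kg = 520
co2_lb = co2_kg * KG_TO_LB
print(f"{co2_kg} kg = {co2_lb:.0f} lb")
```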
Airport information
| Origin | Tampa International Airport |
| --- | --- |
| City: | Tampa, FL |
| Country: | United States |
| IATA Code: | TPA |
| ICAO Code: | KTPA |
| Coordinates: | 27°58′31″N, 82°31′59″W |
| Destination | Federico García Lorca Granada Airport |
| --- | --- |
| City: | Granada |
| Country: | Spain |
| IATA Code: | GRX |
| ICAO Code: | LEGR |
| Coordinates: | 37°11′19″N, 3°46′38″W |