How far is San Fernando from Salt Lake City, UT?
The distance between Salt Lake City (Salt Lake City International Airport) and San Fernando (San Fernando Airport) is 7259 miles / 11683 kilometers / 6308 nautical miles.
Salt Lake City International Airport – San Fernando Airport
Distance from Salt Lake City to San Fernando
There are several ways to calculate the distance from Salt Lake City to San Fernando. Here are two standard methods:
Vincenty's formula (applied above)
- 7259.315 miles
- 11682.735 kilometers
- 6308.172 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
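To make the ellipsoidal calculation concrete, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The WGS-84 constants, the convergence tolerance, and the decimal coordinates (converted from the DMS values in the airport tables below) are assumptions about the calculator's setup, so the last digits may differ slightly from the 7259.315 miles quoted above.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, iterations=200, tol=1e-12):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(iterations):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); defined as 0 for points on the equator
        cos2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha) if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2)
        - B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344   # meters -> statute miles

# SLC (40°47′18″N, 111°58′40″W) to SFE (16°35′44″N, 120°18′10″E) in decimal degrees
print(vincenty_miles(40.788333, -111.977778, 16.595556, 120.302778))  # ≈ 7259 mi
```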
Haversine formula
- 7248.686 miles
- 11665.630 kilometers
- 6298.936 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
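For comparison, a minimal haversine sketch in Python, assuming a mean Earth radius of 6371 km (the radius the calculator uses is not stated, so the final digits may differ slightly from the figures above):

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius (assumption; the site's value isn't stated)

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return EARTH_RADIUS_KM * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# SLC (40°47′18″N, 111°58′40″W) to SFE (16°35′44″N, 120°18′10″E)
km = haversine_km(40.788333, -111.977778, 16.595556, 120.302778)
print(f"{km:.1f} km, {km / 1.609344:.1f} mi")  # roughly 11,666 km / 7,249 mi
```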
How long does it take to fly from Salt Lake City to San Fernando?
The estimated flight time from Salt Lake City International Airport to San Fernando Airport is 14 hours and 14 minutes.
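The calculator's timing model is not published. As an illustration only, the quoted figure is consistent with dividing the Vincenty distance by an average block speed of about 510 mph; that speed is an assumption chosen to match, not the site's documented method.

```python
# Rough flight-time estimate from distance and an assumed average block speed.
# The 510 mph value is an illustrative assumption, not the calculator's method.
distance_miles = 7259.315
block_speed_mph = 510.0

hours = distance_miles / block_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} h {m} min")  # ≈ 14 h 14 min
```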
What is the time difference between Salt Lake City and San Fernando?
San Fernando observes Philippine Standard Time (UTC+8), while Salt Lake City observes Mountain Time (UTC-7 standard, UTC-6 daylight), so San Fernando is 15 hours ahead of Salt Lake City in winter and 14 hours ahead during daylight saving time.
Flight carbon footprint between Salt Lake City International Airport (SLC) and San Fernando Airport (SFE)
On average, flying from Salt Lake City to San Fernando generates about 892 kg of CO2 per passenger, which is roughly 1,966 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
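As a quick check on the unit conversion, a short Python sketch; the 892 kg figure is the estimate quoted above, and 2.20462 is the standard kilograms-to-pounds factor.

```python
co2_kg_per_passenger = 892   # per-passenger estimate quoted above for SLC -> SFE
KG_TO_LB = 2.20462           # standard kilograms-to-pounds conversion factor

co2_lb = co2_kg_per_passenger * KG_TO_LB
print(f"{co2_lb:.1f} lbs")   # 1966.5 lbs, consistent with the ~1,966 lbs above
```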
Map of flight path from Salt Lake City to San Fernando
See the map of the shortest flight path between Salt Lake City International Airport (SLC) and San Fernando Airport (SFE).
Airport information
| Origin | Salt Lake City International Airport |
| --- | --- |
| City: | Salt Lake City, UT |
| Country: | United States |
| IATA Code: | SLC |
| ICAO Code: | KSLC |
| Coordinates: | 40°47′18″N, 111°58′40″W |
| Destination | San Fernando Airport |
| --- | --- |
| City: | San Fernando |
| Country: | Philippines |
| IATA Code: | SFE |
| ICAO Code: | RPUS |
| Coordinates: | 16°35′44″N, 120°18′10″E |
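The coordinates above are listed in degrees, minutes, and seconds, while the distance formulas earlier on the page work in decimal degrees. A small conversion sketch (the helper name is illustrative only):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# SLC: 40°47′18″N, 111°58′40″W  ->  (40.7883, -111.9778)
# SFE: 16°35′44″N, 120°18′10″E  ->  (16.5956, 120.3028)
slc = (dms_to_decimal(40, 47, 18, "N"), dms_to_decimal(111, 58, 40, "W"))
sfe = (dms_to_decimal(16, 35, 44, "N"), dms_to_decimal(120, 18, 10, "E"))
print(slc, sfe)
```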