How far is Corumbá from Salt Lake City, UT?
The distance between Salt Lake City (Salt Lake City International Airport) and Corumbá (Corumbá International Airport) is 5391 miles / 8676 kilometers / 4685 nautical miles.
Salt Lake City International Airport – Corumbá International Airport
Distance from Salt Lake City to Corumbá
There are several ways to calculate the distance from Salt Lake City to Corumbá. Here are two standard methods:
Vincenty's formula (applied above)
- 5391.054 miles
- 8676.061 kilometers
- 4684.698 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
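The iterative inverse Vincenty algorithm can be sketched in Python as follows. This is a standard textbook implementation on the WGS-84 ellipsoid, not the page's own code; the coordinates are taken from the airport tables below, converted to decimal degrees.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Distance in metres between two points on the WGS-84 ellipsoid
    (standard inverse Vincenty iteration)."""
    a = 6378137.0                # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)
    L = math.radians(lon2 - lon1)
    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate lambda until convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0          # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (
        cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
            * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# SLC (40°47′18″N, 111°58′40″W) to CMG (19°0′42″S, 57°40′17″W)
metres = vincenty_inverse(40.78833, -111.97778, -19.01167, -57.67139)
```

Dividing `metres` by 1000 should land very close to the 8676 km figure quoted above.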
Haversine formula
- 5401.733 miles
- 8693.246 kilometers
- 4693.977 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
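The haversine calculation is short enough to show in full. This sketch assumes the commonly used mean Earth radius of 6371 km; the coordinates are the decimal-degree equivalents of the DMS values in the airport tables below.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return radius_km * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# SLC and CMG in decimal degrees
d_km = haversine_km(40.78833, -111.97778, -19.01167, -57.67139)
d_miles = d_km * 0.621371      # statute miles
d_nm = d_km / 1.852            # nautical miles
```

With these inputs `d_km` comes out near the 8693 km figure quoted above; the mile and nautical-mile values follow from the unit conversions.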
How long does it take to fly from Salt Lake City to Corumbá?
The estimated flight time from Salt Lake City International Airport to Corumbá International Airport is 10 hours and 42 minutes.
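Flight-time estimates of this kind are typically derived from distance divided by an assumed cruise speed, plus a fixed allowance for taxi, climb, and descent. The site's exact parameters are not stated, so the speed and overhead below are illustrative assumptions, not the page's method.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight time: distance / assumed cruise speed plus a fixed
    taxi/climb/descent allowance. Returns (hours, minutes)."""
    hours = distance_miles / cruise_mph + overhead_min / 60
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m
```

For the 5391-mile trip these assumptions give roughly 11 hours, in the same ballpark as (though not identical to) the 10 h 42 min quoted above.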
What is the time difference between Salt Lake City and Corumbá?
Corumbá observes UTC−4 year-round, while Salt Lake City observes Mountain Time (UTC−7 standard, UTC−6 daylight), so Corumbá is 3 hours ahead during Mountain Standard Time and 2 hours ahead during Mountain Daylight Time.
Flight carbon footprint between Salt Lake City International Airport (SLC) and Corumbá International Airport (CMG)
On average, flying from Salt Lake City to Corumbá generates about 635 kg of CO2 per passenger; 635 kilograms is about 1,400 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
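The arithmetic behind the figures above can be sketched as follows. The per-passenger-mile factor is inferred from this route's own numbers (635 kg over 5391 miles), not a published constant, and the kg-to-lb conversion uses the standard 2.20462 factor.

```python
# Inferred from this page's figures: 635 kg CO2 over 5391 miles per passenger
KG_CO2_PER_PASSENGER_MILE = 635 / 5391  # ~0.118 kg/mile (assumption)

def co2_estimate_kg(distance_miles):
    """Rough per-passenger CO2 estimate from a per-mile emission factor."""
    return distance_miles * KG_CO2_PER_PASSENGER_MILE

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg * 2.20462
```

`kg_to_lb(635)` rounds to 1,400 lbs, matching the conversion quoted above.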
Map of flight path from Salt Lake City to Corumbá
See the map of the shortest flight path between Salt Lake City International Airport (SLC) and Corumbá International Airport (CMG).
Airport information
| Origin | Salt Lake City International Airport |
| --- | --- |
| City | Salt Lake City, UT |
| Country | United States |
| IATA Code | SLC |
| ICAO Code | KSLC |
| Coordinates | 40°47′18″N, 111°58′40″W |
| Destination | Corumbá International Airport |
| --- | --- |
| City | Corumbá |
| Country | Brazil |
| IATA Code | CMG |
| ICAO Code | SBCR |
| Coordinates | 19°0′42″S, 57°40′17″W |
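The coordinates in the tables above are given in degrees, minutes, and seconds; the distance formulas need decimal degrees. A small converter (southern and western hemispheres take a negative sign):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter
    (N/S/E/W) to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

slc_lat = dms_to_decimal(40, 47, 18, "N")    # ~40.78833
slc_lon = dms_to_decimal(111, 58, 40, "W")   # ~-111.97778
cmg_lat = dms_to_decimal(19, 0, 42, "S")     # ~-19.01167
cmg_lon = dms_to_decimal(57, 40, 17, "W")    # ~-57.67139
```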