How far is Natal from Araguaína?
The distance between Araguaína (Araguaína Airport) and Natal (Greater Natal International Airport) is 890 miles / 1432 kilometers / 773 nautical miles.
The driving distance from Araguaína (AUX) to Natal (NAT) is 1191 miles / 1916 kilometers, and travel time by car is about 24 hours 23 minutes.
Araguaína Airport – Greater Natal International Airport
Distance from Araguaína to Natal
There are several ways to calculate the distance from Araguaína to Natal. Here are two standard methods:
Vincenty's formula (applied above)
- 889.785 miles
- 1431.970 kilometers
- 773.202 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
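Vincenty's method itself is an iterative calculation; as a minimal sketch, an equivalent ellipsoidal (WGS-84) geodesic distance can be computed with the pyproj library, which uses Karney's algorithm rather than Vincenty's iteration but agrees to well under a mile at this range. The decimal coordinates are converted from the DMS values in the airport tables below.

```python
# Ellipsoidal (WGS-84) geodesic distance, comparable to the Vincenty figure above.
# Requires: pip install pyproj
from pyproj import Geod

# Decimal-degree equivalents of the DMS coordinates in the airport tables below.
AUX_LAT, AUX_LON = -7.22778, -48.24028   # Araguaína Airport (7°13′40″S, 48°14′25″W)
NAT_LAT, NAT_LON = -5.76806, -35.37583   # Greater Natal International (5°46′5″S, 35°22′33″W)

geod = Geod(ellps="WGS84")
_, _, meters = geod.inv(AUX_LON, AUX_LAT, NAT_LON, NAT_LAT)  # forward az, back az, distance

print(f"{meters / 1000:.3f} km")          # ≈ 1432 km
print(f"{meters / 1609.344:.3f} miles")   # ≈ 890 miles
print(f"{meters / 1852:.3f} nautical miles")
```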
Haversine formula
- 888.827 miles
- 1430.429 kilometers
- 772.370 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
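A minimal sketch of the haversine calculation in Python, assuming the commonly used mean Earth radius of 6,371 km:

```python
# Great-circle (haversine) distance on a spherical Earth model.
from math import asin, cos, radians, sin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Shortest distance over a sphere of the given mean radius, in statute miles."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    km = 2 * radius_km * asin(sqrt(a))
    return km / 1.609344

# Araguaína (AUX) to Natal (NAT), using the decimal coordinates from above.
print(f"{haversine_miles(-7.22778, -48.24028, -5.76806, -35.37583):.3f} miles")  # ≈ 888.8
```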
How long does it take to fly from Araguaína to Natal?
The estimated flight time from Araguaína Airport to Greater Natal International Airport is 2 hours and 11 minutes.
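Flight-time calculators like this one typically derive the estimate from the great-circle distance and an assumed average speed. The sketch below uses an illustrative rule of thumb (a 30-minute allowance for taxi, climb and descent plus cruise at about 500 mph); it is not the exact method behind the 2 hours 11 minutes figure, so the result differs slightly.

```python
# Rough flight-time estimate from distance and an assumed block speed.
# The 30-minute overhead and 500 mph cruise speed are illustrative assumptions,
# not the exact parameters behind the 2 h 11 min estimate above.
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    total_min = overhead_min + distance_miles / cruise_mph * 60
    return divmod(round(total_min), 60)

hours, minutes = estimated_flight_time(890)
print(f"about {hours} h {minutes} min")  # ≈ 2 h 17 min with these assumptions
```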
What is the time difference between Araguaína and Natal?
Araguaína and Natal both observe Brasília time (UTC−3), so there is no time difference between the two cities.
Flight carbon footprint between Araguaína Airport (AUX) and Greater Natal International Airport (NAT)
On average, flying from Araguaína to Natal generates about 143 kg (315 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
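As a quick check on the unit conversion, and to express the same estimate per mile flown, a short sketch (the per-mile figure is derived here, not quoted from the source):

```python
# Convert the per-passenger CO2 estimate and express it per mile flown.
co2_kg = 143                 # estimate for AUX -> NAT, per passenger
distance_miles = 890

co2_lbs = co2_kg * 2.20462   # kilograms to pounds
print(f"{co2_lbs:.0f} lbs")                      # ≈ 315 lbs
print(f"{co2_kg / distance_miles:.2f} kg/mile")  # ≈ 0.16 kg of CO2 per mile
```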
Map of flight path and driving directions from Araguaína to Natal
See the map of the shortest flight path between Araguaína Airport (AUX) and Greater Natal International Airport (NAT).
Airport information
| Origin | Araguaína Airport |
| --- | --- |
| City: | Araguaína |
| Country: | Brazil |
| IATA Code: | AUX |
| ICAO Code: | SWGN |
| Coordinates: | 7°13′40″S, 48°14′25″W |
| Destination | Greater Natal International Airport |
| --- | --- |
| City: | Natal |
| Country: | Brazil |
| IATA Code: | NAT |
| ICAO Code: | SBSG |
| Coordinates: | 5°46′5″S, 35°22′33″W |
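The coordinates above are listed in degrees, minutes and seconds, while the distance formulas earlier on the page work in decimal degrees. A minimal conversion sketch (the helper function name is illustrative):

```python
# Convert the DMS coordinates from the airport tables to the decimal degrees
# used by the distance formulas above. South and West are negative.
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(7, 13, 40, "S"), dms_to_decimal(48, 14, 25, "W"))  # AUX ≈ -7.2278, -48.2403
print(dms_to_decimal(5, 46, 5, "S"), dms_to_decimal(35, 22, 33, "W"))   # NAT ≈ -5.7681, -35.3758
```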