
How far is Grayling, AK, from Antofagasta?

The distance between Antofagasta (Andrés Sabella Gálvez International Airport) and Grayling (Grayling Airport) is 7630 miles / 12280 kilometers / 6630 nautical miles.

Andrés Sabella Gálvez International Airport – Grayling Airport


Distance from Antofagasta to Grayling

There are several ways to calculate the distance from Antofagasta to Grayling. Here are two standard methods:

Vincenty's formula (applied above)
  • 7630.214 miles
  • 12279.639 kilometers
  • 6630.474 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
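
For readers who want to reproduce the figure, the sketch below implements Vincenty's inverse formula in Python on the WGS-84 ellipsoid. The constants, the convergence tolerance, and the decimal coordinates (converted from the DMS values listed under Airport information) are assumptions about how the calculator works, so its own result may differ in the last digits.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2):
        """Ellipsoidal distance (Vincenty inverse formula) in statute miles, WGS-84."""
        a = 6378137.0                 # semi-major axis (m)
        f = 1 / 298.257223563         # flattening
        b = (1 - f) * a               # semi-minor axis (m)

        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(phi1))
        U2 = math.atan((1 - f) * math.tan(phi2))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):                      # iterate until lambda converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0                        # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sigma_m
                                         + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
        meters = b * A * (sigma - delta_sigma)
        return meters / 1609.344                  # metres -> statute miles

    # ANF and KGX coordinates in decimal degrees
    print(vincenty_miles(-23.4444, -70.4450, 62.8950, -160.0661))   # ≈ 7630 miles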

Haversine formula
  • 7639.872 miles
  • 12295.182 kilometers
  • 6638.867 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
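
A minimal Python version of the haversine formula, assuming a mean Earth radius of 6,371 km (a common but not universal choice, so the last digits may not match the table above exactly):

    import math

    def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a sphere of the given mean radius, in statute miles."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        km = 2 * radius_km * math.asin(math.sqrt(a))
        return km * 0.621371                      # kilometres -> statute miles

    print(haversine_miles(-23.4444, -70.4450, 62.8950, -160.0661))   # ≈ 7640 miles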

How long does it take to fly from Antofagasta to Grayling?

The estimated flight time from Andrés Sabella Gálvez International Airport to Grayling Airport is 14 hours and 56 minutes.
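
The page does not state how this figure is derived. A common back-of-the-envelope approach is to divide the distance by an average cruise speed and add a fixed allowance for taxi, climb, and descent; the 500 mph and 30-minute values below are illustrative assumptions, not the calculator's actual parameters, which is why the result differs from the 14 hours 56 minutes quoted above.

    def estimated_flight_time(distance_miles, cruise_mph=500, overhead_minutes=30):
        """Rough flight-time estimate: fixed taxi/climb/descent allowance plus cruise."""
        total_minutes = overhead_minutes + distance_miles / cruise_mph * 60
        hours, minutes = divmod(round(total_minutes), 60)
        return f"{hours} hours and {minutes} minutes"

    print(estimated_flight_time(7630.214))   # ~15 hours 46 minutes with these assumed parameters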

Flight carbon footprint between Andrés Sabella Gálvez International Airport (ANF) and Grayling Airport (KGX)

On average, flying from Antofagasta to Grayling generates about 945 kg of CO2 per passenger, which is roughly 2,083 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
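
This kind of estimate can be reproduced by multiplying the distance by a per-passenger emission factor and converting the result to pounds. The factor used below (about 0.077 kg CO2 per passenger-kilometre) is simply inferred from the figures above, not an official value.

    def co2_estimate(distance_km, kg_per_pax_km=0.077):
        """Per-passenger CO2 from jet-fuel burn, with a pounds conversion."""
        kg = distance_km * kg_per_pax_km
        lbs = kg * 2.20462                        # 1 kg = 2.20462 lb
        return kg, lbs

    kg, lbs = co2_estimate(12279.639)
    print(f"{kg:.0f} kg CO2 = {lbs:.0f} lbs")     # ≈ 946 kg / 2085 lbs with this assumed factor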

Map of flight path from Antofagasta to Grayling

See the map of the shortest flight path between Andrés Sabella Gálvez International Airport (ANF) and Grayling Airport (KGX).

Airport information

Origin: Andrés Sabella Gálvez International Airport
City: Antofagasta
Country: Chile
IATA Code: ANF
ICAO Code: SCFA
Coordinates: 23°26′40″S, 70°26′42″W
Destination: Grayling Airport
City: Grayling, AK
Country: United States
IATA Code: KGX
ICAO Code: PAGX
Coordinates: 62°53′42″N, 160°3′58″W
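
The coordinates above are listed in degrees, minutes, and seconds, while the distance formulas earlier on this page expect decimal degrees. A small helper for the conversion:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds to signed decimal degrees."""
        value = degrees + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    # Antofagasta (ANF): 23°26′40″S, 70°26′42″W
    print(dms_to_decimal(23, 26, 40, "S"), dms_to_decimal(70, 26, 42, "W"))    # ≈ -23.4444 -70.4450
    # Grayling (KGX): 62°53′42″N, 160°3′58″W
    print(dms_to_decimal(62, 53, 42, "N"), dms_to_decimal(160, 3, 58, "W"))    # ≈ 62.8950 -160.0661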