How far is San Angelo, TX, from Arctic Bay?

The distance between Arctic Bay (Arctic Bay Airport) and San Angelo (San Angelo Regional Airport) is 2933 miles / 4720 kilometers / 2549 nautical miles.

Arctic Bay Airport – San Angelo Regional Airport: 2933 miles / 4720 kilometers / 2549 nautical miles

Distance from Arctic Bay to San Angelo

There are several ways to calculate the distance from Arctic Bay to San Angelo. Here are two standard methods:

Vincenty's formula (applied above)
  • 2932.902 miles
  • 4720.049 kilometers
  • 2548.622 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
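
As a rough illustration, here is a minimal Python sketch of the inverse Vincenty method on the WGS-84 ellipsoid. It is a plain textbook implementation, not the calculator's actual code, and it omits the coincident-point and equatorial edge cases for brevity. The coordinates are the decimal equivalents of the DMS values listed under Airport information below.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2):
        """Inverse Vincenty distance on the WGS-84 ellipsoid, in miles."""
        a = 6378137.0                      # semi-major axis (m)
        f = 1 / 298.257223563              # flattening
        b = (1 - f) * a                    # semi-minor axis (m)
        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)
        lam = L
        for _ in range(200):               # iterate until lambda converges
            sinLam, cosLam = math.sin(lam), math.cos(lam)
            sinSigma = math.hypot(cosU2 * sinLam,
                                  cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
            cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
            sigma = math.atan2(sinSigma, cosSigma)
            sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
            cos2Alpha = 1 - sinAlpha ** 2
            cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
            C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
            lamPrev = lam
            lam = L + (1 - C) * f * sinAlpha * (sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
            if abs(lam - lamPrev) < 1e-12:
                break
        u2 = cos2Alpha * (a * a - b * b) / (b * b)
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
            * (-3 + 4 * cos2SigmaM ** 2)))
        return b * A * (sigma - dSigma) / 1609.344   # metres -> miles

    # YAB (73°0′20″N, 85°2′33″W) to SJT (31°21′27″N, 100°29′45″W)
    print(round(vincenty_miles(73.00556, -85.04250, 31.35750, -100.49583), 1))

With these coordinates the result should land within a mile of the 2932.902-mile figure above; any small difference comes from rounding the coordinates to five decimal places.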

Haversine formula
  • 2930.956 miles
  • 4716.917 kilometers
  • 2546.931 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
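
For comparison, a minimal haversine sketch. A mean Earth radius of 6371 km is the usual textbook choice; the exact radius the calculator uses is not stated, so treat the last digit of the output as approximate.

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a sphere, in kilometres."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    km = haversine_km(73.00556, -85.04250, 31.35750, -100.49583)
    print(f"{km:.0f} km / {km / 1.609344:.0f} mi / {km / 1.852:.0f} NM")

This prints roughly the 4716.9 km figure above, slightly shorter than the Vincenty result because the sphere ignores the Earth's polar flattening.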

How long does it take to fly from Arctic Bay to San Angelo?

The estimated flight time from Arctic Bay Airport to San Angelo Regional Airport is 6 hours and 3 minutes.
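
The page does not say how this estimate is derived. A common back-of-the-envelope approach is a fixed taxi/climb/descent buffer plus distance divided by an average cruise speed; the buffer and speed below are illustrative assumptions, not the calculator's actual parameters, so the result only roughly approximates the 6 h 3 min figure.

    def rough_flight_time(distance_miles, cruise_mph=500, buffer_min=30):
        # cruise_mph and buffer_min are illustrative assumptions,
        # not the calculator's documented parameters
        total_min = round(distance_miles / cruise_mph * 60 + buffer_min)
        return divmod(total_min, 60)          # (hours, minutes)

    h, m = rough_flight_time(2933)
    print(f"about {h} h {m} min")             # ~6 h 22 min with these inputs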

What is the time difference between Arctic Bay and San Angelo?

There is no time difference between Arctic Bay and San Angelo.

Flight carbon footprint between Arctic Bay Airport (YAB) and San Angelo Regional Airport (SJT)

On average, flying from Arctic Bay to San Angelo generates about 326 kg (719 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
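
The emissions methodology is not given. The sketch below simply back-derives a per-passenger-mile factor from the two numbers on this page (326 kg over 2933 miles, about 0.111 kg per mile); that factor is an implied figure, not the calculator's documented model.

    KG_PER_LB = 0.45359237

    def rough_co2_kg(distance_miles, kg_per_passenger_mile=326 / 2933):
        # factor back-derived from this page's own numbers; an assumption,
        # not a documented emissions model
        return distance_miles * kg_per_passenger_mile

    kg = rough_co2_kg(2933)
    print(f"{kg:.0f} kg ({kg / KG_PER_LB:.0f} lbs) CO2 per passenger")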

Map of flight path from Arctic Bay to San Angelo

See the map of the shortest flight path between Arctic Bay Airport (YAB) and San Angelo Regional Airport (SJT).

Airport information

Origin: Arctic Bay Airport
City: Arctic Bay
Country: Canada
IATA Code: YAB
ICAO Code: CYAB
Coordinates: 73°0′20″N, 85°2′33″W

Destination: San Angelo Regional Airport
City: San Angelo, TX
Country: United States
IATA Code: SJT
ICAO Code: KSJT
Coordinates: 31°21′27″N, 100°29′45″W
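
The distance snippets above take decimal degrees; a minimal converter from the DMS coordinates listed here (hemisphere letters S and W flip the sign):

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus hemisphere to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (deg + minutes / 60 + seconds / 3600)

    # YAB: 73°0′20″N, 85°2′33″W  ->  73.00556, -85.04250
    print(dms_to_decimal(73, 0, 20, "N"), dms_to_decimal(85, 2, 33, "W"))
    # SJT: 31°21′27″N, 100°29′45″W  ->  31.35750, -100.49583
    print(dms_to_decimal(31, 21, 27, "N"), dms_to_decimal(100, 29, 45, "W"))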