
How far is San Angelo, TX, from Repulse Bay?

The distance between Repulse Bay (Naujaat Airport) and San Angelo (San Angelo Regional Airport) is 2501 miles / 4026 kilometers / 2174 nautical miles.

Naujaat Airport – San Angelo Regional Airport

Distance from Repulse Bay to San Angelo

There are several ways to calculate the distance from Repulse Bay to San Angelo. Here are two standard methods:

Vincenty's formula (applied above)
  • 2501.442 miles
  • 4025.681 kilometers
  • 2173.694 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
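
For readers who want to reproduce this figure, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are decimal-degree equivalents of the airport coordinates listed further down; the site's exact constants and rounding may differ slightly.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty's inverse formula on the WGS-84 ellipsoid (result in km)."""
    a = 6378137.0               # WGS-84 semi-major axis, meters
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis, meters

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate the auxiliary longitude to convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2   # zero only for equatorial paths
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0   # meters -> kilometers

print(vincenty_distance(66.5214, -86.2244, 31.3575, -100.4958))  # ~4025.7 km
```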

Haversine formula
  • 2500.844 miles
  • 4024.718 kilometers
  • 2173.174 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
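
A matching sketch of the haversine formula, assuming the commonly used mean Earth radius of 6371 km (the radius the site uses isn't stated, but this value reproduces the figures above):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

print(haversine_distance(66.5214, -86.2244, 31.3575, -100.4958))  # ~4024.7 km
```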

How long does it take to fly from Repulse Bay to San Angelo?

The estimated flight time from Naujaat Airport to San Angelo Regional Airport is 5 hours and 14 minutes.
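
The speed model behind this estimate isn't published. As a rough check, dividing the great-circle distance by an assumed average block speed of about 770 km/h happens to reproduce the quoted figure; treat that speed purely as an illustrative assumption.

```python
# Hypothetical model: the site's actual speed assumptions aren't published.
distance_km = 4025.7            # ellipsoidal distance computed above
avg_speed_kmh = 770             # assumed average block speed, illustration only
hours = distance_km / avg_speed_kmh
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # -> 5 h 14 min
```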

What is the time difference between Repulse Bay and San Angelo?

There is no time difference between Repulse Bay and San Angelo.

Flight carbon footprint between Naujaat Airport (YUT) and San Angelo Regional Airport (SJT)

On average, flying from Repulse Bay to San Angelo generates about 275 kg of CO2 per passenger, which is equivalent to about 607 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
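
As a unit check, the kilogram-to-pound conversion and the implied per-passenger-mile emission factor can be recomputed directly. Note that exactly 275 kg converts to about 606 lb; the quoted 607 lb presumably reflects an unrounded kilogram figure.

```python
co2_kg = 275                     # per-passenger estimate quoted above
co2_lb = co2_kg * 2.2046226218   # kilograms -> pounds
kg_per_mile = co2_kg / 2501.442  # implied per-passenger-mile factor
print(f"{co2_lb:.0f} lb, {kg_per_mile:.3f} kg CO2 per passenger-mile")
# -> 606 lb, 0.110 kg CO2 per passenger-mile
```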

Map of flight path from Repulse Bay to San Angelo

See the map of the shortest flight path between Naujaat Airport (YUT) and San Angelo Regional Airport (SJT).

Airport information

Origin: Naujaat Airport
City: Repulse Bay
Country: Canada
IATA Code: YUT
ICAO Code: CYUT
Coordinates: 66°31′17″N, 86°13′28″W
Destination: San Angelo Regional Airport
City: San Angelo, TX
Country: United States
IATA Code: SJT
ICAO Code: KSJT
Coordinates: 31°21′27″N, 100°29′45″W
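
The coordinates above are given in degrees-minutes-seconds. Converting them to the decimal degrees consumed by the distance formulas earlier is straightforward; this small helper treats the S and W hemispheres as negative:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

yut = (dms_to_decimal(66, 31, 17, "N"), dms_to_decimal(86, 13, 28, "W"))
sjt = (dms_to_decimal(31, 21, 27, "N"), dms_to_decimal(100, 29, 45, "W"))
print(yut)  # -> (66.5214, -86.2244) approx.
print(sjt)  # -> (31.3575, -100.4958) approx.
```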