
How far is Barrow, AK, from Nain?

The distance between Nain (Nain Airport) and Barrow (Wiley Post–Will Rogers Memorial Airport) is 2721 miles / 4378 kilometers / 2364 nautical miles.

The driving distance from Nain (YDP) to Barrow (BRW) is 6251 miles / 10060 kilometers, and travel time by car is about 162 hours 3 minutes.

Nain Airport – Wiley Post–Will Rogers Memorial Airport

2721 miles
4378 kilometers
2364 nautical miles


Distance from Nain to Barrow

There are several ways to calculate the distance from Nain to Barrow. Here are two standard methods:

Vincenty's formula (applied above)
  • 2720.590 miles
  • 4378.365 kilometers
  • 2364.128 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.

Haversine formula
  • 2710.822 miles
  • 4362.645 kilometers
  • 2355.640 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
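The haversine calculation above can be sketched in a few lines of Python. This is a minimal illustration using the airport coordinates listed later on this page and an assumed mean Earth radius of 6371 km; the site's exact radius constant is not published, so the result may differ from its figure by a kilometer or two.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance on a sphere of radius r_km (mean Earth radius assumed)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * r_km * asin(sqrt(a))

# YDP: 56°32′57″N, 61°40′49″W    BRW: 71°17′7″N, 156°45′57″W
ydp = (56 + 32 / 60 + 57 / 3600, -(61 + 40 / 60 + 49 / 3600))
brw = (71 + 17 / 60 + 7 / 3600, -(156 + 45 / 60 + 57 / 3600))

km = haversine_km(ydp[0], ydp[1], brw[0], brw[1])
print(f"{km:.1f} km / {km / 1.609344:.1f} mi")  # close to the article's 4362.645 km
```

The Vincenty figure differs because it models the Earth as an ellipsoid rather than a sphere; over this route the two methods disagree by about 16 km.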

How long does it take to fly from Nain to Barrow?

The estimated flight time from Nain Airport to Wiley Post–Will Rogers Memorial Airport is 5 hours and 39 minutes.
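A flight-time estimate like this typically comes from dividing the great-circle distance by an average cruise speed and adding a fixed allowance for taxi, climb, and descent. The 500 mph cruise speed and 30-minute overhead below are illustrative assumptions, not the site's published formula, so the result differs slightly from its 5 hours 39 minutes.

```python
# Rough block-time estimate: overhead + distance / cruise speed.
# Cruise speed and overhead are assumed values for illustration.
DISTANCE_MILES = 2721   # great-circle distance from this page
CRUISE_MPH = 500        # assumed average cruise speed
OVERHEAD_MIN = 30       # assumed allowance for taxi, climb, and descent

total_min = OVERHEAD_MIN + DISTANCE_MILES / CRUISE_MPH * 60
hours, minutes = divmod(round(total_min), 60)
print(f"Estimated flight time: {hours} h {minutes} min")  # → 5 h 57 min
```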

Flight carbon footprint between Nain Airport (YDP) and Wiley Post–Will Rogers Memorial Airport (BRW)

On average, flying from Nain to Barrow generates about 301 kg of CO2 per passenger, which is roughly 664 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
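The unit conversion behind that figure is straightforward (1 kg ≈ 2.20462 lb):

```python
# Kilograms-to-pounds check for the per-passenger CO2 estimate.
co2_kg = 301
co2_lb = co2_kg * 2.20462
print(round(co2_lb))  # → 664
```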

Map of flight path and driving directions from Nain to Barrow

See the map of the shortest flight path between Nain Airport (YDP) and Wiley Post–Will Rogers Memorial Airport (BRW).

Airport information

Origin: Nain Airport
City: Nain
Country: Canada
IATA Code: YDP
ICAO Code: CYDP
Coordinates: 56°32′57″N, 61°40′49″W
Destination: Wiley Post–Will Rogers Memorial Airport
City: Barrow, AK
Country: United States
IATA Code: BRW
ICAO Code: PABR
Coordinates: 71°17′7″N, 156°45′57″W