
How far is Uranium City from Hoonah, AK?

The distance between Hoonah (Hoonah Airport) and Uranium City (Uranium City Airport) is 965 miles / 1553 kilometers / 839 nautical miles.

The driving distance from Hoonah (HNH) to Uranium City (YBE) is 2403 miles / 3868 kilometers, and travel time by car is about 67 hours 49 minutes.

Hoonah Airport – Uranium City Airport

965 miles / 1553 kilometers / 839 nautical miles


Distance from Hoonah to Uranium City

There are several ways to calculate the distance from Hoonah to Uranium City. Here are two standard methods:

Vincenty's formula (applied above)
  • 965.068 miles
  • 1553.126 kilometers
  • 838.621 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
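As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using decimal coordinates derived from the airport table below. The ellipsoid constants and convergence tolerance are standard choices, not values taken from this page, so the printed figures are approximate.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    # WGS-84 ellipsoid parameters (semi-major axis, flattening, semi-minor axis)
    a = 6378137.0
    f = 1 / 298.257223563
    b = (1 - f) * a

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha) if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sigma_m + C * cos_sigma * (-1 + 2 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sigma_m ** 2) -
        B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)  # geodesic length in metres

# Hoonah (HNH) and Uranium City (YBE), converted from the coordinates listed below
meters = vincenty_distance(58.0958, -135.41, 59.5614, -108.4808)
print(meters / 1609.344)   # ≈ 965 miles
print(meters / 1000)       # ≈ 1553 kilometers
print(meters / 1852)       # ≈ 839 nautical miles
```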

Haversine formula
  • 961.644 miles
  • 1547.616 kilometers
  • 835.646 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
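The same route length can be approximated in a few lines of Python with the haversine formula; the 6371 km mean Earth radius is an assumed value, which is why the spherical result differs slightly from the ellipsoidal figure above.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Great-circle distance on a sphere with the given mean Earth radius.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_distance(58.0958, -135.41, 59.5614, -108.4808)
print(km)              # ≈ 1548 kilometers
print(km / 1.609344)   # ≈ 962 miles
print(km / 1.852)      # ≈ 836 nautical miles
```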

How long does it take to fly from Hoonah to Uranium City?

The estimated flight time from Hoonah Airport to Uranium City Airport is 2 hours and 19 minutes.
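The page does not state how the flight time is derived. A common rough model is cruise time at an assumed average speed plus a fixed allowance for taxi, climb, and descent; the sketch below uses hypothetical parameters (500 mph, 30 minutes) and therefore does not exactly reproduce the 2 hours 19 minutes quoted above.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_minutes=30):
    # Hypothetical model: fixed taxi/climb/descent allowance plus cruise time
    # at an assumed average ground speed.
    total_minutes = overhead_minutes + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_minutes), 60)
    return f"{hours} hours {minutes} minutes"

print(estimate_flight_time(965.068))  # "2 hours 26 minutes" with these assumed parameters
```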

Flight carbon footprint between Hoonah Airport (HNH) and Uranium City Airport (YBE)

On average, flying from Hoonah to Uranium City generates about 148 kg of CO2 per passenger; 148 kilograms equals about 327 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
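For reference, the pound conversion and the implied per-mile figure can be checked with simple arithmetic; the per-passenger-mile rate below is derived from this page's own numbers, not an official emission factor.

```python
KG_PER_POUND = 0.45359237

co2_kg = 148
print(co2_kg / KG_PER_POUND)  # ≈ 326 lbs (rounding of the underlying kg figure gives the 327 above)

# Implied per-passenger emission rate for this route, from the page's own figures:
print(co2_kg / 965.068)       # ≈ 0.15 kg of CO2 per passenger-mile
```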

Map of flight path and driving directions from Hoonah to Uranium City

See the map of the shortest flight path between Hoonah Airport (HNH) and Uranium City Airport (YBE).

Airport information

Origin Hoonah Airport
City: Hoonah, AK
Country: United States
IATA Code: HNH
ICAO Code: PAOH
Coordinates: 58°5′45″N, 135°24′36″W
Destination Uranium City Airport
City: Uranium City
Country: Canada
IATA Code: YBE
ICAO Code: CYBE
Coordinates: 59°33′41″N, 108°28′51″W
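The decimal coordinates used in the distance sketches above come from converting these degree/minute/second values; a small helper makes the conversion explicit.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # Convert degrees/minutes/seconds to signed decimal degrees
    # (negative for the southern and western hemispheres).
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# HNH: 58°5′45″N, 135°24′36″W
print(dms_to_decimal(58, 5, 45, "N"), dms_to_decimal(135, 24, 36, "W"))   # ≈ 58.0958, -135.41
# YBE: 59°33′41″N, 108°28′51″W
print(dms_to_decimal(59, 33, 41, "N"), dms_to_decimal(108, 28, 51, "W"))  # ≈ 59.5614, -108.4808
```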