How far is Kalskag, AK, from Hamilton?
The distance between Hamilton (John C. Munro Hamilton International Airport) and Kalskag (Kalskag Airport) is 3377 miles / 5435 kilometers / 2935 nautical miles.
John C. Munro Hamilton International Airport – Kalskag Airport
Distance from Hamilton to Kalskag
There are several ways to calculate the distance from Hamilton to Kalskag. Here are two standard methods:
Vincenty's formula (applied above)
- 3377.344 miles
- 5435.308 kilometers
- 2934.831 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
Haversine formula
- 3367.848 miles
- 5420.026 kilometers
- 2926.580 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
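As a sketch, the haversine calculation can be reproduced in a few lines of Python. The coordinates are decimal-degree conversions of the airport coordinates listed below, and a mean earth radius of 6371 km is assumed:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a spherical earth (haversine formula)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# YHM (43°10′24″N, 79°56′5″W) to KLG (61°32′10″N, 160°20′27″W)
d = haversine_km(43.17333, -79.93472, 61.53611, -160.34083)
print(round(d, 1))  # ≈ 5420 km, matching the haversine figure above
```

The small gap between this result and the Vincenty figure (about 15 km) comes from the spherical-earth assumption; Vincenty's ellipsoidal model is the more accurate of the two.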
How long does it take to fly from Hamilton to Kalskag?
The estimated flight time from John C. Munro Hamilton International Airport to Kalskag Airport is 6 hours and 53 minutes.
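The page does not state the model behind this estimate, but a flight time of 6 hours 53 minutes over 3377 miles implies an average block speed of roughly 490 mph. A minimal sketch under that assumption:

```python
# Rough flight-time estimate: distance divided by an assumed average block
# speed. The ~490 mph figure is an assumption that approximately reproduces
# the 6 h 53 min estimate above; the site's actual model is not published.
def estimate_flight_time(distance_miles: float, avg_speed_mph: float = 490.0):
    hours_float = distance_miles / avg_speed_mph
    hours = int(hours_float)
    minutes = round((hours_float - hours) * 60)
    return hours, minutes

h, m = estimate_flight_time(3377.344)
print(f"{h} h {m} min")  # close to the 6 h 53 min estimate
```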
What is the time difference between Hamilton and Kalskag?
The time difference between Hamilton and Kalskag is 4 hours: Kalskag is 4 hours behind Hamilton.
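This can be checked with Python's standard `zoneinfo` module, assuming Hamilton follows Eastern time (America/Toronto) and Kalskag follows Alaska time (America/Anchorage). Because both zones observe daylight saving on the same schedule, the 4-hour gap holds year-round:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# Assumed IANA zones: Hamilton, ON -> America/Toronto;
# Kalskag, AK -> America/Anchorage.
hamilton = ZoneInfo("America/Toronto")
kalskag = ZoneInfo("America/Anchorage")

now = datetime(2024, 6, 1, 12, 0)
offset_diff = (now.replace(tzinfo=hamilton).utcoffset()
               - now.replace(tzinfo=kalskag).utcoffset())
print(offset_diff)  # 4:00:00 — Kalskag is 4 hours behind Hamilton
```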
Flight carbon footprint between John C. Munro Hamilton International Airport (YHM) and Kalskag Airport (KLG)
On average, flying from Hamilton to Kalskag generates about 380 kg (837 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
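The pound figure follows from the standard conversion factor (1 kg ≈ 2.20462 lb); the page's 837 lb likely reflects rounding of an unrounded kilogram value:

```python
# kg -> lb conversion behind the footprint figure.
KG_TO_LB = 2.20462

co2_kg = 380
co2_lb = co2_kg * KG_TO_LB
print(round(co2_lb, 1))
```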
Map of flight path from Hamilton to Kalskag
See the map of the shortest flight path between John C. Munro Hamilton International Airport (YHM) and Kalskag Airport (KLG).
Airport information
| Origin | John C. Munro Hamilton International Airport |
| --- | --- |
| City: | Hamilton |
| Country: | Canada |
| IATA Code: | YHM |
| ICAO Code: | CYHM |
| Coordinates: | 43°10′24″N, 79°56′5″W |
| Destination | Kalskag Airport |
| --- | --- |
| City: | Kalskag, AK |
| Country: | United States |
| IATA Code: | KLG |
| ICAO Code: | PALG |
| Coordinates: | 61°32′10″N, 160°20′27″W |