How far is Petersburg, AK, from Kangerlussuaq?

The distance between Kangerlussuaq (Kangerlussuaq Airport) and Petersburg (Petersburg James A. Johnson Airport) is 2562 miles / 4123 kilometers / 2226 nautical miles.

Kangerlussuaq Airport – Petersburg James A. Johnson Airport

  • 2562 miles
  • 4123 kilometers
  • 2226 nautical miles

Distance from Kangerlussuaq to Petersburg

There are several ways to calculate the distance from Kangerlussuaq to Petersburg. Here are two standard methods:

Vincenty's formula (applied above)
  • 2561.751 miles
  • 4122.739 kilometers
  • 2226.101 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
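
For readers who want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. It is an illustration, not the calculator's actual code; the function name, iteration cap, and convergence tolerance are our own choices.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    # Vincenty's inverse method on the WGS-84 ellipsoid.
    a = 6378137.0             # semi-major axis, metres
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                      # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)    # equatorial line: cos²α = 0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
          * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1609.344   # metres -> statute miles

# SFJ and PSG coordinates from the airport information below, in decimal degrees
print(vincenty_miles(67.0119, -50.7114, 56.8017, -132.9450))  # ≈ 2561.75
```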

Haversine formula
  • 2552.562 miles
  • 4107.950 kilometers
  • 2218.115 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
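
The spherical version is much shorter. The sketch below is a standard haversine implementation in Python; the 6371 km mean Earth radius and the function name are our assumptions, not the site's code.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Great-circle distance on a sphere with the mean Earth radius.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# SFJ (67°0′43″N, 50°42′41″W) to PSG (56°48′6″N, 132°56′42″W)
km = haversine_km(67.0119, -50.7114, 56.8017, -132.9450)
print(f"{km:.0f} km, {km / 1.609344:.0f} miles")  # ≈ 4108 km / 2553 miles
```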

How long does it take to fly from Kangerlussuaq to Petersburg?

The estimated flight time from Kangerlussuaq Airport to Petersburg James A. Johnson Airport is 5 hours and 21 minutes.
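
The calculator does not publish its timing model, but a common rule of thumb is a fixed allowance for taxi, climb, and descent plus cruise time at a typical jet speed. The sketch below uses assumed values (500 mph cruise, 30-minute overhead), which is why it lands near, but not exactly on, the 5 hours 21 minutes quoted above.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_hours=0.5):
    # Rule-of-thumb block time: fixed taxi/climb/descent allowance plus
    # cruise at a typical jet speed. Both parameters are assumptions.
    total_min = round((overhead_hours + distance_miles / cruise_mph) * 60)
    hours, minutes = divmod(total_min, 60)
    return f"{hours} h {minutes:02d} min"

print(estimated_flight_time(2562))  # "5 h 37 min" with these assumed parameters
```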

Flight carbon footprint between Kangerlussuaq Airport (SFJ) and Petersburg James A. Johnson Airport (PSG)

On average, flying from Kangerlussuaq to Petersburg generates about 282 kg (roughly 623 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
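
The kilogram-to-pound conversion uses the exact definition of the pound; the rounded 282 kg figure converts to 622 lbs, so the 623 lbs quoted above is evidently rounded from a slightly larger unrounded estimate.

```python
KG_PER_LB = 0.45359237   # exact definition of the avoirdupois pound

co2_kg = 282                  # per-passenger estimate quoted above
print(round(co2_kg / KG_PER_LB))  # 622 lbs from the rounded 282 kg figure
```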

Map of flight path from Kangerlussuaq to Petersburg

See the map of the shortest flight path between Kangerlussuaq Airport (SFJ) and Petersburg James A. Johnson Airport (PSG).

Airport information

Origin: Kangerlussuaq Airport
City: Kangerlussuaq
Country: Greenland
IATA Code: SFJ
ICAO Code: BGSF
Coordinates: 67°0′43″N, 50°42′41″W
Destination: Petersburg James A. Johnson Airport
City: Petersburg, AK
Country: United States
IATA Code: PSG
ICAO Code: PAPG
Coordinates: 56°48′6″N, 132°56′42″W