
How far is Petersburg, AK, from Longyearbyen?

The distance between Longyearbyen (Svalbard Airport, Longyear) and Petersburg (Petersburg James A. Johnson Airport) is 3023 miles / 4865 kilometers / 2627 nautical miles.

Svalbard Airport, Longyear – Petersburg James A. Johnson Airport

3023 miles / 4865 kilometers / 2627 nautical miles


Distance from Longyearbyen to Petersburg

There are several ways to calculate the distance from Longyearbyen to Petersburg. Here are two standard methods:

Vincenty's formula (applied above)
  • 3023.180 miles
  • 4865.337 kilometers
  • 2627.072 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
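For readers who want to reproduce the figure above, here is a sketch of Vincenty's inverse method in Python on the WGS-84 ellipsoid. The decimal coordinates are converted from the DMS values in the airport information section below; this is an illustrative implementation, not the calculator's own code, so the last digits may differ slightly.

```python
import math

def vincenty_inverse_m(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in meters on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                               * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# LYR (78°14′45″N, 15°27′56″E) to PSG (56°48′6″N, 132°56′42″W)
meters = vincenty_inverse_m(78.245833, 15.465556, 56.801667, -132.945)
print(round(meters / 1609.344, 2), "miles")  # close to the 3023.180 quoted above
```

Note that this simple iteration can fail to converge for nearly antipodal points; production geodesic libraries handle that case separately.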

Haversine formula
  • 3012.175 miles
  • 4847.626 kilometers
  • 2617.509 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
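The haversine calculation is short enough to sketch in full. The snippet below uses a mean Earth radius of 6371 km and the airport coordinates from the airport information section; the exact radius the calculator uses is not stated, so the last digits may differ.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# LYR (78°14′45″N, 15°27′56″E) to PSG (56°48′6″N, 132°56′42″W)
km = haversine_km(78.245833, 15.465556, 56.801667, -132.945)
print(round(km, 1), "km")  # close to the 4847.626 km quoted above
```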

How long does it take to fly from Longyearbyen to Petersburg?

The estimated flight time from Svalbard Airport, Longyear to Petersburg James A. Johnson Airport is 6 hours and 13 minutes.
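The site does not state its flight-time model. A common rule of thumb is cruise distance over an assumed average speed (here 500 mph, an assumption) plus roughly half an hour for taxi, climb, and descent; it lands near, but not exactly on, the 6 h 13 min figure above.

```python
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
    """Rough flight-time estimate: cruise time plus fixed taxi/climb/descent overhead."""
    hours = distance_miles / cruise_mph + overhead_hours
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

print(estimate_flight_time(3023.180))  # (6, 33) with these assumed parameters
```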

Flight carbon footprint between Svalbard Airport, Longyear (LYR) and Petersburg James A. Johnson Airport (PSG)

On average, flying from Longyearbyen to Petersburg generates about 337 kg (743 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
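The unit conversion behind the figure above is straightforward; the per-mile intensity at the end is a derived figure, not one the site quotes.

```python
KG_PER_LB = 0.45359237  # exact definition of the international pound

co2_kg = 337
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb), "lbs")                  # 743
print(round(co2_kg / 3023.180, 3), "kg/mi")  # ~0.111 kg CO2 per passenger-mile
```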

Map of flight path from Longyearbyen to Petersburg

See the map of the shortest flight path between Svalbard Airport, Longyear (LYR) and Petersburg James A. Johnson Airport (PSG).

Airport information

Origin: Svalbard Airport, Longyear
City: Longyearbyen
Country: Norway
IATA Code: LYR
ICAO Code: ENSB
Coordinates: 78°14′45″N, 15°27′56″E
Destination: Petersburg James A. Johnson Airport
City: Petersburg, AK
Country: United States
IATA Code: PSG
ICAO Code: PAPG
Coordinates: 56°48′6″N, 132°56′42″W
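The coordinates above are in degrees/minutes/seconds; the distance formulas earlier need signed decimal degrees. A small helper for the conversion:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# LYR latitude: 78°14′45″N
print(round(dms_to_decimal(78, 14, 45, "N"), 6))   # 78.245833
# PSG longitude: 132°56′42″W (west, hence negative)
print(round(dms_to_decimal(132, 56, 42, "W"), 6))  # -132.945
```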