How far is Petersburg, AK, from Gods Lake Narrows?
The distance between Gods Lake Narrows (Gods Lake Narrows Airport) and Petersburg (Petersburg James A. Johnson Airport) is 1491 miles / 2400 kilometers / 1296 nautical miles.
Gods Lake Narrows Airport – Petersburg James A. Johnson Airport
Distance from Gods Lake Narrows to Petersburg
There are several ways to calculate the distance from Gods Lake Narrows to Petersburg. Here are two standard methods:
Vincenty's formula (applied above)
- 1491.032 miles
- 2399.584 kilometers
- 1295.672 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
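For reference, a textbook implementation of Vincenty's inverse formula on the WGS-84 ellipsoid reproduces the figure above from the airport coordinates listed below. This is a sketch; the site's exact ellipsoid parameters and convergence settings are not stated.

```python
import math

def vincenty_m(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula: distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (
        cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
                * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# YGO: 54°33′32″N, 94°29′29″W; PSG: 56°48′6″N, 132°56′42″W (decimal degrees)
d_m = vincenty_m(54.558889, -94.491389, 56.801667, -132.945)
print(f"{d_m / 1000:.1f} km")  # ≈ 2399.6 km (the Vincenty figure quoted above)
```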
Haversine formula
- 1485.998 miles
- 2391.481 kilometers
- 1291.297 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
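As a check, the haversine figure above can be reproduced directly from the airport coordinates listed below. This sketch assumes a mean Earth radius of 6,371 km, a common choice for great-circle calculations.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return radius_km * 2 * math.asin(math.sqrt(a))

# YGO: 54°33′32″N, 94°29′29″W; PSG: 56°48′6″N, 132°56′42″W (decimal degrees)
d = haversine_km(54.558889, -94.491389, 56.801667, -132.945)
print(f"{d:.1f} km")  # ≈ 2391 km, matching the figure above
```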
How long does it take to fly from Gods Lake Narrows to Petersburg?
The estimated flight time from Gods Lake Narrows Airport to Petersburg James A. Johnson Airport is 3 hours and 19 minutes.
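A common rule of thumb for such estimates is the distance flown at a typical jet cruise speed plus a fixed allowance for takeoff and landing; with an assumed 500 mph cruise and a 20-minute allowance (both assumptions, not stated by the source), this happens to reproduce the 3 hours 19 minutes quoted above.

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=20):
    """Cruise time at an assumed speed plus an assumed takeoff/landing allowance."""
    return round(distance_miles / cruise_mph * 60 + overhead_min)

minutes = estimate_flight_minutes(1491)
hours, mins = divmod(minutes, 60)
print(f"{hours} h {mins} min")  # 3 h 19 min
```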
What is the time difference between Gods Lake Narrows and Petersburg?
Gods Lake Narrows, Manitoba, is in the Central Time Zone (UTC−6, UTC−5 in summer) and Petersburg, AK, is in the Alaska Time Zone (UTC−9, UTC−8 in summer), so Petersburg is 3 hours behind Gods Lake Narrows year-round.
Flight carbon footprint between Gods Lake Narrows Airport (YGO) and Petersburg James A. Johnson Airport (PSG)
On average, flying from Gods Lake Narrows to Petersburg generates about 179 kg of CO2 per passenger, equivalent to 394 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
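The pound figure is just the kilogram figure converted at roughly 2.20462 lb/kg; the 394 lb quoted above appears to truncate rather than round the result.

```python
kg = 179
lbs = kg * 2.20462  # pounds per kilogram
print(int(lbs))  # 394, as quoted above (truncated rather than rounded)
```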
Map of flight path from Gods Lake Narrows to Petersburg
See the map of the shortest flight path between Gods Lake Narrows Airport (YGO) and Petersburg James A. Johnson Airport (PSG).
Airport information
| Origin | Gods Lake Narrows Airport |
| --- | --- |
| City: | Gods Lake Narrows |
| Country: | Canada |
| IATA Code: | YGO |
| ICAO Code: | CYGO |
| Coordinates: | 54°33′32″N, 94°29′29″W |
| Destination | Petersburg James A. Johnson Airport |
| --- | --- |
| City: | Petersburg, AK |
| Country: | United States |
| IATA Code: | PSG |
| ICAO Code: | PAPG |
| Coordinates: | 56°48′6″N, 132°56′42″W |
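The coordinates in the tables are given in degrees, minutes, and seconds; the distance formulas above need them as signed decimal degrees. A minimal conversion sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter (N/S/E/W) to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Petersburg James A. Johnson Airport (PSG): 56°48′6″N, 132°56′42″W
lat = dms_to_decimal(56, 48, 6, "N")    # ≈ 56.8017
lon = dms_to_decimal(132, 56, 42, "W")  # ≈ -132.9450
print(round(lat, 4), round(lon, 4))
```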