How far is Petersburg, AK, from St. Theresa Point?
The distance between St. Theresa Point (St. Theresa Point Airport) and Petersburg (Petersburg James A. Johnson Airport) is 1496 miles / 2408 kilometers / 1300 nautical miles.
St. Theresa Point Airport – Petersburg James A. Johnson Airport
Distance from St. Theresa Point to Petersburg
There are several ways to calculate the distance from St. Theresa Point to Petersburg. Here are two standard methods:
Vincenty's formula (applied above)
- 1496.436 miles
- 2408.280 kilometers
- 1300.367 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
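As a rough sketch of how such a figure can be reproduced, the following is a standard implementation of Vincenty's inverse formula on the WGS-84 ellipsoid (the ellipsoid choice is an assumption; the page does not state which model it uses, and the function name `vincenty_inverse` is illustrative):

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# YST (53°50′44″N, 94°51′6″W) to PSG (56°48′6″N, 132°56′42″W)
yst = (53 + 50/60 + 44/3600, -(94 + 51/60 + 6/3600))
psg = (56 + 48/60 + 6/3600, -(132 + 56/60 + 42/3600))
meters = vincenty_inverse(yst[0], yst[1], psg[0], psg[1])
print(round(meters / 1000, 3))  # close to the 2408.280 km quoted above
```

The iteration refines the longitude difference on the auxiliary sphere until it converges, which is why Vincenty gives sub-millimeter accuracy on the ellipsoid where a spherical model cannot.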
Haversine formula
- 1491.439 miles
- 2400.239 kilometers
- 1296.025 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
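The haversine calculation is short enough to sketch in full. This version assumes a mean Earth radius of 6371 km (the page does not state which radius it uses, so the last digits may differ slightly):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere with the given mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# YST to PSG, coordinates converted from the degrees/minutes/seconds below
km = haversine_km(53 + 50/60 + 44/3600, -(94 + 51/60 + 6/3600),
                  56 + 48/60 + 6/3600, -(132 + 56/60 + 42/3600))
print(round(km, 1), round(km / 1.609344, 1))  # roughly 2400 km / 1491 mi
```

The ~8 km gap between this result and Vincenty's reflects the spherical approximation, not a bug in either method.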
How long does it take to fly from St. Theresa Point to Petersburg?
The estimated flight time from St. Theresa Point Airport to Petersburg James A. Johnson Airport is 3 hours and 19 minutes.
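The page does not say how it estimates flight time. One simple model that happens to reproduce the quoted figure is distance divided by an average block speed of about 450 mph; that speed is purely an assumption for illustration:

```python
def flight_time(distance_miles, avg_speed_mph=450.0):
    """Rough block-time estimate; the 450 mph average speed is an assumption."""
    total_min = distance_miles / avg_speed_mph * 60
    hours, minutes = divmod(int(total_min), 60)
    return hours, minutes

print(flight_time(1496.436))  # (3, 19) under this assumed average speed
```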
What is the time difference between St. Theresa Point and Petersburg?
St. Theresa Point (Manitoba) observes Central Time and Petersburg observes Alaska Time, so Petersburg is 3 hours behind St. Theresa Point; both locations observe daylight saving time, so the offset holds year-round.
Flight carbon footprint between St. Theresa Point Airport (YST) and Petersburg James A. Johnson Airport (PSG)
On average, flying from St. Theresa Point to Petersburg generates about 179 kg (395 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
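The pound figure follows directly from the kilogram estimate, using the exact definition of the international pound (0.45359237 kg):

```python
KG_PER_LB = 0.45359237  # exact definition of the international pound

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

print(round(kg_to_lb(179)))  # 395, matching the figure quoted above
```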
Map of flight path from St. Theresa Point to Petersburg
See the map of the shortest flight path between St. Theresa Point Airport (YST) and Petersburg James A. Johnson Airport (PSG).
Airport information
| Origin | St. Theresa Point Airport |
| --- | --- |
| City: | St. Theresa Point |
| Country: | Canada |
| IATA Code: | YST |
| ICAO Code: | CYST |
| Coordinates: | 53°50′44″N, 94°51′6″W |
| Destination | Petersburg James A. Johnson Airport |
| --- | --- |
| City: | Petersburg, AK |
| Country: | United States |
| IATA Code: | PSG |
| ICAO Code: | PAPG |
| Coordinates: | 56°48′6″N, 132°56′42″W |