How far is Petersburg, AK, from London?
The distance between London (London Heathrow Airport) and Petersburg (Petersburg James A. Johnson Airport) is 4499 miles / 7240 kilometers / 3909 nautical miles.
London Heathrow Airport – Petersburg James A. Johnson Airport
Distance from London to Petersburg
There are several ways to calculate the distance from London to Petersburg. Here are two standard methods:
Vincenty's formula (applied above)
- 4498.573 miles
- 7239.751 kilometers
- 3909.153 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
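As an illustration only (a sketch, not necessarily the exact implementation used for the figure above), Vincenty's inverse formula can be written directly in Python using the standard WGS-84 ellipsoid constants; the LHR and PSG coordinates are taken from the airport tables further down the page.

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Distance in metres between two points on the WGS-84 ellipsoid
    using Vincenty's inverse formula (iterative)."""
    a = 6378137.0                      # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563              # WGS-84 flattening
    b = (1 - f) * a                    # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):               # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                 # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# LHR and PSG coordinates in decimal degrees (from the airport tables below)
metres = vincenty_distance_m(51.4706, -0.4617, 56.8017, -132.9450)
print(metres / 1000, "km")             # ≈ 7,240 km
print(metres / 1609.344, "miles")      # ≈ 4,499 statute miles
```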
Haversine formula
- 4483.923 miles
- 7216.174 kilometers
- 3896.422 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
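A minimal Python sketch of the haversine formula, assuming a mean Earth radius of 3,958.8 miles (6,371 km); choosing a slightly different radius shifts the result by a few miles.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance on a sphere (assumed mean Earth radius)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_miles * math.asin(math.sqrt(h))

print(haversine_miles(51.4706, -0.4617, 56.8017, -132.9450))  # ≈ 4,484 miles
```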
How long does it take to fly from London to Petersburg?
The estimated flight time from London Heathrow Airport to Petersburg James A. Johnson Airport is 9 hours and 1 minute.
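The page does not state its estimation method; a common rule of thumb, sketched below, divides the straight-line distance by an assumed average ground speed of about 500 mph, which lands very close to the figure above.

```python
# Assumed rule of thumb (not necessarily this page's exact method):
# divide the great-circle distance by a typical average ground speed.
distance_miles = 4498.573
avg_speed_mph = 500                                  # assumption: typical long-haul average
total_min = round(distance_miles / avg_speed_mph * 60)
print(f"{total_min // 60} h {total_min % 60} min")   # ≈ 9 h 0 min, close to the 9 h 1 min quoted
```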
What is the time difference between London and Petersburg?
The time difference between London and Petersburg is 9 hours. Petersburg is 9 hours behind London.
Flight carbon footprint between London Heathrow Airport (LHR) and Petersburg James A. Johnson Airport (PSG)
On average, flying from London to Petersburg generates about 519 kg of CO2 per passenger, which is roughly 1,145 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
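As a rough cross-check of the quoted numbers (not the emissions model itself), they imply about 0.07 kg of CO2 per passenger-kilometre on this route, and the pound value is a straight unit conversion.

```python
# Back-of-the-envelope figures implied by the numbers quoted above.
co2_kg = 519
distance_km = 7240
print(round(co2_kg / distance_km, 4), "kg CO2 per passenger-km")  # ≈ 0.0717
print(round(co2_kg * 2.20462), "lbs")  # ≈ 1,144 lbs; the page's 1,145 reflects the unrounded kg value
```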
Map of flight path from London to Petersburg
See the map of the shortest flight path between London Heathrow Airport (LHR) and Petersburg James A. Johnson Airport (PSG).
Airport information
| Origin | London Heathrow Airport |
|---|---|
| City | London |
| Country | United Kingdom |
| IATA Code | LHR |
| ICAO Code | EGLL |
| Coordinates | 51°28′14″N, 0°27′42″W |
| Destination | Petersburg James A. Johnson Airport |
|---|---|
| City | Petersburg, AK |
| Country | United States |
| IATA Code | PSG |
| ICAO Code | PAPG |
| Coordinates | 56°48′6″N, 132°56′42″W |