How far is London from Paamiut?
The distance between Paamiut (Paamiut Airport) and London (London International Airport) is 1833 miles / 2949 kilometers / 1592 nautical miles.
Paamiut Airport – London International Airport
Distance from Paamiut to London
There are several ways to calculate the distance from Paamiut to London. Here are two standard methods:
Vincenty's formula (applied above)
- 1832.587 miles
- 2949.263 kilometers
- 1592.475 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
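Vincenty's method is iterative and fiddly to hand-roll, so as a sketch, the same ellipsoidal distance can be reproduced with the third-party geopy package (an assumption here, not something this page states it uses). Its geodesic distance is computed on the WGS-84 ellipsoid with Karney's algorithm, which agrees with Vincenty's result to well under a millimeter at these distances:

```python
# pip install geopy
from geopy.distance import geodesic

# Decimal-degree coordinates derived from the airport tables below
jfr = (62.014722, -49.670833)   # Paamiut Airport (JFR)
yxu = (43.035556, -81.153889)   # London International Airport (YXU)

d = geodesic(jfr, yxu)  # distance on the WGS-84 ellipsoid
print(f"{d.miles:.3f} mi / {d.kilometers:.3f} km / {d.nautical:.3f} NM")
# Expected to land very close to the Vincenty figures above (~1832.6 mi)
```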
Haversine formula
- 1829.078 miles
- 2943.616 kilometers
- 1589.425 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
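The haversine calculation is compact enough to show in full. A minimal Python sketch, assuming a mean earth radius of 3,958.8 miles, with coordinates taken from the airport tables below:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance between two lat/lon points on a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_miles * math.asin(math.sqrt(a))

jfr = (62.014722, -49.670833)   # Paamiut Airport (JFR)
yxu = (43.035556, -81.153889)   # London International Airport (YXU)
print(f"{haversine_miles(*jfr, *yxu):.3f} miles")  # ~1829 miles, matching the figure above
```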
How long does it take to fly from Paamiut to London?
The estimated flight time from Paamiut Airport to London International Airport is 3 hours and 58 minutes.
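The page does not publish the formula behind this estimate. A common rule of thumb, shown here purely as an illustrative sketch, is cruise time at a typical jet speed plus a fixed allowance for taxi, climb, and descent; the 500 mph cruise speed and 30-minute overhead below are assumptions, which is why the result differs slightly from the 3 hours 58 minutes quoted above:

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent allowance."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours and {minutes} minutes"

print(estimate_flight_time(1833))  # "4 hours and 10 minutes" under these assumptions
```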
What is the time difference between Paamiut and London?
The time difference between Paamiut and London is 3 hours. London is 3 hours behind Paamiut.
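Time-zone offsets shift with daylight saving rules, so the difference is best computed from the IANA time-zone database rather than hard-coded. A sketch using Python's standard zoneinfo module, assuming Paamiut falls under America/Nuuk and London, Ontario under America/Toronto:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

now = datetime.now(tz=ZoneInfo("UTC"))
paamiut_offset = now.astimezone(ZoneInfo("America/Nuuk")).utcoffset()
london_offset = now.astimezone(ZoneInfo("America/Toronto")).utcoffset()
diff_hours = (paamiut_offset - london_offset).total_seconds() / 3600
print(f"London is {diff_hours:g} hours behind Paamiut")
```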
Flight carbon footprint between Paamiut Airport (JFR) and London International Airport (YXU)
On average, flying from Paamiut to London generates about 203 kg of CO2 per passenger, which is equal to roughly 447 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
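As a quick sanity check on the unit conversion (1 lb = 0.45359237 kg):

```python
KG_PER_LB = 0.45359237
co2_kg = 203
print(f"{co2_kg} kg = {co2_kg / KG_PER_LB:.1f} lbs")  # 447.5 lbs, rounded to 447 above
```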
Map of flight path from Paamiut to London
See the map of the shortest flight path between Paamiut Airport (JFR) and London International Airport (YXU).
Airport information
Origin | Paamiut Airport
---|---
City: | Paamiut
Country: | Greenland
IATA Code: | JFR
ICAO Code: | BGPT
Coordinates: | 62°0′53″N, 49°40′15″W
Destination | London International Airport
---|---
City: | London
Country: | Canada
IATA Code: | YXU
ICAO Code: | CYXU
Coordinates: | 43°2′8″N, 81°9′14″W
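The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on the page need them as signed decimal degrees. A small conversion sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Coordinates from the tables above
print(dms_to_decimal(62, 0, 53, "N"), dms_to_decimal(49, 40, 15, "W"))  # JFR: 62.0147, -49.6708
print(dms_to_decimal(43, 2, 8, "N"), dms_to_decimal(81, 9, 14, "W"))    # YXU: 43.0356, -81.1539
```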