How far is Paamiut from London?
The distance between London (London International Airport) and Paamiut (Paamiut Airport) is 1833 miles / 2949 kilometers / 1592 nautical miles.
London International Airport – Paamiut Airport
Distance from London to Paamiut
There are several ways to calculate the distance from London to Paamiut. Here are two standard methods:
Vincenty's formula (applied above)
- 1832.587 miles
- 2949.263 kilometers
- 1592.475 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
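For reference, an ellipsoidal-model distance can be reproduced with a few lines of Python. The sketch below uses geographiclib, which implements Karney's algorithm rather than Vincenty's iteration but works on the same WGS-84 ellipsoid, and decimal-degree coordinates converted from the airport tables further down. It is an illustrative sketch, not the calculator used above.

```python
# Sketch: ellipsoidal (WGS-84) distance between YXU and JFR using geographiclib.
# Note: geographiclib uses Karney's algorithm, not Vincenty's iteration, but both
# model the earth as an ellipsoid and agree very closely for non-antipodal points.
from geographiclib.geodesic import Geodesic

# Airport coordinates in decimal degrees (converted from the DMS values in the tables below).
YXU = (43.0356, -81.1539)   # London International Airport
JFR = (62.0147, -49.6708)   # Paamiut Airport

result = Geodesic.WGS84.Inverse(YXU[0], YXU[1], JFR[0], JFR[1])
meters = result["s12"]  # geodesic distance in meters

print(f"{meters / 1609.344:.1f} miles")
print(f"{meters / 1000:.1f} kilometers")      # closely matches the Vincenty figure above
print(f"{meters / 1852:.1f} nautical miles")
```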
Haversine formula
- 1829.078 miles
- 2943.616 kilometers
- 1589.425 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance over the earth's surface between two points).
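As a rough sketch, the haversine calculation can be reproduced in Python using the airport coordinates from the tables below (converted to decimal degrees) and an assumed mean earth radius of 6,371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a spherical earth (haversine formula)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# London International Airport (YXU) to Paamiut Airport (JFR), decimal degrees
km = haversine_km(43.0356, -81.1539, 62.0147, -49.6708)
# Roughly 2944 km, matching the haversine figure above to within coordinate rounding.
print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} NM")
```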
How long does it take to fly from London to Paamiut?
The estimated flight time from London International Airport to Paamiut Airport is 3 hours and 58 minutes.
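The page does not state how this figure is derived; a common approach is to divide the distance by an assumed average block speed. The sketch below uses a hypothetical 460 mph average, chosen only because it lands near the quoted figure:

```python
def estimate_flight_time(distance_miles, avg_speed_mph=460.0):
    """Rough block-time estimate: distance divided by an assumed average speed.
    The 460 mph default is an assumption, not the site's published method."""
    hours = distance_miles / avg_speed_mph
    whole_hours = int(hours)
    minutes = round((hours - whole_hours) * 60)
    return whole_hours, minutes

h, m = estimate_flight_time(1832.587)
print(f"approx. {h} h {m} min")  # ~3 h 59 min, close to the quoted 3 h 58 min
```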
What is the time difference between London and Paamiut?
The time difference between London and Paamiut is 3 hours: Paamiut is 3 hours ahead of London.
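If you want to check the offset yourself, it can be computed from the IANA time zone database. The sketch below assumes London, Ontario falls in the America/Toronto zone and Paamiut in America/Nuuk:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# Assumed IANA zones: London, Ontario uses America/Toronto; Paamiut uses America/Nuuk.
now = datetime.now(tz=ZoneInfo("UTC"))
london_offset = now.astimezone(ZoneInfo("America/Toronto")).utcoffset()
paamiut_offset = now.astimezone(ZoneInfo("America/Nuuk")).utcoffset()

diff_hours = (paamiut_offset - london_offset).total_seconds() / 3600
print(f"Paamiut is {diff_hours:+.0f} hours relative to London (Ontario)")
```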
Flight carbon footprint between London International Airport (YXU) and Paamiut Airport (JFR)
On average, flying from London to Paamiut generates about 203 kg of CO2 per passenger, which is equivalent to 447 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
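The page does not publish its emission factor. The sketch below uses an assumed factor of about 0.0688 kg of CO2 per passenger-kilometre, chosen only because it reproduces the quoted figure, and shows the kilogram-to-pound conversion:

```python
KG_PER_LB = 0.45359237

def co2_estimate_kg(distance_km, kg_per_pax_km=0.0688):
    """Per-passenger CO2 from jet fuel only; the emission factor is an assumption."""
    return distance_km * kg_per_pax_km

kg = co2_estimate_kg(2949.263)
print(f"{kg:.0f} kg CO2 is about {kg / KG_PER_LB:.0f} lbs")  # ~203 kg, ~447 lbs
```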
Map of flight path from London to Paamiut
See the map of the shortest flight path between London International Airport (YXU) and Paamiut Airport (JFR).
Airport information
| Origin | London International Airport |
| --- | --- |
| City: | London |
| Country: | Canada |
| IATA Code: | YXU |
| ICAO Code: | CYXU |
| Coordinates: | 43°2′8″N, 81°9′14″W |
| Destination | Paamiut Airport |
| --- | --- |
| City: | Paamiut |
| Country: | Greenland |
| IATA Code: | JFR |
| ICAO Code: | BGPT |
| Coordinates: | 62°0′53″N, 49°40′15″W |
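The coordinates above are given in degrees, minutes and seconds. A small helper (hypothetical, not part of this page) converts them to the decimal degrees used in the distance sketches earlier:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# YXU: 43°2′8″N, 81°9′14″W  -> approximately (43.0356, -81.1539)
print(dms_to_decimal(43, 2, 8, "N"), dms_to_decimal(81, 9, 14, "W"))
# JFR: 62°0′53″N, 49°40′15″W -> approximately (62.0147, -49.6708)
print(dms_to_decimal(62, 0, 53, "N"), dms_to_decimal(49, 40, 15, "W"))
```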