How far is London from Punta Cana?
The distance between Punta Cana (Punta Cana International Airport) and London (London Stansted Airport) is 4312 miles / 6940 kilometers / 3747 nautical miles.
Punta Cana International Airport – London Stansted Airport
Distance from Punta Cana to London
There are several ways to calculate the distance from Punta Cana to London. Here are two standard methods:
Vincenty's formula (applied above)
- 4312.211 miles
- 6939.831 kilometers
- 3747.209 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
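For reference, an ellipsoidal distance like the one above can be reproduced with a geodesic library. The sketch below is an assumption about tooling: it uses pyproj's Geod on the WGS-84 ellipsoid (Karney's algorithm rather than Vincenty's original iteration, though the two agree to well under a metre), with the PUJ and STN coordinates listed in the airport information section.

```python
from pyproj import Geod

# PUJ and STN coordinates in decimal degrees, converted from the airport tables below
puj_lat, puj_lon = 18.5672, -68.3633   # 18°34′2″N, 68°21′48″W
stn_lat, stn_lon = 51.8847, 0.2347     # 51°53′5″N, 0°14′5″E

geod = Geod(ellps="WGS84")             # ellipsoidal Earth model
_, _, dist_m = geod.inv(puj_lon, puj_lat, stn_lon, stn_lat)   # distance in metres

print(f"{dist_m / 1609.344:.3f} miles")        # roughly 4312 mi
print(f"{dist_m / 1000:.3f} kilometers")       # roughly 6940 km
print(f"{dist_m / 1852:.3f} nautical miles")   # roughly 3747 NM
```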
Haversine formula
- 4308.131 miles
- 6933.265 kilometers
- 3743.664 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
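As a minimal sketch, the haversine figure can be reproduced in a few lines of Python. The mean Earth radius used here (6,371 km) is an assumption; small changes to it shift the result by a few kilometers, which is why spherical and ellipsoidal figures differ.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a spherical Earth (haversine formula)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# PUJ -> STN, using the coordinates from the airport tables below
km = haversine_km(18.5672, -68.3633, 51.8847, 0.2347)
print(f"{km:.1f} km, {km / 1.609344:.1f} mi, {km / 1.852:.1f} NM")
# Roughly 6,933-6,935 km depending on the Earth radius assumed
```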
How long does it take to fly from Punta Cana to London?
The estimated flight time from Punta Cana International Airport to London Stansted Airport is 8 hours and 39 minutes.
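The page does not state how the flight time is estimated. A common rough model simply divides the distance by a typical jet cruising speed; the ~500 mph figure below is an assumption, so the result only approximates the 8 hours 39 minutes quoted above.

```python
# Rough flight-time estimate: distance / assumed average speed.
# The ~500 mph cruise speed is an assumption, not the site's published model.
distance_miles = 4312.211
avg_speed_mph = 500

hours = distance_miles / avg_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"Estimated flight time: {h} h {m} min")   # about 8 h 37 min
```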
What is the time difference between Punta Cana and London?
London is 4 hours ahead of Punta Cana. Punta Cana stays on Atlantic Standard Time (UTC−4) all year, so the difference grows to 5 hours while the UK observes British Summer Time.
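A minimal sketch for checking the current offset uses Python's zoneinfo module (Python 3.9+); mapping the two cities to the IANA zones America/Santo_Domingo and Europe/London is the only assumption.

```python
from datetime import datetime
from zoneinfo import ZoneInfo   # Python 3.9+

now_utc = datetime.now(ZoneInfo("UTC"))
punta_cana = now_utc.astimezone(ZoneInfo("America/Santo_Domingo"))  # UTC-4 year-round
london = now_utc.astimezone(ZoneInfo("Europe/London"))              # UTC+0, or UTC+1 during BST

offset_hours = (london.utcoffset() - punta_cana.utcoffset()).total_seconds() / 3600
print(f"London is {offset_hours:+.0f} hours relative to Punta Cana")  # +4, or +5 during BST
```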
Flight carbon footprint between Punta Cana International Airport (PUJ) and London Stansted Airport (STN)
On average, flying from Punta Cana to London generates about 496 kg of CO2 per passenger; 496 kilograms equals roughly 1,093 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
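The site's emissions methodology is not published. The sketch below only back-calculates an implied per-kilometre factor from the figures above and converts the total to pounds, so both the factor and the approach are illustrative assumptions rather than the actual calculation.

```python
# Implied per-passenger emission factor, back-calculated from the figures above.
# The site's actual methodology is not published, so treat this as illustrative only.
co2_kg = 496
distance_km = 6939.831

factor_kg_per_km = co2_kg / distance_km
print(f"~{factor_kg_per_km * 1000:.0f} g CO2 per passenger-km")   # ~71 g per passenger-km
print(f"{co2_kg * 2.20462:.0f} lbs")                              # ~1,093 lbs
```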
Map of flight path from Punta Cana to London
See the map of the shortest flight path between Punta Cana International Airport (PUJ) and London Stansted Airport (STN).
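If you want to draw the shortest flight path yourself, one option (an assumption about tooling; the site's own map is interactive) is matplotlib with cartopy, where the Geodetic transform makes the plotted segment follow the great circle rather than a straight chart line.

```python
import matplotlib.pyplot as plt
import cartopy.crs as ccrs

puj = (-68.3633, 18.5672)   # lon, lat for PUJ
stn = (0.2347, 51.8847)     # lon, lat for STN

ax = plt.axes(projection=ccrs.PlateCarree())
ax.set_extent([-90, 20, 5, 65])   # Caribbean to northwest Europe
ax.coastlines()

# Geodetic transform draws the great-circle path between the two airports
ax.plot([puj[0], stn[0]], [puj[1], stn[1]], transform=ccrs.Geodetic(), color="red")
ax.plot(*puj, "bo", transform=ccrs.PlateCarree())
ax.plot(*stn, "bo", transform=ccrs.PlateCarree())

plt.savefig("puj_stn_great_circle.png", dpi=150)
```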
Airport information
| Origin | Punta Cana International Airport |
| --- | --- |
| City | Punta Cana |
| Country | Dominican Republic |
| IATA Code | PUJ |
| ICAO Code | MDPC |
| Coordinates | 18°34′2″N, 68°21′48″W |

| Destination | London Stansted Airport |
| --- | --- |
| City | London |
| Country | United Kingdom |
| IATA Code | STN |
| ICAO Code | EGSS |
| Coordinates | 51°53′5″N, 0°14′5″E |