
How far is London from Puerto Asís?

The distance between Puerto Asís (Tres de Mayo Airport) and London (London International Airport) is 2941 miles / 4734 kilometers / 2556 nautical miles.

Tres de Mayo Airport – London International Airport

2941 miles / 4734 kilometers / 2556 nautical miles


Distance from Puerto Asís to London

There are several ways to calculate the distance from Puerto Asís to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 2941.343 miles
  • 4733.632 kilometers
  • 2555.957 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
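For illustration, the same ellipsoidal (WGS-84) distance can be reproduced in a few lines of Python with the geographiclib package, which implements Karney's algorithm, a modern refinement of Vincenty's method; the coordinates are the airport positions listed at the bottom of this page. This is only a sketch, not the calculator's own code.

    # Ellipsoidal (WGS-84) distance from PUU to YXU using geographiclib
    # (Karney's algorithm, a successor to Vincenty's formula).
    from geographiclib.geodesic import Geodesic

    puu = (0.50500, -76.50056)    # Tres de Mayo Airport (PUU), decimal degrees
    yxu = (43.03556, -81.15389)   # London International Airport (YXU), decimal degrees

    result = Geodesic.WGS84.Inverse(puu[0], puu[1], yxu[0], yxu[1])
    meters = result["s12"]        # geodesic distance in meters

    print(f"{meters / 1609.344:.3f} miles")          # ~2941 miles
    print(f"{meters / 1000:.3f} kilometers")         # ~4734 kilometers
    print(f"{meters / 1852:.3f} nautical miles")     # ~2556 nautical miles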

Haversine formula
  • 2952.649 miles
  • 4751.828 kilometers
  • 2565.782 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
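A minimal, self-contained haversine sketch (assuming a mean Earth radius of 6,371 km and the airport coordinates listed below) reproduces the figures above to within rounding:

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two points on a spherical Earth."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return radius_km * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

    km = haversine_km(0.50500, -76.50056, 43.03556, -81.15389)  # PUU -> YXU
    print(f"{km:.1f} km, {km / 1.609344:.1f} mi, {km / 1.852:.1f} nmi")  # ~4752 km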

How long does it take to fly from Puerto Asís to London?

The estimated flight time from Tres de Mayo Airport to London International Airport is 6 hours and 4 minutes.
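The calculator does not publish its assumptions, but as a rough check, covering 2,941 miles in 6 hours 4 minutes corresponds to an average block speed of about 485 mph. Here is a sketch that derives the estimate from an assumed average speed (the 485 mph figure is back-calculated from the numbers above, not the site's stated constant):

    def flight_time(distance_miles, avg_speed_mph=485.0):
        """Rough block-time estimate; the speed is an assumption, not the site's constant."""
        hours = distance_miles / avg_speed_mph
        return int(hours), round((hours - int(hours)) * 60)

    h, m = flight_time(2941.343)
    print(f"~{h} h {m:02d} min")   # ~6 h 04 min under the 485 mph assumption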

What is the time difference between Puerto Asís and London?

Puerto Asís (UTC−5 year-round) and London, Ontario (UTC−5 in standard time) keep the same clock time for most of the year, so there is no time difference; during daylight saving time London moves to UTC−4 and is one hour ahead.
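A quick check with Python's standard zoneinfo module, assuming the America/Bogota zone for Puerto Asís and America/Toronto for London, Ontario:

    from datetime import datetime
    from zoneinfo import ZoneInfo

    zones = ("America/Bogota", "America/Toronto")   # Puerto Asís, London (Ontario)
    for month, label in ((1, "January"), (7, "July")):
        moment = datetime(2024, month, 15, 12, 0)
        offsets = [moment.replace(tzinfo=ZoneInfo(z)).utcoffset().total_seconds() / 3600
                   for z in zones]
        print(label, offsets)   # January: [-5.0, -5.0]; July: [-5.0, -4.0]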

Flight carbon footprint between Tres de Mayo Airport (PUU) and London International Airport (YXU)

On average, flying from Puerto Asís to London generates about 327 kg of CO2 per passenger (327 kilograms is equal to 721 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
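For reference, the kilogram-to-pound conversion, together with the per-kilometre emissions factor implied by this page's figures (327 kg over roughly 4,734 km, about 0.069 kg CO2 per passenger-kilometre; that factor is back-calculated here and is not an official value):

    KG_PER_LB = 0.45359237

    co2_kg = 327.0
    print(f"{co2_kg / KG_PER_LB:.0f} lbs")   # ~721 lbs

    # Implied per-passenger emissions factor, back-calculated from this page's figures.
    factor = co2_kg / 4733.632               # ~0.069 kg CO2 per passenger-km
    print(f"{factor:.3f} kg CO2 per passenger-km")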

Map of flight path from Puerto Asís to London

See the map of the shortest flight path between Tres de Mayo Airport (PUU) and London International Airport (YXU).

Airport information

Origin: Tres de Mayo Airport
City: Puerto Asís
Country: Colombia
IATA Code: PUU
ICAO Code: SKAS
Coordinates: 0°30′18″N, 76°30′2″W

Destination: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W
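The coordinates above are given in degrees-minutes-seconds; a small helper converts them to the decimal degrees used in the distance sketches earlier on this page:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
        value = degrees + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    # Tres de Mayo Airport (PUU): 0°30′18″N, 76°30′2″W
    print(dms_to_decimal(0, 30, 18, "N"), dms_to_decimal(76, 30, 2, "W"))   # 0.505, -76.5006
    # London International Airport (YXU): 43°2′8″N, 81°9′14″W
    print(dms_to_decimal(43, 2, 8, "N"), dms_to_decimal(81, 9, 14, "W"))    # 43.0356, -81.1539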