How far is London from Kingston?

The distance between Kingston (Norman Manley International Airport) and London, Ontario (London International Airport) is 1748 miles / 2813 kilometers / 1519 nautical miles.

Distance from Kingston to London

There are several ways to calculate the distance from Kingston to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 1748.028 miles
  • 2813.178 kilometers
  • 1518.995 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
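
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It is a sketch rather than the calculator's own code: it omits the guards a production implementation needs for coincident or near-antipodal points, and the decimal coordinates are converted from the DMS values listed under Airport information below.

```python
import math

# WGS-84 ellipsoid parameters
A_AXIS = 6378137.0            # semi-major axis in meters
F = 1 / 298.257223563         # flattening
B_AXIS = (1 - F) * A_AXIS     # semi-minor axis in meters

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in meters between two points (in degrees) on WGS-84."""
    phi1, lam1, phi2, lam2 = map(math.radians, (lat1, lon1, lat2, lon2))
    U1 = math.atan((1 - F) * math.tan(phi1))  # reduced latitude of point 1
    U2 = math.atan((1 - F) * math.tan(phi2))  # reduced latitude of point 2
    L = lam2 - lam1
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    return B_AXIS * A * (sigma - delta_sigma)

# KIN (17°56′8″N, 76°47′14″W) to YXU (43°2′8″N, 81°9′14″W), west = negative
meters = vincenty_inverse(17.935556, -76.787222, 43.035556, -81.153889)
print(f"{meters / 1000:.3f} km")   # ≈ 2813 km, matching the Vincenty figure above
```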

Haversine formula
  • 1752.979 miles
  • 2821.147 kilometers
  • 1523.297 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
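
The haversine formula is compact enough to show in full. Again, this is a sketch rather than the calculator's own code, using the commonly quoted mean Earth radius of 6371 km:

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius, a common choice for the spherical model

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two points given in degrees."""
    phi1, lam1, phi2, lam2 = map(math.radians, (lat1, lon1, lat2, lon2))
    a = (math.sin((phi2 - phi1) / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin((lam2 - lam1) / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# KIN (17°56′8″N, 76°47′14″W) to YXU (43°2′8″N, 81°9′14″W), west = negative
km = haversine_km(17.935556, -76.787222, 43.035556, -81.153889)
print(f"{km:.3f} km")  # ≈ 2821 km, matching the haversine figure above
```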

How long does it take to fly from Kingston to London?

The estimated flight time from Norman Manley International Airport to London International Airport is 3 hours and 48 minutes.
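
The method behind this estimate isn't stated, but as a rough sanity check, an assumed average block speed of about 460 mph gives 1748 mi ÷ 460 mph ≈ 3.8 hours, i.e. about 3 hours and 48 minutes.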

What is the time difference between Kingston and London?

Kingston and London share Eastern Standard Time (UTC-5) in winter, so there is no time difference then. However, London observes daylight saving time (UTC-4) while Kingston does not, so London runs one hour ahead of Kingston from mid-March to early November.
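
This is easy to verify with Python's standard zoneinfo module; the IANA zone names below are the standard ones for the two cities:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library since Python 3.9

# IANA zones: Kingston, Jamaica uses America/Jamaica (no DST);
# London, Ontario falls under America/Toronto (observes DST).
KIN_TZ = ZoneInfo("America/Jamaica")
YXU_TZ = ZoneInfo("America/Toronto")

for when in (datetime(2024, 1, 15, 12, 0), datetime(2024, 7, 15, 12, 0)):
    offset_kin = when.replace(tzinfo=KIN_TZ).utcoffset()
    offset_yxu = when.replace(tzinfo=YXU_TZ).utcoffset()
    print(f"{when:%B}: Kingston UTC{offset_kin.total_seconds() / 3600:+.0f}, "
          f"London UTC{offset_yxu.total_seconds() / 3600:+.0f}")
# January: Kingston UTC-5, London UTC-5 (no difference)
# July:    Kingston UTC-5, London UTC-4 (London one hour ahead)
```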

Flight carbon footprint between Norman Manley International Airport (KIN) and London International Airport (YXU)

On average, flying from Kingston to London generates about 196 kg (432 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
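
Dividing that figure by the route distance implies an emission factor of roughly 196 kg ÷ 2813 km ≈ 70 g of CO2 per passenger-kilometer.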

Map of flight path from Kingston to London

See the map of the shortest flight path between Norman Manley International Airport (KIN) and London International Airport (YXU).

Airport information

Origin: Norman Manley International Airport
City: Kingston
Country: Jamaica
IATA Code: KIN
ICAO Code: MKJP
Coordinates: 17°56′8″N, 76°47′14″W
Destination: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W
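
The coordinates above are given in degrees, minutes, and seconds. A small illustrative helper (not part of the site) converts them to the signed decimal degrees used in the distance sketches earlier:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# KIN: 17°56′8″N, 76°47′14″W  ->  ≈ (17.9356, -76.7872)
print(dms_to_decimal(17, 56, 8, "N"), dms_to_decimal(76, 47, 14, "W"))
# YXU: 43°2′8″N, 81°9′14″W    ->  ≈ (43.0356, -81.1539)
print(dms_to_decimal(43, 2, 8, "N"), dms_to_decimal(81, 9, 14, "W"))
```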