How far is London from Hebron, KY?
The distance between Hebron (Cincinnati/Northern Kentucky International Airport) and London (London International Airport) is 331 miles / 532 kilometers / 287 nautical miles.
The driving distance from Hebron (CVG) to London (YXU) is 400 miles / 644 kilometers, and travel time by car is about 8 hours 2 minutes.
Cincinnati/Northern Kentucky International Airport – London International Airport
Distance from Hebron to London
There are several ways to calculate the distance from Hebron to London. Here are two standard methods:
Vincenty's formula (applied above)
- 330.670 miles
- 532.162 kilometers
- 287.344 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
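As a sketch of the ellipsoidal calculation, the inverse geodesic solver in pyproj reproduces the figure above from the airport coordinates listed in the tables below (converted to decimal degrees). pyproj uses Karney's method rather than Vincenty's iteration, but the two agree to well under a metre at this distance; treat this as an illustration, not this site's exact implementation.

```python
from pyproj import Geod

# WGS84 ellipsoid; geod.inv solves the inverse geodesic problem and
# returns (forward azimuth, back azimuth, distance in metres).
geod = Geod(ellps="WGS84")

# CVG (39°2'55"N, 84°40'4"W) and YXU (43°2'8"N, 81°9'14"W) in decimal
# degrees; note that pyproj takes longitude before latitude.
_, _, dist_m = geod.inv(-84.6678, 39.0486, -81.1539, 43.0356)

print(f"{dist_m / 1000:.1f} km / {dist_m / 1609.344:.1f} mi")
# ≈ 532.2 km / 330.7 mi
```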
Haversine formula
- 330.699 miles
- 532.209 kilometers
- 287.370 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
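For illustration, a minimal Python implementation of the haversine formula, fed the same airport coordinates in decimal degrees, lands on the same numbers:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# CVG (39°2'55"N, 84°40'4"W) to YXU (43°2'8"N, 81°9'14"W)
d_km = haversine_km(39.0486, -84.6678, 43.0356, -81.1539)
print(f"{d_km:.1f} km / {d_km * 0.621371:.1f} mi / {d_km * 0.539957:.1f} nm")
# ≈ 532 km / 331 mi / 287 nautical miles
```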
How long does it take to fly from Hebron to London?
The estimated flight time from Cincinnati/Northern Kentucky International Airport to London International Airport is 1 hour and 7 minutes.
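That figure is consistent with a common rule of thumb for short flights (an assumption here, not this site's published method): cruise at roughly 500 mph plus about 30 minutes for climb, descent and taxi.

```python
distance_mi = 331      # great-circle distance CVG -> YXU
cruise_mph = 500       # assumed average cruise speed
overhead_min = 30      # assumed fixed allowance for takeoff and landing

flight_min = distance_mi / cruise_mph * 60 + overhead_min
print(f"≈ {int(flight_min // 60)} h {int(flight_min % 60)} min")
# ≈ 1 h 9 min, in line with the 1 hour 7 minute estimate above
```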
What is the time difference between Hebron and London?
There is no time difference between Hebron and London; both are in the Eastern Time Zone.
Flight carbon footprint between Cincinnati/Northern Kentucky International Airport (CVG) and London International Airport (YXU)
On average, flying from Hebron to London generates about 74 kg of CO2 per passenger, which is equal to about 163 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
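A quick sketch of the unit conversion behind the estimate, plus the per-kilometre emission factor it implies (the site does not publish its methodology, so the factor is only a back-of-the-envelope check):

```python
kg_co2 = 74          # per-passenger estimate quoted above
distance_km = 532    # great-circle distance CVG -> YXU

print(f"{kg_co2} kg = {kg_co2 * 2.20462:.0f} lbs")               # 74 kg = 163 lbs
print(f"implied factor: {kg_co2 / distance_km:.2f} kg CO2 per passenger-km")  # ≈ 0.14
```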
Map of flight path and driving directions from Hebron to London
See the map of the shortest flight path between Cincinnati/Northern Kentucky International Airport (CVG) and London International Airport (YXU).
Airport information
| Origin | Cincinnati/Northern Kentucky International Airport |
|---|---|
| City: | Hebron, KY |
| Country: | United States |
| IATA Code: | CVG |
| ICAO Code: | KCVG |
| Coordinates: | 39°2′55″N, 84°40′4″W |
| Destination | London International Airport |
|---|---|
| City: | London |
| Country: | Canada |
| IATA Code: | YXU |
| ICAO Code: | CYXU |
| Coordinates: | 43°2′8″N, 81°9′14″W |