
How far is Napier from London?

The distance between London (London Heathrow Airport) and Napier (Hawke's Bay Airport) is 11591 miles / 18654 kilometers / 10072 nautical miles.

London Heathrow Airport – Hawke's Bay Airport

Distance: 11591 miles / 18654 kilometers / 10072 nautical miles
Flight time: 22 h 26 min
CO2 emission: 1 558 kg


Distance from London to Napier

There are several ways to calculate the distance from London to Napier. Here are two standard methods:

Vincenty's formula (applied above)
  • 11591.127 miles
  • 18654.111 kilometers
  • 10072.414 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
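
For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name and the decimal coordinates (converted from the airport coordinates listed further down this page) are my own choices, and the result should come out very close to the 18654.111 km figure above; exact agreement depends on the coordinate precision used.

  import math

  def vincenty_distance_km(lat1, lon1, lat2, lon2):
      # WGS-84 ellipsoid parameters
      a = 6378137.0                 # semi-major axis in metres
      f = 1 / 298.257223563        # flattening
      b = (1 - f) * a              # semi-minor axis

      L = math.radians(lon2 - lon1)
      U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
      U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
      sinU1, cosU1 = math.sin(U1), math.cos(U1)
      sinU2, cosU2 = math.sin(U2), math.cos(U2)

      lam = L
      for _ in range(200):          # iterate lambda until it converges
          sin_lam, cos_lam = math.sin(lam), math.cos(lam)
          sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                                (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
          if sin_sigma == 0:
              return 0.0            # coincident points
          cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
          sigma = math.atan2(sin_sigma, cos_sigma)
          sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
          cos2_alpha = 1 - sin_alpha ** 2
          cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
          C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
          lam_prev = lam
          lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma *
                (cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
          if abs(lam - lam_prev) < 1e-12:
              break

      u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
      A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
      B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
      delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
          cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
          B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
      return b * A * (sigma - delta_sigma) / 1000.0   # metres -> kilometres

  # LHR (51°28′14″N, 0°27′42″W) and NPE (39°27′56″S, 176°52′11″E) in decimal degrees
  print(vincenty_distance_km(51.470556, -0.461667, -39.465556, 176.869722))  # ≈ 18654 km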

Haversine formula
  • 11597.499 miles
  • 18664.365 kilometers
  • 10077.951 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
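
A minimal sketch of the haversine formula, assuming a mean Earth radius of 6371 km; small differences from the figures above can come from the exact radius and from coordinate rounding. The decimal coordinates are the same ones used in the Vincenty sketch earlier.

  import math

  def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
      # great-circle distance on a sphere with the given mean radius
      phi1, phi2 = math.radians(lat1), math.radians(lat2)
      dphi = math.radians(lat2 - lat1)
      dlam = math.radians(lon2 - lon1)
      a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
      return 2 * radius_km * math.asin(math.sqrt(a))

  km = haversine_km(51.470556, -0.461667, -39.465556, 176.869722)  # LHR -> NPE
  print(km, km * 0.621371, km * 0.539957)   # ≈ 18664 km, 11597 miles, 10078 nautical miles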

How long does it take to fly from London to Napier?

The estimated flight time from London Heathrow Airport to Hawke's Bay Airport is 22 hours and 26 minutes.
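
The calculator does not publish its timing model, but a figure in this range falls out of a simple assumption: a constant cruise speed plus a fixed allowance for taxi, climb and descent. The sketch below uses assumed parameters (850 km/h cruise, 30 minutes overhead) and lands close to, though not exactly on, the 22 h 26 min quoted above.

  def estimated_flight_time(distance_km, cruise_kmh=850.0, overhead_min=30.0):
      # hypothetical model: constant cruise speed plus a fixed allowance for
      # taxi, climb and descent; both parameter values are assumptions
      total_min = overhead_min + distance_km / cruise_kmh * 60.0
      hours, minutes = divmod(round(total_min), 60)
      return f"{hours} h {minutes} min"

  print(estimated_flight_time(18654))   # "22 h 27 min" with these assumed parameters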

Flight carbon footprint between London Heathrow Airport (LHR) and Hawke's Bay Airport (NPE)

On average, flying from London to Napier generates about 1 558 kg of CO2 per passenger, equivalent to roughly 3 435 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
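
A rough sketch of how such a per-passenger estimate can be reproduced. The emission factor below is an assumption tuned so this route lands near the figure quoted above; it is not the calculator's published methodology, and real factors vary with aircraft type, load factor and cabin class.

  KG_PER_LB = 0.45359237

  def co2_estimate_kg(distance_km, kg_per_pax_km=0.0835):
      # kg_per_pax_km is an assumed per-passenger emission factor (see note above)
      return distance_km * kg_per_pax_km

  kg = co2_estimate_kg(18654)
  print(round(kg), round(kg / KG_PER_LB))   # ≈ 1558 kg, ≈ 3434 lbs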

Map of flight path from London to Napier

See the map of the shortest flight path between London Heathrow Airport (LHR) and Hawke's Bay Airport (NPE).

Airport information

Origin: London Heathrow Airport
City: London
Country: United Kingdom
IATA Code: LHR
ICAO Code: EGLL
Coordinates: 51°28′14″N, 0°27′42″W
Destination: Hawke's Bay Airport
City: Napier
Country: New Zealand
IATA Code: NPE
ICAO Code: NZNR
Coordinates: 39°27′56″S, 176°52′11″E
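
The coordinates above are given in degrees, minutes and seconds, while the distance formulas earlier on this page work in signed decimal degrees. A small conversion sketch (the helper function is my own, not part of the site):

  def dms_to_decimal(degrees, minutes, seconds, hemisphere):
      # convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees
      value = degrees + minutes / 60 + seconds / 3600
      return -value if hemisphere in ("S", "W") else value

  # LHR: 51°28′14″N, 0°27′42″W  ->  (51.470556, -0.461667)
  print(dms_to_decimal(51, 28, 14, "N"), dms_to_decimal(0, 27, 42, "W"))
  # NPE: 39°27′56″S, 176°52′11″E  ->  (-39.465556, 176.869722)
  print(dms_to_decimal(39, 27, 56, "S"), dms_to_decimal(176, 52, 11, "E"))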