How far is Napier from London?
The distance between London (London Gatwick Airport) and Napier (Hawke's Bay Airport) is 11611 miles / 18686 kilometers / 10090 nautical miles.
London Gatwick Airport – Hawke's Bay Airport
Distance from London to Napier
There are several ways to calculate the distance from London to Napier. Here are two standard methods:
Vincenty's formula (applied above)
- 11610.971 miles
- 18686.047 kilometers
- 10089.658 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
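For readers who want to reproduce the ellipsoidal figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name and tolerances are illustrative assumptions; this is a simplified sketch, not the exact code used by the calculator above.

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Geodesic distance in kilometres on the WGS-84 ellipsoid (Vincenty inverse).

    Note: the iteration can fail to converge for nearly antipodal points.
    """
    a = 6378137.0              # WGS-84 semi-major axis (metres)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)

    lam = L
    for _ in range(200):       # iterate lambda until it converges
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                        if cos_sq_alpha else 0.0)
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0   # metres -> kilometres
```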
Haversine formula
- 11617.279 miles
- 18696.198 kilometers
- 10095.139 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
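Here is a minimal Python sketch of the haversine calculation, assuming a mean Earth radius of 6371 km (an assumption; the calculator above may use a slightly different radius). The coordinates in the example are decimal-degree approximations of the airport coordinates listed in the tables below.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres, assuming a spherical Earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    h = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(h))

# Approximate LGW and NPE coordinates in decimal degrees
print(haversine_distance(51.1481, -0.1903, -39.4656, 176.8697))  # roughly 18696 km
```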
How long does it take to fly from London to Napier?
The estimated flight time from London Gatwick Airport to Hawke's Bay Airport is 22 hours and 29 minutes.
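Estimates like this are typically derived from the distance, an assumed average cruise speed, and a fixed allowance for takeoff and landing. The Python sketch below uses an assumed 850 km/h cruise speed and a 30-minute allowance, which happens to reproduce the figure above; the actual parameters used by the calculator are not documented here.

```python
def estimate_flight_time(distance_km, cruise_kmh=850.0, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed takeoff/landing allowance."""
    total_min = distance_km / cruise_kmh * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours and {minutes} minutes"

print(estimate_flight_time(18686))  # -> "22 hours and 29 minutes"
```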
What is the time difference between London and Napier?
The time difference between London and Napier is 13 hours, with Napier 13 hours ahead of London.
Flight carbon footprint between London Gatwick Airport (LGW) and Hawke's Bay Airport (NPE)
On average, flying from London to Napier generates about 1 561 kg (3 442 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
Map of flight path from London to Napier
See the map of the shortest flight path between London Gatwick Airport (LGW) and Hawke's Bay Airport (NPE).
Airport information
| Origin | London Gatwick Airport |
| --- | --- |
| City: | London |
| Country: | United Kingdom |
| IATA Code: | LGW |
| ICAO Code: | EGKK |
| Coordinates: | 51°8′53″N, 0°11′25″W |

| Destination | Hawke's Bay Airport |
| --- | --- |
| City: | Napier |
| Country: | New Zealand |
| IATA Code: | NPE |
| ICAO Code: | NZNR |
| Coordinates: | 39°27′56″S, 176°52′11″E |
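To feed the tabulated coordinates into the distance formulas sketched above, they first need to be converted from degrees/minutes/seconds to decimal degrees. The helper below is a hypothetical illustration, not part of the source data.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter into signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# LGW: 51°8′53″N, 0°11′25″W
lgw_lat = dms_to_decimal(51, 8, 53, "N")    # approx. 51.1481
lgw_lon = dms_to_decimal(0, 11, 25, "W")    # approx. -0.1903

# NPE: 39°27′56″S, 176°52′11″E
npe_lat = dms_to_decimal(39, 27, 56, "S")   # approx. -39.4656
npe_lon = dms_to_decimal(176, 52, 11, "E")  # approx. 176.8697

# These values can be passed to the vincenty_distance or haversine_distance
# sketches shown earlier on this page.
```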