How far is Walvis Bay from London?
The distance between London (London Heathrow Airport) and Walvis Bay (Walvis Bay Airport) is 5206 miles / 8378 kilometers / 4524 nautical miles.
London Heathrow Airport – Walvis Bay Airport
Distance from London to Walvis Bay
There are several ways to calculate the distance from London to Walvis Bay. Here are two standard methods:
Vincenty's formula (applied above)
- 5206.081 miles
- 8378.375 kilometers
- 4523.960 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
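As a rough sketch (not the calculation used on this page), an ellipsoidal distance can be reproduced with the geopy library; its `geodesic()` function uses Karney's algorithm on the WGS-84 ellipsoid, which is comparable to Vincenty's formula and may differ by a few metres. The coordinates are taken from the airport table below.

```python
# Ellipsoidal (WGS-84) distance between LHR and WVB using geopy.
# geopy's geodesic() uses Karney's algorithm, not Vincenty's original
# iteration, so the result may differ slightly from the figure above.
from geopy.distance import geodesic

lhr = (51.4706, -0.4617)    # London Heathrow Airport (51°28′14″N, 0°27′42″W)
wvb = (-22.9797, 14.6453)   # Walvis Bay Airport (22°58′47″S, 14°38′43″E)

d = geodesic(lhr, wvb)
print(f"{d.miles:.3f} miles")
print(f"{d.kilometers:.3f} kilometers")
print(f"{d.nautical:.3f} nautical miles")
```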
Haversine formula
- 5225.256 miles
- 8409.234 kilometers
- 4540.623 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
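For comparison, the great-circle figure can be sketched with a plain haversine implementation. The 6371 km mean earth radius used below is a common convention; a slightly different radius shifts the result by a few miles.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a spherical earth (haversine formula)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    km = 2 * radius_km * math.asin(math.sqrt(a))
    return km / 1.609344  # convert kilometers to statute miles

# LHR -> WVB, coordinates from the airport table below
print(f"{haversine_miles(51.4706, -0.4617, -22.9797, 14.6453):.3f} miles")
```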
How long does it take to fly from London to Walvis Bay?
The estimated flight time from London Heathrow Airport to Walvis Bay Airport is 10 hours and 21 minutes.
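The exact model behind this estimate isn't stated; a common rule of thumb divides the great-circle distance by an assumed average block speed of about 500 mph, which lands within a few minutes of the figure above.

```python
# Rough flight-time estimate: distance / assumed average speed.
# The ~500 mph average block speed is an assumption, not this page's model.
distance_miles = 5206.081
avg_speed_mph = 500.0

hours_float = distance_miles / avg_speed_mph
hours, minutes = int(hours_float), round((hours_float % 1) * 60)
print(f"Estimated flight time: {hours} h {minutes} min")  # roughly 10 h 25 min
```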
What is the time difference between London and Walvis Bay?
The time difference between London and Walvis Bay is 2 hours: Walvis Bay is 2 hours ahead of London (1 hour while the UK observes British Summer Time).
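A quick way to check the current offset is with Python's zoneinfo module; Africa/Windhoek is assumed here as the IANA time zone covering Walvis Bay.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

now = datetime.now(ZoneInfo("UTC"))
london = now.astimezone(ZoneInfo("Europe/London")).utcoffset()
walvis = now.astimezone(ZoneInfo("Africa/Windhoek")).utcoffset()

diff_hours = (walvis - london).total_seconds() / 3600
print(f"Walvis Bay is {diff_hours:+.0f} hours from London")
```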
Flight carbon footprint between London Heathrow Airport (LHR) and Walvis Bay Airport (WVB)
On average, flying from London to Walvis Bay generates about 611 kg of CO2 per passenger, which is roughly 1,346 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
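As an illustrative sketch only: applying a flat per-passenger emission factor to the flight distance gives a figure of the same order. The ~0.073 kg CO2 per passenger-kilometre factor below is an assumption chosen to roughly match the estimate above, not a published methodology.

```python
# Very rough per-passenger CO2 sketch: distance x assumed emission factor.
# The factor is an assumed long-haul average, not this page's model.
distance_km = 8378.375
kg_co2_per_pax_km = 0.073   # assumption for illustration only

co2_kg = distance_km * kg_co2_per_pax_km
print(f"~{co2_kg:.0f} kg CO2 per passenger ({co2_kg * 2.20462:.0f} lbs)")
```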
Map of flight path from London to Walvis Bay
See the map of the shortest flight path between London Heathrow Airport (LHR) and Walvis Bay Airport (WVB).
Airport information
| Origin | London Heathrow Airport |
| --- | --- |
| City: | London |
| Country: | United Kingdom |
| IATA Code: | LHR |
| ICAO Code: | EGLL |
| Coordinates: | 51°28′14″N, 0°27′42″W |
| Destination | Walvis Bay Airport |
| --- | --- |
| City: | Walvis Bay |
| Country: | Namibia |
| IATA Code: | WVB |
| ICAO Code: | FYWB |
| Coordinates: | 22°58′47″S, 14°38′43″E |