
How far is London from Victoria?

The distance between Victoria (Victoria Inner Harbour Airport) and London (London International Airport) is 2049 miles / 3298 kilometers / 1781 nautical miles.

The driving distance from Victoria (YWH) to London (YXU) is 2462 miles / 3963 kilometers, and travel time by car is about 46 hours 53 minutes.

Victoria Inner Harbour Airport – London International Airport

2049 miles / 3298 kilometers / 1781 nautical miles


Distance from Victoria to London

There are several ways to calculate the distance from Victoria to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 2049.428 miles
  • 3298.234 kilometers
  • 1780.904 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
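
For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name and the decimal-degree coordinates (converted from the airport coordinates listed further down the page) are ours, and the result may differ from the published 2049.428 miles by a small amount depending on the exact coordinates and convergence tolerance used.

    from math import atan, atan2, cos, radians, sin, sqrt, tan

    def vincenty_miles(lat1, lon1, lat2, lon2):
        a = 6378137.0                 # WGS-84 semi-major axis (metres)
        f = 1 / 298.257223563         # WGS-84 flattening
        b = (1 - f) * a               # semi-minor axis

        U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
        U2 = atan((1 - f) * tan(radians(lat2)))
        L = radians(lon2 - lon1)
        sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)

        lam = L
        for _ in range(200):          # iterate until lambda converges
            sin_lam, cos_lam = sin(lam), cos(lam)
            sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
            if sin_sigma == 0:
                return 0.0            # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sigma_m +
                                         C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
            B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
        metres = b * A * (sigma - delta_sigma)
        return metres / 1609.344      # metres -> statute miles

    # YWH (48°25′29″N, 123°23′19″W) to YXU (43°2′8″N, 81°9′14″W)
    print(vincenty_miles(48.424722, -123.388611, 43.035556, -81.153889))  # ≈ 2049 miles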

Haversine formula
  • 2043.850 miles
  • 3289.257 kilometers
  • 1776.057 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the sphere's surface).
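
The same route can be checked with a short haversine sketch. The function name and the mean earth radius of 3,958.8 miles (6,371 km) are our assumptions; other radius choices will shift the result slightly from the 2043.850 miles quoted above.

    from math import asin, cos, radians, sin, sqrt

    def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlam = radians(lon2 - lon1)
        h = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
        return 2 * radius_miles * asin(sqrt(h))

    print(haversine_miles(48.424722, -123.388611, 43.035556, -81.153889))  # ≈ 2044 miles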

How long does it take to fly from Victoria to London?

The estimated flight time from Victoria Inner Harbour Airport to London International Airport is 4 hours and 22 minutes.
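
The page does not state how the estimate is derived. A common rule of thumb is cruise time at an assumed average speed plus a fixed allowance for take-off and landing; the sketch below uses 500 mph and 30 minutes, both assumptions of ours, so it only approximates the 4 hours 22 minutes quoted above.

    def estimate_flight_minutes(distance_miles, avg_speed_mph=500, overhead_min=30):
        # cruise time at the assumed average speed, plus a fixed ground/climb allowance
        return distance_miles / avg_speed_mph * 60 + overhead_min

    minutes = estimate_flight_minutes(2049.428)
    h, m = divmod(round(minutes), 60)
    print(f"{h} h {m} min")  # 4 h 36 min under these assumed parameters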

Flight carbon footprint between Victoria Inner Harbour Airport (YWH) and London International Airport (YXU)

On average, flying from Victoria to London generates about 223 kg of CO2 per passenger, which is equal to 492 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
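
The conversion and a per-mile figure follow directly from the numbers above (1 kg = 2.20462 lb); the per-mile intensity is our derivation from the page's values, not a published statistic.

    co2_kg = 223
    co2_lb = co2_kg * 2.20462        # ≈ 492 lb, matching the figure above
    per_mile = co2_kg / 2049.428     # ≈ 0.109 kg of CO2 per mile flown (derived)
    print(round(co2_lb), round(per_mile, 3))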

Map of flight path and driving directions from Victoria to London

See the map of the shortest flight path between Victoria Inner Harbour Airport (YWH) and London International Airport (YXU).

Airport information

Origin: Victoria Inner Harbour Airport
City: Victoria
Country: Canada
IATA Code: YWH
ICAO Code: CYWH
Coordinates: 48°25′29″N, 123°23′19″W

Destination: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W
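
The coordinates above are given in degrees, minutes, and seconds. A small helper (ours, not the site's) converts them into the decimal degrees used by the distance functions earlier on the page.

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        value = deg + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    ywh = (dms_to_decimal(48, 25, 29, "N"), dms_to_decimal(123, 23, 19, "W"))  # ≈ (48.4247, -123.3886)
    yxu = (dms_to_decimal(43, 2, 8, "N"), dms_to_decimal(81, 9, 14, "W"))      # ≈ (43.0356, -81.1539)
    print(ywh, yxu)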