
How far is London from San Jose, CA?

The distance between San Jose (San Jose International Airport) and London, Ontario (London International Airport) is 2171 miles / 3493 kilometers / 1886 nautical miles.

The driving distance from San Jose (SJC) to London (YXU) is 2557 miles / 4115 kilometers, and travel time by car is about 44 hours 42 minutes.

San Jose International Airport – London International Airport

2171 miles / 3493 kilometers / 1886 nautical miles

Distance from San Jose to London

There are several ways to calculate the distance from San Jose to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 2170.576 miles
  • 3493.204 kilometers
  • 1886.179 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
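For reference, here is a minimal Python sketch of Vincenty's inverse method. It assumes the WGS-84 ellipsoid parameters (the page does not state which ellipsoid it uses) and takes the SJC and YXU coordinates from the airport information below, converted to decimal degrees.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0             # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a           # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):      # iterate until the longitude term converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        if cos2_alpha != 0:
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        else:
            cos_2sm = 0.0     # both points on the equator
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)   # distance in meters

# SJC and YXU in decimal degrees (from the airport information below)
meters = vincenty_distance(37.3625, -121.928889, 43.035556, -81.153889)
print(f"{meters / 1000:.3f} km")           # ~3493.2 km, matching the figure above
```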

Haversine formula
  • 2165.408 miles
  • 3484.887 kilometers
  • 1881.688 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
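A corresponding haversine sketch, assuming the commonly used mean Earth radius of 6371 km, reproduces the slightly shorter spherical figure:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

print(f"{haversine_distance(37.3625, -121.928889, 43.035556, -81.153889):.3f} km")
# ~3484.9 km -- a few kilometers shorter than the ellipsoidal result above
```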

How long does it take to fly from San Jose to London?

The estimated flight time from San Jose International Airport to London International Airport is 4 hours and 36 minutes.
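The page does not say how this estimate is derived. One common approximation, used here purely as an assumption, is an average cruise speed plus a fixed allowance for taxi, takeoff, and landing; with roughly 850 km/h and a 30-minute buffer, the quoted figure falls out:

```python
# Rough flight-time estimate. The cruise speed and buffer are assumptions,
# not the calculator's published method.
CRUISE_KMH = 850        # assumed average cruise speed
BUFFER_MIN = 30         # assumed allowance for takeoff and landing

distance_km = 3493.204  # Vincenty distance from above
total_min = distance_km / CRUISE_KMH * 60 + BUFFER_MIN
print(f"{int(total_min // 60)} h {int(total_min % 60)} min")   # 4 h 36 min
```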

Flight carbon footprint between San Jose International Airport (SJC) and London International Airport (YXU)

On average, flying from San Jose to London generates about 237 kg (522 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
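The unit conversion and the implied per-kilometer emission rate are simple arithmetic; the emission model behind the 237 kg figure itself is not published on the page:

```python
KG_PER_LB = 0.45359237   # exact definition of the pound

co2_kg = 237
distance_km = 3493.204

print(f"{co2_kg / KG_PER_LB:.0f} lbs")                              # ~522 lbs
print(f"{co2_kg / distance_km * 1000:.1f} g CO2 per passenger-km")  # ~67.8 g
```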


Airport information

Origin: San Jose International Airport
City: San Jose, CA
Country: United States
IATA Code: SJC
ICAO Code: KSJC
Coordinates: 37°21′45″N, 121°55′44″W
Destination: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W
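The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier take decimal degrees. A small conversion helper (hypothetical, for illustration) bridges the two:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# SJC: 37°21′45″N, 121°55′44″W
print(dms_to_decimal(37, 21, 45, "N"), dms_to_decimal(121, 55, 44, "W"))
# -> 37.3625 -121.928889 (approximately)

# YXU: 43°2′8″N, 81°9′14″W
print(dms_to_decimal(43, 2, 8, "N"), dms_to_decimal(81, 9, 14, "W"))
# -> 43.035556 -81.153889 (approximately)
```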