
How far is Naxos from San Francisco, CA?

The distance between San Francisco (San Francisco International Airport) and Naxos (Naxos Island National Airport) is 6895 miles / 11096 kilometers / 5991 nautical miles.

San Francisco International Airport – Naxos Island National Airport

  • 6895 miles
  • 11096 kilometers
  • 5991 nautical miles
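The three figures above are the same distance expressed in different units. Taking the kilometer value as the base, the conversions follow from the exact definitions 1 statute mile = 1.609344 km and 1 nautical mile = 1.852 km (a small sketch, not the site's own code):

```python
# Convert a distance in kilometers to statute miles and nautical miles.
# Both conversion factors are exact by definition.
KM_PER_MILE = 1.609344
KM_PER_NMI = 1.852

def km_to_miles(km: float) -> float:
    return km / KM_PER_MILE

def km_to_nmi(km: float) -> float:
    return km / KM_PER_NMI

distance_km = 11096.187  # Vincenty figure from the section below
print(round(km_to_miles(distance_km), 3))  # 6894.851
print(round(km_to_nmi(distance_km), 3))    # 5991.462
```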


Distance from San Francisco to Naxos

There are several ways to calculate the distance from San Francisco to Naxos. Here are two standard methods:

Vincenty's formula (applied above)
  • 6894.851 miles
  • 11096.187 kilometers
  • 5991.462 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
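As a sketch of that ellipsoidal calculation, Vincenty's iterative inverse formula on the WGS-84 ellipsoid can be implemented as follows (a minimal illustration, not the site's own code):

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance in kilometers between two points on the WGS-84
    ellipsoid, via Vincenty's inverse formula."""
    a = 6378137.0           # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563   # WGS-84 flattening
    b = (1 - f) * a         # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha != 0 else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0  # meters -> kilometers

# SFO -> JNX, coordinates from the airport information section below
print(vincenty_inverse(37.618889, -122.375, 37.080833, 25.368056))
```

Run on the SFO and JNX coordinates, this reproduces the ~11096 km figure quoted above.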

Haversine formula
  • 6879.999 miles
  • 11072.286 kilometers
  • 5978.556 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
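A minimal sketch of the haversine calculation, assuming a mean Earth radius of 6371 km (the exact radius used by the site is not stated, so the last few digits may differ):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return radius_km * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# SFO -> JNX, coordinates from the airport information section below;
# prints on the order of 11,072 km (small differences come from the radius choice)
print(haversine_km(37.618889, -122.375, 37.080833, 25.368056))
```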

How long does it take to fly from San Francisco to Naxos?

The estimated flight time from San Francisco International Airport to Naxos Island National Airport is 13 hours and 33 minutes.
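The site does not publish its formula, but a common back-of-the-envelope estimate simply divides the great-circle distance by an assumed average block speed. An average of 509 mph (an assumption chosen here purely to illustrate the arithmetic) reproduces a figure close to the one above:

```python
# Rough flight-time estimate: distance / assumed average speed.
# The 509 mph value is an illustrative assumption, not the site's formula.
distance_miles = 6895
avg_speed_mph = 509

hours_total = distance_miles / avg_speed_mph
hours = int(hours_total)
minutes = round((hours_total - hours) * 60)
print(f"{hours} h {minutes} min")  # 13 h 33 min
```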

Flight carbon footprint between San Francisco International Airport (SFO) and Naxos Island National Airport (JNX)

On average, flying from San Francisco to Naxos generates about 840 kg of CO2 per passenger, which is roughly 1,852 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
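The kilogram-to-pound conversion is straight arithmetic (1 kg ≈ 2.20462 lb):

```python
KG_TO_LB = 2.20462262  # pounds per kilogram

co2_kg = 840
co2_lb = co2_kg * KG_TO_LB
print(round(co2_lb))  # 1852
```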

Map of flight path from San Francisco to Naxos

See the map of the shortest flight path between San Francisco International Airport (SFO) and Naxos Island National Airport (JNX).

Airport information

Origin San Francisco International Airport
City: San Francisco, CA
Country: United States
IATA Code: SFO
ICAO Code: KSFO
Coordinates: 37°37′8″N, 122°22′30″W
Destination Naxos Island National Airport
City: Naxos
Country: Greece
IATA Code: JNX
ICAO Code: LGNX
Coordinates: 37°4′51″N, 25°22′5″E
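The coordinates above are given in degrees, minutes, and seconds; converting them to the decimal degrees used by the distance formulas is a simple weighted sum (a small helper for illustration, taking N/E as positive and S/W as negative):

```python
def dms_to_decimal(degrees: int, minutes: int, seconds: float,
                   hemisphere: str) -> float:
    """Convert degrees/minutes/seconds plus a hemisphere letter
    to signed decimal degrees."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (degrees + minutes / 60 + seconds / 3600)

# SFO: 37°37′8″N, 122°22′30″W
print(dms_to_decimal(37, 37, 8, "N"))    # ≈ 37.6189
print(dms_to_decimal(122, 22, 30, "W"))  # -122.375
```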