
How far is Aomori from San Juan?

The distance between San Juan (San Juan Luis Muñoz Marín International Airport) and Aomori (Aomori Airport) is 8015 miles / 12899 kilometers / 6965 nautical miles.

San Juan Luis Muñoz Marín International Airport – Aomori Airport

  • Distance: 8015 miles / 12899 kilometers / 6965 nautical miles
  • Flight time: 15 h 40 min
  • CO2 emission: 1 001 kg


Distance from San Juan to Aomori

There are several ways to calculate the distance from San Juan to Aomori. Here are two standard methods:

Vincenty's formula (applied above)
  • 8015.083 miles
  • 12899.027 kilometers
  • 6964.917 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
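As a rough illustration of the approach (a minimal sketch, not the calculator's own code), the Python below implements Vincenty's inverse formula on the WGS-84 ellipsoid. Run with the SJU and AOJ coordinates listed under "Airport information", it should come out very close to the 8015-mile figure quoted above.

import math

# Vincenty's inverse formula on the WGS-84 ellipsoid -- illustrative sketch only.
def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0                       # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)   # points on the equator
        C = f / 16.0 * cos2_alpha * (4.0 + f * (4.0 - 3.0 * cos2_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1.0 + 2.0 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - d_sigma)
    return meters / 1609.344                 # metres -> statute miles

# SJU (18°26′21″N, 66°0′6″W) and AOJ (40°44′4″N, 140°41′27″E) in decimal degrees
print(round(vincenty_miles(18.439167, -66.001667, 40.734444, 140.690833), 3))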

Haversine formula
  • 8003.622 miles
  • 12880.581 kilometers
  • 6954.957 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
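A corresponding haversine sketch, assuming the commonly used mean Earth radius of 6 371 km (the exact result depends on which radius you pick), should land close to the ~8004-mile figure above:

import math

# Great-circle (haversine) distance on a sphere -- illustrative only;
# the 6371 km mean Earth radius is an assumed constant.
def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    km = 2 * radius_km * math.asin(math.sqrt(h))
    return km / 1.609344                     # kilometres -> statute miles

# Same SJU / AOJ coordinates as above
print(round(haversine_miles(18.439167, -66.001667, 40.734444, 140.690833), 3))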

How long does it take to fly from San Juan to Aomori?

The estimated flight time from San Juan Luis Muñoz Marín International Airport to Aomori Airport is 15 hours and 40 minutes.
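The site does not publish its timing formula, but as an assumption for illustration, cruising at roughly 850 km/h plus about 30 minutes for climb and descent reproduces a figure of about 15 h 40 min for the 12 899 km distance above:

# Illustrative flight-time estimate. The 850 km/h cruise speed and 30-minute
# overhead are assumed parameters, not the calculator's published method.
def estimate_flight_time(distance_km, cruise_kmh=850.0, overhead_hours=0.5):
    hours = distance_km / cruise_kmh + overhead_hours
    whole_hours = int(hours)
    minutes = int((hours - whole_hours) * 60)   # truncate to whole minutes
    return f"{whole_hours} h {minutes} min"

print(estimate_flight_time(12899))  # about 15 h 40 min with these assumptions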

Flight carbon footprint between San Juan Luis Muñoz Marín International Airport (SJU) and Aomori Airport (AOJ)

On average, flying from San Juan to Aomori generates about 1 001 kg of CO2 per passenger, which is roughly 2 206 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
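The pound figure is just the standard unit conversion of about 2.20462 lb per kg, for example:

KG_TO_LB = 2.20462          # pounds per kilogram

co2_kg = 1001
print(f"{co2_kg} kg is about {int(co2_kg * KG_TO_LB)} lb")  # about 2206 lb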

Map of flight path from San Juan to Aomori

See the map of the shortest flight path between San Juan Luis Muñoz Marín International Airport (SJU) and Aomori Airport (AOJ).

Airport information

Origin: San Juan Luis Muñoz Marín International Airport
City: San Juan
Country: Puerto Rico
IATA Code: SJU
ICAO Code: TJSJ
Coordinates: 18°26′21″N, 66°0′6″W
Destination: Aomori Airport
City: Aomori
Country: Japan
IATA Code: AOJ
ICAO Code: RJSA
Coordinates: 40°44′4″N, 140°41′27″E