
How far is Nakashibetsu from Buenos Aires?

The distance between Buenos Aires (Aeroparque Jorge Newbery) and Nakashibetsu (Nakashibetsu Airport) is 11044 miles / 17773 kilometers / 9597 nautical miles.

Aeroparque Jorge Newbery – Nakashibetsu Airport

Distance: 11044 miles / 17773 kilometers / 9597 nautical miles
Flight time: 21 h 24 min
CO2 emission: 1 468 kg


Distance from Buenos Aires to Nakashibetsu

There are several ways to calculate the distance from Buenos Aires to Nakashibetsu. Here are two standard methods:

Vincenty's formula (applied above)
  • 11043.774 miles
  • 17773.232 kilometers
  • 9596.777 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
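The site does not publish its implementation, but a minimal sketch of the standard Vincenty inverse method on the WGS-84 ellipsoid looks roughly like the Python below. The function name, iteration limits, and the decimal coordinates (converted from the airport details further down the page) are my own choices, so treat this as an illustration rather than the calculator's exact code.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty's inverse method on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):   # may fail to converge for near-antipodal points
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha) if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (2 * cos2sm ** 2 - 1)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    d_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (2 * cos2sm ** 2 - 1) -
        B / 6 * cos2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos2sm ** 2 - 3)))
    return b * A * (sigma - d_sigma)

# AEP (34°33′33″S, 58°24′56″W) to SHB (43°34′38″N, 144°57′36″E)
metres = vincenty_distance(-34.559167, -58.415556, 43.577222, 144.96)
print(metres / 1000)        # ≈ 17 773 km (compare the Vincenty figure above)
print(metres / 1609.344)    # miles
print(metres / 1852)        # nautical miles
```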

Haversine formula
  • 11043.169 miles
  • 17772.258 kilometers
  • 9596.252 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
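The haversine calculation is short enough to sketch in full. As with the Vincenty sketch above, the decimal coordinates are converted from the airport details below, and the 6 371 km mean Earth radius is an assumption on my part, so the output may differ slightly from the figure quoted above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere with the given mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

km = haversine_km(-34.559167, -58.415556, 43.577222, 144.96)
print(km)               # ≈ 17 772 km; the exact value depends on the radius assumed
print(km / 1.609344)    # miles
print(km / 1.852)       # nautical miles
```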

How long does it take to fly from Buenos Aires to Nakashibetsu?

The estimated flight time from Aeroparque Jorge Newbery to Nakashibetsu Airport is 21 hours and 24 minutes.

Flight carbon footprint between Aeroparque Jorge Newbery (AEP) and Nakashibetsu Airport (SHB)

On average, flying from Buenos Aires to Nakashibetsu generates about 1 468 kg of CO2 per passenger; 1 468 kilograms is equal to 3 237 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
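The kilogram-to-pound conversion is plain arithmetic (1 lb = 0.453 592 37 kg). Because the page rounds the kilogram figure before converting, a direct conversion of the rounded value can land a pound or so away from the 3 237 lbs quoted; the snippet below illustrates this under that assumption.

```python
CO2_KG = 1468                 # rounded per-passenger estimate from the page
KG_PER_LB = 0.45359237        # exact definition of the avoirdupois pound

print(round(CO2_KG / KG_PER_LB))  # 3236 — within rounding of the 3 237 lbs shown
```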

Map of flight path from Buenos Aires to Nakashibetsu

See the map of the shortest flight path between Aeroparque Jorge Newbery (AEP) and Nakashibetsu Airport (SHB).

Airport information

Origin: Aeroparque Jorge Newbery
City: Buenos Aires
Country: Argentina
IATA Code: AEP
ICAO Code: SABE
Coordinates: 34°33′33″S, 58°24′56″W
Destination: Nakashibetsu Airport
City: Nakashibetsu
Country: Japan
IATA Code: SHB
ICAO Code: RJCN
Coordinates: 43°34′38″N, 144°57′36″E