How far is Amakusa from Nakashibetsu?
The distance between Nakashibetsu (Nakashibetsu Airport) and Amakusa (Amakusa Airfield) is 1109 miles / 1785 kilometers / 964 nautical miles.
The driving distance from Nakashibetsu (SHB) to Amakusa (AXJ) is 1576 miles / 2537 kilometers, and travel time by car is about 32 hours 12 minutes.
Nakashibetsu Airport – Amakusa Airfield
Distance from Nakashibetsu to Amakusa
There are several ways to calculate the distance from Nakashibetsu to Amakusa. Here are two standard methods:
Vincenty's formula (applied above)
- 1109.230 miles
- 1785.133 kilometers
- 963.895 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
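Implementing Vincenty's iteration by hand is fiddly; in practice a library geodesic solver on the WGS-84 ellipsoid gives the same kind of result. A minimal sketch using geopy (its `geodesic` function uses Karney's method, a modern refinement of Vincenty, so it should agree with the figure above to within a few meters; the decimal coordinates are converted from the airport tables below):

```python
# pip install geopy  (assumed available)
from geopy.distance import geodesic

shb = (43.5772, 144.9600)  # Nakashibetsu Airport, decimal degrees
axj = (32.4822, 130.1589)  # Amakusa Airfield, decimal degrees

# geodesic() defaults to the WGS-84 ellipsoid
d = geodesic(shb, axj)
print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} NM")
```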
Haversine formula
- 1108.786 miles
- 1784.418 kilometers
- 963.508 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
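A minimal runnable sketch of the haversine computation, using the same airport coordinates in decimal degrees (a mean Earth radius of 3958.8 miles is assumed); it reproduces the 1108.8-mile figure quoted above:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points on a spherical Earth, in miles."""
    r = 3958.8  # assumed mean Earth radius in miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

shb = (43.5772, 144.9600)  # Nakashibetsu Airport
axj = (32.4822, 130.1589)  # Amakusa Airfield
print(f"{haversine_miles(*shb, *axj):.1f} miles")  # ~1108.8
```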
How long does it take to fly from Nakashibetsu to Amakusa?
The estimated flight time from Nakashibetsu Airport to Amakusa Airfield is 2 hours and 36 minutes.
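The page does not state how this estimate is derived; a common convention for such calculators is a fixed allowance for taxi, climb, and descent plus cruise time at an assumed average speed. The sketch below uses illustrative constants (30 minutes overhead, 525 mph cruise, both assumptions) that land within a minute of the quoted figure:

```python
def estimated_flight_time(distance_miles, cruise_mph=525, overhead_min=30):
    """Rough flight-time estimate: fixed taxi/climb/descent allowance plus cruise time.
    The default constants are illustrative assumptions, not a published formula."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

print(estimated_flight_time(1109))  # "2 hours 37 minutes" with these assumptions
```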
What is the time difference between Nakashibetsu and Amakusa?
There is no time difference between Nakashibetsu and Amakusa; both observe Japan Standard Time (UTC+9).
Flight carbon footprint between Nakashibetsu Airport (SHB) and Amakusa Airfield (AXJ)
On average, flying from Nakashibetsu to Amakusa generates about 157 kg of CO2 per passenger; 157 kilograms equals about 346 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
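The page does not publish its emissions model; a minimal sketch of a linear per-passenger estimate, with the per-mile rate back-solved from the quoted 157 kg (an illustrative assumption, not the site's actual methodology):

```python
KG_TO_LB = 2.20462

def co2_per_passenger(distance_miles, kg_per_mile=0.1416):
    """Linear per-passenger CO2 estimate. The default per-mile rate is an
    assumed constant back-solved from the 157 kg figure quoted above."""
    kg = distance_miles * kg_per_mile
    return kg, kg * KG_TO_LB

kg, lb = co2_per_passenger(1109)
print(f"~{kg:.0f} kg CO2 (~{lb:.0f} lb) per passenger")  # ~157 kg / ~346 lb
```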
Map of flight path and driving directions from Nakashibetsu to Amakusa
See the map of the shortest flight path between Nakashibetsu Airport (SHB) and Amakusa Airfield (AXJ).
Airport information
| Origin | Nakashibetsu Airport |
|---|---|
| City: | Nakashibetsu |
| Country: | Japan |
| IATA Code: | SHB |
| ICAO Code: | RJCN |
| Coordinates: | 43°34′38″N, 144°57′36″E |
| Destination | Amakusa Airfield |
|---|---|
| City: | Amakusa |
| Country: | Japan |
| IATA Code: | AXJ |
| ICAO Code: | RJDA |
| Coordinates: | 32°28′56″N, 130°9′32″E |
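The coordinates above use degrees-minutes-seconds notation, while the distance formulas earlier need decimal degrees. A small conversion sketch (the regex assumes the exact °′″ format shown in these tables):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 43°34′38″N into signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value  # south and west are negative

print(dms_to_decimal("43°34′38″N"), dms_to_decimal("144°57′36″E"))  # 43.5772... 144.96
```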