
How far is Niigata from Amman?

The distance between Amman (Queen Alia International Airport) and Niigata (Niigata Airport) is 5550 miles / 8931 kilometers / 4823 nautical miles.

The driving distance from Amman (AMM) to Niigata (KIJ) is 7195 miles / 11580 kilometers, and travel time by car is about 141 hours 27 minutes.

Queen Alia International Airport – Niigata Airport

  • 5550 miles
  • 8931 kilometers
  • 4823 nautical miles
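The three figures above are the same distance expressed in different units. As a quick sanity check, the conversion factors below are exact definitions (1 statute mile = 1.609344 km, 1 nautical mile = 1.852 km); the small Python sketch is purely illustrative and is not the calculator's own code.

```python
MILES_TO_KM = 1.609344   # exact: 1 statute mile = 1.609344 km
NM_TO_KM = 1.852         # exact: 1 nautical mile = 1.852 km

distance_miles = 5550
distance_km = distance_miles * MILES_TO_KM   # ≈ 8931.9 km
distance_nm = distance_km / NM_TO_KM         # ≈ 4822.8 nautical miles

print(f"{distance_miles} mi = {distance_km:.0f} km = {distance_nm:.0f} NM")
```

The rounded page figures (8931 km, 4823 NM) come from the unrounded Vincenty result of 5549.635 miles listed below, so they differ slightly from a conversion of the rounded 5550-mile value.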


Distance from Amman to Niigata

There are several ways to calculate the distance from Amman to Niigata. Here are two standard methods:

Vincenty's formula (applied above)
  • 5549.635 miles
  • 8931.271 kilometers
  • 4822.501 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
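For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed in the airport information section below (converted to decimal degrees). It is an illustrative implementation, not the calculator's own code, and it omits the handling of rare near-antipodal cases where the iteration fails to converge.

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sm = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2)
        - B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# Coordinates from the airport information section below, in decimal degrees.
amm = (31.7225, 35.9931)    # Queen Alia International Airport (AMM)
kij = (37.9558, 139.1208)   # Niigata Airport (KIJ)

meters = vincenty_distance_m(*amm, *kij)
print(f"{meters / 1609.344:.0f} miles")  # should land very close to the 5550-mile figure above
```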

Haversine formula
  • 5537.831 miles
  • 8912.275 kilometers
  • 4812.244 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
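The haversine calculation is much shorter. Below is a minimal sketch, assuming the commonly used mean Earth radius of 6371 km (the site does not state which radius it uses, and the chosen value shifts the result slightly).

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# AMM -> KIJ with the same coordinates as above; the spherical result is slightly
# shorter than the ellipsoidal one and should land near the ~8912 km figure quoted above.
print(round(haversine_km(31.7225, 35.9931, 37.9558, 139.1208)))
```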

How long does it take to fly from Amman to Niigata?

The estimated flight time from Queen Alia International Airport to Niigata Airport is about 11 hours.
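The site does not publish its flight-time formula. A common rule of thumb is distance divided by an assumed average block speed, plus a fixed allowance for taxi, climb and descent. The sketch below uses hypothetical values (805 km/h cruise, 30-minute allowance) purely to illustrate the shape of such an estimate; it will not reproduce the 11-hour figure exactly.

```python
def estimate_flight_time(distance_km, cruise_kmh=805, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent allowance.

    cruise_kmh and overhead_min are illustrative assumptions, not the
    calculator's published methodology.
    """
    total_min = distance_km / cruise_kmh * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes:02d} min"

print(estimate_flight_time(8931))  # a ballpark figure in the same range as the 11-hour estimate above
```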

Flight carbon footprint between Queen Alia International Airport (AMM) and Niigata Airport (KIJ)

On average, flying from Amman to Niigata generates about 656 kg of CO2 per passenger, which is equivalent to roughly 1,447 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
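Both numbers can be checked against the page's own figures: the pound value is the kilogram value times 2.20462, and dividing the per-passenger total by the flight distance gives the implied emission intensity. A minimal sketch:

```python
KG_TO_LB = 2.20462           # pounds per kilogram

co2_kg = 656                 # per-passenger estimate quoted above
distance_km = 8931           # Vincenty distance quoted above

co2_lb = co2_kg * KG_TO_LB               # ≈ 1446 lb; the quoted 1,447 lb suggests the
                                         # underlying kg value is slightly above 656 before rounding
intensity = co2_kg / distance_km * 1000  # ≈ 73 g CO2 per passenger-kilometre

print(f"{co2_kg} kg ≈ {co2_lb:.0f} lb, ≈ {intensity:.0f} g CO2 per passenger-km")
```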

Map of flight path and driving directions from Amman to Niigata

See the map of the shortest flight path between Queen Alia International Airport (AMM) and Niigata Airport (KIJ).

Airport information

Origin: Queen Alia International Airport
City: Amman
Country: Jordan
IATA Code: AMM
ICAO Code: OJAI
Coordinates: 31°43′21″N, 35°59′35″E
Destination: Niigata Airport
City: Niigata
Country: Japan
IATA Code: KIJ
ICAO Code: RJSN
Coordinates: 37°57′21″N, 139°7′15″E