
How far is Masuda from Riyadh?

The distance between Riyadh (King Khalid International Airport) and Masuda (Iwami Airport) is 5005 miles / 8055 kilometers / 4350 nautical miles.

The driving distance from Riyadh (RUH) to Masuda (IWJ) is 6608 miles / 10634 kilometers, and travel time by car is about 130 hours 14 minutes.

King Khalid International Airport – Iwami Airport

  • 5005 miles
  • 8055 kilometers
  • 4350 nautical miles


Distance from Riyadh to Masuda

There are several ways to calculate the distance from Riyadh to Masuda. Here are two standard methods:

Vincenty's formula (applied above)
  • 5005.318 miles
  • 8055.278 kilometers
  • 4349.502 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
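A minimal Python sketch of Vincenty's inverse method is below (not the calculator's actual code). It assumes the WGS-84 ellipsoid, which the page does not state explicitly, and uses the airport coordinates listed in the airport information section.

```python
import math

# WGS-84 ellipsoid parameters (assumed; the page does not name its datum)
A_AXIS = 6378137.0             # semi-major axis, metres
F = 1 / 298.257223563          # flattening
B_AXIS = (1 - F) * A_AXIS      # semi-minor axis, metres

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Vincenty's inverse formula: geodesic distance in metres on the ellipsoid.

    Note: the iteration can fail to converge for nearly antipodal points;
    that case does not arise for this route.
    """
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sigma_m ** 2)))
    return B_AXIS * A * (sigma - delta_sigma)

# RUH (24°57′27″N, 46°41′55″E) to IWJ (34°40′35″N, 131°47′23″E)
dist_km = vincenty_distance_m(24.9575, 46.698611, 34.676389, 131.789722) / 1000
print(f"{dist_km:.3f} km")  # ≈ 8055 km, matching the figure above
```

The iteration over λ is what distinguishes Vincenty from closed-form spherical methods: it refines the geodesic azimuth until the longitude difference stabilizes.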

Haversine formula
  • 4996.161 miles
  • 8040.542 kilometers
  • 4341.546 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
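The haversine formula is compact enough to sketch in a few lines of Python (again, not the calculator's actual code). It assumes a mean Earth radius of 6371 km, a common choice the page does not state.

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius (assumed; a common convention)

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# RUH (24°57′27″N, 46°41′55″E) to IWJ (34°40′35″N, 131°47′23″E)
print(f"{haversine_km(24.9575, 46.698611, 34.676389, 131.789722):.1f} km")  # ≈ 8040 km
```

The ~15 km gap between this result and Vincenty's reflects the spherical approximation: the Earth is flattened at the poles, so routes with a large north–south component pick up a small error on a sphere.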

How long does it take to fly from Riyadh to Masuda?

The estimated flight time from King Khalid International Airport to Iwami Airport is 9 hours and 58 minutes.
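The page does not publish its flight-time model, but a common rule of thumb divides the great-circle distance by an assumed average block speed of about 500 mph. A hypothetical sketch under that assumption:

```python
# Assumed values: the 500 mph block speed is a rule of thumb,
# not the calculator's actual parameter.
distance_miles = 5005
avg_speed_mph = 500

hours_float = distance_miles / avg_speed_mph
hours = int(hours_float)
minutes = round((hours_float - hours) * 60)
print(f"{hours} h {minutes} min")  # roughly 10 h, close to the quoted 9 h 58 min
```

The small difference from the quoted 9 hours 58 minutes suggests the calculator assumes a slightly higher average speed or adds a fixed taxi/climb allowance.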

Flight carbon footprint between King Khalid International Airport (RUH) and Iwami Airport (IWJ)

On average, flying from Riyadh to Masuda generates about 585 kg of CO2 per passenger, which is roughly 1,289 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
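The kilogram-to-pound conversion and the implied per-mile emission rate can be checked directly; this is a small sketch using the page's own figures:

```python
KG_PER_LB = 0.45359237      # exact definition of the avoirdupois pound
co2_kg = 585                # page's per-passenger estimate for this route
distance_miles = 5005       # page's Vincenty distance, rounded

co2_lb = co2_kg / KG_PER_LB
per_mile = co2_kg / distance_miles
print(f"{co2_lb:.0f} lb total, {per_mile:.3f} kg CO2 per passenger-mile")
```

Dividing 585 kg by 5005 miles gives roughly 0.117 kg of CO2 per passenger-mile, a plausible figure for a long-haul economy seat.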

Map of flight path and driving directions from Riyadh to Masuda

See the map of the shortest flight path between King Khalid International Airport (RUH) and Iwami Airport (IWJ).

Airport information

Origin: King Khalid International Airport
City: Riyadh
Country: Saudi Arabia
IATA Code: RUH
ICAO Code: OERK
Coordinates: 24°57′27″N, 46°41′55″E
Destination: Iwami Airport
City: Masuda
Country: Japan
IATA Code: IWJ
ICAO Code: RJOW
Coordinates: 34°40′35″N, 131°47′23″E
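Both distance formulas take decimal degrees, while the coordinates above are listed in degrees, minutes, and seconds. A small helper (hypothetical, for illustration) converts between the two:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# RUH: 24°57′27″N, 46°41′55″E
print(dms_to_decimal(24, 57, 27, "N"))  # ≈ 24.9575
print(dms_to_decimal(46, 41, 55, "E"))  # ≈ 46.6986
```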