
How far is Yonago from Senai?

The distance between Senai (Senai International Airport) and Yonago (Miho-Yonago Airport) is 3004 miles / 4835 kilometers / 2611 nautical miles.

The driving distance from Senai (JHB) to Yonago (YGJ) is 5064 miles / 8150 kilometers, and travel time by car is about 98 hours 56 minutes.

Senai International Airport – Miho-Yonago Airport

3004 miles / 4835 kilometers / 2611 nautical miles


Distance from Senai to Yonago

There are several ways to calculate the distance from Senai to Yonago. Here are two standard methods:

Vincenty's formula (applied above)
  • 3004.430 miles
  • 4835.162 kilometers
  • 2610.778 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
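A minimal sketch of the standard Vincenty inverse solution on the WGS-84 ellipsoid (a = 6378137 m, f = 1/298.257223563), using the JHB and YGJ coordinates from the airport table below. This is an illustration of the general method, not this site's exact implementation:

```python
import math

def vincenty_m(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula: geodesic distance in metres on WGS-84."""
    a = 6378137.0                    # semi-major axis (m)
    f = 1 / 298.257223563            # flattening
    b = (1 - f) * a                  # semi-minor axis (m)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):             # iterate lambda until convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break
    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
        * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# JHB (1°38′28″N, 103°40′11″E) to YGJ (35°29′31″N, 133°14′9″E)
dist_km = vincenty_m(1.641111, 103.669722, 35.491944, 133.235833) / 1000
```

For this airport pair the result is approximately 4835 km, matching the figure above.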

Haversine formula
  • 3010.499 miles
  • 4844.929 kilometers
  • 2616.052 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
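The haversine calculation above can be sketched in a few lines, assuming a mean Earth radius of 6371 km (the radius choice slightly affects the result):

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    km = radius_km * 2 * math.asin(math.sqrt(a))
    # Return miles, kilometers, and nautical miles.
    return km / 1.609344, km, km / 1.852

# JHB and YGJ coordinates from the airport table below
mi, km, nm = haversine(1.641111, 103.669722, 35.491944, 133.235833)
```

This reproduces the ~4845 km figure above, slightly longer than the Vincenty result because a sphere only approximates the Earth's ellipsoidal shape.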

How long does it take to fly from Senai to Yonago?

The estimated flight time from Senai International Airport to Miho-Yonago Airport is 6 hours and 11 minutes.
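The page does not state how it estimates flight time. One simple model is distance divided by an assumed average block speed; an effective speed of roughly 486 mph (back-calculated from the 6 h 11 min figure, not a published constant) reproduces the estimate:

```python
# Rough flight-time model: distance / assumed average speed.
# 486 mph is an assumption chosen to match the page's estimate;
# real block times also depend on winds, routing, taxi, and climb.
distance_miles = 3004.430
avg_speed_mph = 486.0

hours_float = distance_miles / avg_speed_mph
hours = int(hours_float)
minutes = round((hours_float - hours) * 60)  # 6 h 11 min
```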

Flight carbon footprint between Senai International Airport (JHB) and Miho-Yonago Airport (YGJ)

On average, flying from Senai to Yonago generates about 335 kg (738 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
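The kilogram-to-pound conversion behind those numbers uses the exact definition of the pound (1 lb = 0.45359237 kg):

```python
KG_PER_LB = 0.45359237          # exact definition of the avoirdupois pound
co2_kg = 335
co2_lbs = co2_kg / KG_PER_LB    # ~738.5, shown as 738 above
```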

Map of flight path and driving directions from Senai to Yonago

See the map of the shortest flight path between Senai International Airport (JHB) and Miho-Yonago Airport (YGJ).

Airport information

Origin: Senai International Airport
City: Senai
Country: Malaysia
IATA Code: JHB
ICAO Code: WMKJ
Coordinates: 1°38′28″N, 103°40′11″E
Destination: Miho-Yonago Airport
City: Yonago
Country: Japan
IATA Code: YGJ
ICAO Code: RJOH
Coordinates: 35°29′31″N, 133°14′9″E