
How far is Yonago from Springfield, IL?

The distance between Springfield (Springfield Abraham Lincoln Capital Airport) and Yonago (Miho-Yonago Airport) is 6573 miles / 10579 kilometers / 5712 nautical miles.

Springfield Abraham Lincoln Capital Airport – Miho-Yonago Airport

Distance: 6573 miles / 10579 kilometers / 5712 nautical miles


Distance from Springfield to Yonago

There are several ways to calculate the distance from Springfield to Yonago. Here are two standard methods:

Vincenty's formula (applied above)
  • 6573.175 miles
  • 10578.500 kilometers
  • 5711.933 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
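
For reference, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It is a plain transcription of the published iterative method, not this calculator's actual implementation, so treat the details as illustrative (and note that the iteration can fail to converge for nearly antipodal points).

    import math

    def vincenty_distance_km(lat1, lon1, lat2, lon2):
        """Distance between two lat/lon points via Vincenty's inverse formula on WGS-84, in km."""
        a = 6378137.0              # WGS-84 semi-major axis (metres)
        f = 1 / 298.257223563      # WGS-84 flattening
        b = (1 - f) * a            # semi-minor axis

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):       # iterate until the longitude on the auxiliary sphere converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam, cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0         # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos_sq_alpha = 1 - sin_alpha ** 2
            if cos_sq_alpha != 0:
                cos2sm = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
            else:
                cos2sm = 0.0       # both points on the equator
            C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
        B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
        d_sigma = B * sin_sigma * (cos2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos2sm ** 2)
            - B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
        return b * A * (sigma - d_sigma) / 1000.0

Fed with the decimal form of the coordinates listed under "Airport information" below (roughly 39.8439, -89.6778 for SPI and 35.4919, 133.2358 for YGJ), this should return a value very close to the 10,578.5 kilometers quoted above.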

Haversine formula
  • 6558.671 miles
  • 10555.158 kilometers
  • 5699.329 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth; the result is the great-circle distance, the shortest path between the two points along the sphere's surface.
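
A matching haversine sketch in Python is below. The 6,371 km mean Earth radius is a common convention; a slightly different radius will shift the result by a few kilometers.

    import math

    def haversine_distance_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two lat/lon points on a sphere, in kilometers."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        # haversine of the central angle between the two points
        h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        return 2 * radius_km * math.asin(math.sqrt(h))

With the same coordinates as above this lands near the ~10,555 kilometers quoted for this route, a little short of the ellipsoidal result.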

How long does it take to fly from Springfield to Yonago?

The estimated flight time from Springfield Abraham Lincoln Capital Airport to Miho-Yonago Airport is 12 hours and 56 minutes.
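
Estimates like this are typically a fixed taxi/climb/descent allowance plus cruise time over the great-circle distance. The Python sketch below assumes a 500 mph average cruise speed and a 30-minute overhead; the calculator's exact parameters aren't published, so it only approximates the 12 hours 56 minutes figure.

    def estimate_flight_time(distance_miles, cruise_speed_mph=500.0, overhead_hours=0.5):
        """Rough block-time estimate: assumed fixed overhead plus cruise at an assumed average speed."""
        hours = overhead_hours + distance_miles / cruise_speed_mph
        whole_hours = int(hours)
        minutes = round((hours - whole_hours) * 60)
        return whole_hours, minutes

    print(estimate_flight_time(6573))  # (13, 39) with these assumed parameters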

Flight carbon footprint between Springfield Abraham Lincoln Capital Airport (SPI) and Miho-Yonago Airport (YGJ)

On average, flying from Springfield to Yonago generates about 795 kg of CO2 per passenger, which is roughly 1,753 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
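
A common way to arrive at a figure like this is to multiply the route distance by an assumed fuel burn per passenger-kilometer and then by roughly 3.16 kg of CO2 per kilogram of jet fuel burned, a widely used combustion factor. The per-passenger fuel burn in the Python sketch below is purely illustrative (chosen to land near 795 kg) and is not this calculator's published methodology.

    KG_CO2_PER_KG_JET_FUEL = 3.16   # widely used combustion factor for jet fuel
    LBS_PER_KG = 2.20462

    def estimate_co2_per_passenger(distance_km, fuel_kg_per_pax_km=0.0238):
        """Per-passenger CO2 estimate from an assumed per-passenger-km fuel burn."""
        co2_kg = distance_km * fuel_kg_per_pax_km * KG_CO2_PER_KG_JET_FUEL
        return co2_kg, co2_kg * LBS_PER_KG

    co2_kg, co2_lbs = estimate_co2_per_passenger(10578.5)
    print(round(co2_kg), round(co2_lbs))  # close to the ~795 kg / ~1,753 lbs quoted above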

Map of flight path from Springfield to Yonago

See the map of the shortest flight path between Springfield Abraham Lincoln Capital Airport (SPI) and Miho-Yonago Airport (YGJ).

Airport information

Origin: Springfield Abraham Lincoln Capital Airport
City: Springfield, IL
Country: United States
IATA Code: SPI
ICAO Code: KSPI
Coordinates: 39°50′38″N, 89°40′40″W
Destination: Miho-Yonago Airport
City: Yonago
Country: Japan
IATA Code: YGJ
ICAO Code: RJOH
Coordinates: 35°29′31″N, 133°14′9″E
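
To plug these coordinates into the distance formulas sketched earlier, they first need converting from degrees/minutes/seconds to signed decimal degrees. A small helper in Python, assuming the usual north/east-positive sign convention:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
        sign = -1.0 if hemisphere in ("S", "W") else 1.0
        return sign * (degrees + minutes / 60 + seconds / 3600)

    spi_lat = dms_to_decimal(39, 50, 38, "N")    # ≈ 39.8439
    spi_lon = dms_to_decimal(89, 40, 40, "W")    # ≈ -89.6778
    ygj_lat = dms_to_decimal(35, 29, 31, "N")    # ≈ 35.4919
    ygj_lon = dms_to_decimal(133, 14, 9, "E")    # ≈ 133.2358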