
How far is Ordos from Osaka?

The distance between Osaka (Kansai International Airport) and Ordos (Ordos Ejin Horo Airport) is 1442 miles / 2321 kilometers / 1253 nautical miles.

The driving distance from Osaka (KIX) to Ordos (DSN) is 2025 miles / 3259 kilometers, and travel time by car is about 41 hours 32 minutes.

Kansai International Airport – Ordos Ejin Horo Airport

1442 miles / 2321 kilometers / 1253 nautical miles


Distance from Osaka to Ordos

There are several ways to calculate the distance from Osaka to Ordos. Here are two standard methods:

Vincenty's formula (applied above)
  • 1442.108 miles
  • 2320.847 kilometers
  • 1253.157 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
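To make the method concrete, here is a minimal pure-Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The ellipsoid constants and convergence tolerance are conventional choices rather than anything stated on this page, and the decimal-degree coordinates are converted from the DMS values in the airport information below.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid (result in statute miles)."""
    a = 6378137.0                     # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563             # WGS-84 flattening
    b = (1 - f) * a                   # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m +
                                     C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344          # metres to statute miles

# KIX and DSN coordinates from the airport information below, in decimal degrees
print(round(vincenty_miles(34.4272, 135.2439, 39.4900, 109.8614), 1))  # roughly 1442 miles
```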

Haversine formula
  • 1439.117 miles
  • 2316.034 kilometers
  • 1250.558 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
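For comparison, here is a short sketch of the haversine formula. It assumes a mean Earth radius of 3,958.8 miles (6,371 km); a different assumed radius shifts the result slightly.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance on a sphere with an assumed mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_miles * math.asin(math.sqrt(a))

# KIX and DSN coordinates from the airport information below, in decimal degrees
print(round(haversine_miles(34.4272, 135.2439, 39.4900, 109.8614), 1))  # roughly 1439 miles
```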

How long does it take to fly from Osaka to Ordos?

The estimated flight time from Kansai International Airport to Ordos Ejin Horo Airport is 3 hours and 13 minutes.
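The page does not say how that figure is derived. A common rule of thumb estimates flight time from the great-circle distance, an assumed average speed, and a fixed allowance for taxi, climb and descent; the sketch below uses assumed constants of 500 mph and 30 minutes, which lands in the same ballpark as, but not exactly on, the 3 hours 13 minutes above.

```python
# Rough flight-time estimate; the speed and overhead values are assumptions.
distance_miles = 1442
cruise_mph = 500                 # assumed average block speed
overhead_min = 30                # assumed allowance for taxi, climb and descent
total_min = overhead_min + distance_miles / cruise_mph * 60
hours, minutes = divmod(round(total_min), 60)
print(f"about {hours} h {minutes} min")  # about 3 h 23 min with these assumed constants
```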

Flight carbon footprint between Kansai International Airport (KIX) and Ordos Ejin Horo Airport (DSN)

On average, flying from Osaka to Ordos generates about 176 kg (388 pounds) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.
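The pound figure is a straight unit conversion; dividing the estimate by the flight distance also shows the implied per-kilometre factor. A quick sketch using only the numbers quoted above:

```python
co2_kg = 176                          # per-passenger estimate quoted above
distance_km = 2321
co2_lb = co2_kg * 2.20462             # kilograms to pounds
factor = co2_kg / distance_km         # implied emission factor
print(round(co2_lb), "lb")            # about 388 lb
print(round(factor, 3), "kg CO2 per passenger-km")  # about 0.076
```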

Map of flight path and driving directions from Osaka to Ordos

See the map of the shortest flight path between Kansai International Airport (KIX) and Ordos Ejin Horo Airport (DSN).

Airport information

Origin: Kansai International Airport
City: Osaka
Country: Japan
IATA Code: KIX
ICAO Code: RJBB
Coordinates: 34°25′38″N, 135°14′38″E
Destination: Ordos Ejin Horo Airport
City: Ordos
Country: China
IATA Code: DSN
ICAO Code: ZBDS
Coordinates: 39°29′24″N, 109°51′41″E
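The coordinates above are given in degrees, minutes and seconds, while the distance sketches earlier use decimal degrees. A small helper for the conversion, shown with the KIX values:

```python
def dms_to_decimal(degrees, minutes, seconds):
    # Convert degrees/minutes/seconds to decimal degrees (north/east positive).
    return degrees + minutes / 60 + seconds / 3600

print(round(dms_to_decimal(34, 25, 38), 4))   # KIX latitude  -> 34.4272
print(round(dms_to_decimal(135, 14, 38), 4))  # KIX longitude -> 135.2439
```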