
How far is Baoshan from Osaka?

The distance between Osaka (Kansai International Airport) and Baoshan (Baoshan Yunrui Airport) is 2249 miles / 3620 kilometers / 1955 nautical miles.

The driving distance from Osaka (KIX) to Baoshan (BSD) is 3362 miles / 5411 kilometers, and travel time by car is about 65 hours 37 minutes.

Kansai International Airport – Baoshan Yunrui Airport

2249 miles / 3620 kilometers / 1955 nautical miles


Distance from Osaka to Baoshan

There are several ways to calculate the distance from Osaka to Baoshan. Here are two standard methods:

Vincenty's formula (applied above)
  • 2249.225 miles
  • 3619.776 kilometers
  • 1954.523 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
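
As a rough illustration (not the calculator's own code), an ellipsoidal distance very close to the Vincenty figure can be reproduced with geopy's geodesic(), which uses a WGS-84 ellipsoid and Karney's algorithm rather than Vincenty's original iteration; the decimal coordinates are converted from the airport table below.

```python
# A minimal sketch, assuming geopy is installed (pip install geopy).
# geodesic() uses an ellipsoidal WGS-84 model, which agrees with Vincenty's
# formula to well under a metre over a distance like this.
from geopy.distance import geodesic

# Airport coordinates from the table below, converted to decimal degrees.
KIX = (34.42722, 135.24389)   # Kansai International Airport
BSD = (25.05306, 99.16806)    # Baoshan Yunrui Airport

d = geodesic(KIX, BSD)
print(f"{d.miles:.1f} mi / {d.km:.1f} km / {d.nm:.1f} NM")
# Expected to land close to the figures above: ~2249 mi / ~3620 km / ~1955 NM
```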

Haversine formula
  • 2245.801 miles
  • 3614.266 kilometers
  • 1951.548 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
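
The haversine calculation is simple enough to sketch directly. The snippet below assumes a mean Earth radius of 6371 km and the same decimal-degree coordinates as above; the spherical model is why it comes out a few kilometers short of the ellipsoidal figure.

```python
# A minimal pure-Python haversine sketch (great-circle distance on a sphere).
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance between two lat/lon points, in kilometres."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * r * asin(sqrt(a))

km = haversine_km(34.42722, 135.24389, 25.05306, 99.16806)  # KIX -> BSD
print(f"{km:.1f} km = {km * 0.621371:.1f} mi = {km / 1.852:.1f} NM")
# Roughly 3614 km / 2246 mi / 1952 NM, matching the haversine figures above
```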

How long does it take to fly from Osaka to Baoshan?

The estimated flight time from Kansai International Airport to Baoshan Yunrui Airport is 4 hours and 45 minutes.
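
The page does not state how this estimate is modelled. One plausible sketch simply divides the distance by an assumed average block (gate-to-gate) speed; the ~475 mph used below is chosen only because it lands near the quoted 4 hours 45 minutes and is not taken from the calculator itself.

```python
# A hedged sketch: assumes an average block speed of ~475 mph, which
# reproduces roughly 4 h 45 min for a 2249-mile flight.
def flight_time(distance_miles, block_speed_mph=475):
    hours = distance_miles / block_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} h {m:02d} min"

print(flight_time(2249.225))  # -> about "4 h 44 min"
```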

Flight carbon footprint between Kansai International Airport (KIX) and Baoshan Yunrui Airport (BSD)

On average, flying from Osaka to Baoshan generates about 246 kg (542 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
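
The emission model is not specified either. The sketch below simply applies the per-passenger-kilometre factor implied by the page's own numbers (about 68 g of CO2 per passenger-kilometre, back-computed from 246 kg over 3620 km), so treat the factor as illustrative rather than authoritative.

```python
# A rough sketch only: the emission factor is back-computed from the figures
# quoted above, not taken from any published methodology.
def co2_per_passenger_kg(distance_km, grams_per_pax_km=68.0):
    return distance_km * grams_per_pax_km / 1000.0

kg = co2_per_passenger_kg(3619.776)
print(f"{kg:.0f} kg CO2 = {kg * 2.20462:.0f} lbs")  # roughly 246 kg / 542 lbs
```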

Map of flight path and driving directions from Osaka to Baoshan

See the map of the shortest flight path between Kansai International Airport (KIX) and Baoshan Yunrui Airport (BSD).

Airport information

Origin: Kansai International Airport
City: Osaka
Country: Japan
IATA Code: KIX
ICAO Code: RJBB
Coordinates: 34°25′38″N, 135°14′38″E
Destination: Baoshan Yunrui Airport
City: Baoshan
Country: China
IATA Code: BSD
ICAO Code: ZPBS
Coordinates: 25°3′11″N, 99°10′5″E
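
For completeness, the coordinates above are listed in degrees, minutes and seconds; the small (hypothetical) helper below converts them into the decimal degrees used in the distance examples earlier.

```python
# DMS-to-decimal conversion for the coordinates listed above.
def dms_to_decimal(degrees, minutes, seconds, hemisphere="N"):
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

kix = (dms_to_decimal(34, 25, 38, "N"), dms_to_decimal(135, 14, 38, "E"))
bsd = (dms_to_decimal(25, 3, 11, "N"), dms_to_decimal(99, 10, 5, "E"))
print(kix, bsd)  # ≈ (34.4272, 135.2439) and (25.0531, 99.1681)
```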