
How far is Baishan from Kitadaitōjima?

The distance between Kitadaitōjima (Kitadaito Airport) and Baishan (Changbaishan Airport) is 1131 miles / 1821 kilometers / 983 nautical miles.

The driving distance from Kitadaitōjima (KTD) to Baishan (NBS) is 1757 miles / 2827 kilometers, and travel time by car is about 236 hours 44 minutes.

Kitadaito Airport – Changbaishan Airport

1131 miles / 1821 kilometers / 983 nautical miles


Distance from Kitadaitōjima to Baishan

There are several ways to calculate the distance from Kitadaitōjima to Baishan. Here are two standard methods:

Vincenty's formula (applied above)
  • 1131.301 miles
  • 1820.652 kilometers
  • 983.074 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
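
For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, fed with the airport coordinates listed in the Airport information section. The function name, tolerance and iteration cap are illustrative choices, not part of the calculator itself.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    # WGS-84 ellipsoid parameters
    a = 6378137.0                   # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):       # may not converge for near-antipodal points
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0              # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                       if cos_sq_alpha else 0.0)   # equatorial geodesic
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos2sigma_m + C * cos_sigma * (-1 + 2 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (
        cos2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos2sigma_m ** 2)
            - B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344        # metres -> statute miles

# KTD (25°56′40″N, 131°19′37″E) and NBS (42°4′0″N, 127°36′7″E)
ktd = (25 + 56 / 60 + 40 / 3600, 131 + 19 / 60 + 37 / 3600)
nbs = (42 + 4 / 60, 127 + 36 / 60 + 7 / 3600)
print(round(vincenty_miles(*ktd, *nbs), 3))   # ≈ 1131.3 miles, matching the figure above
```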

Haversine formula
  • 1133.870 miles
  • 1824.787 kilometers
  • 985.306 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
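
A corresponding sketch of the haversine formula, assuming a mean Earth radius of 6371 km (the exact radius used by the calculator is not stated):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Great-circle distance on a sphere of the given mean Earth radius.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    km = 2 * radius_km * math.asin(math.sqrt(a))
    return km / 1.609344            # kilometres -> statute miles

# KTD -> NBS, using the airport coordinates listed below (decimal degrees)
print(round(haversine_miles(25.9444, 131.3269, 42.0667, 127.6019), 3))  # ≈ 1133.9 miles
```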

How long does it take to fly from Kitadaitōjima to Baishan?

The estimated flight time from Kitadaito Airport to Changbaishan Airport is 2 hours and 38 minutes.
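
The calculator does not publish its flight-time model, but a common rule of thumb is a fixed allowance for taxi, climb and descent plus the great-circle distance flown at an assumed cruise speed. The sketch below uses an assumed cruise speed of about 530 mph and a 30-minute allowance purely for illustration; with those assumptions it happens to land on roughly the same figure.

```python
def estimated_flight_time(distance_miles, cruise_mph=530, overhead_min=30):
    """Rough block-time estimate: constant cruise speed plus a fixed
    allowance for taxi, climb and descent (both figures are assumptions)."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

print(estimated_flight_time(1131))   # "2 hours 38 minutes" under these assumptions
```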

Flight carbon footprint between Kitadaito Airport (KTD) and Changbaishan Airport (NBS)

On average, flying from Kitadaitōjima to Baishan generates about 158 kg of CO2 per passenger, equivalent to roughly 349 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
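
The emissions model behind this figure is not published. The sketch below simply applies the per-passenger emissions factor implied by the numbers above (about 158 kg over 1821 km, roughly 87 g of CO2 per passenger-kilometre) and converts kilograms to pounds; because the site presumably converts before rounding the kilogram figure, the last digit of the pound value can differ slightly from the published 349 lbs.

```python
KG_PER_LB = 0.45359237              # kilograms per avoirdupois pound (exact)

def co2_per_passenger_kg(distance_km, kg_per_pax_km=158 / 1821):
    """Per-passenger CO2 from jet-fuel burn only. The emissions factor is the
    one implied by the figures above; real factors vary with aircraft type,
    load factor and routing."""
    return distance_km * kg_per_pax_km

kg = co2_per_passenger_kg(1821)
print(round(kg), "kg ≈", round(kg / KG_PER_LB), "lbs")
# 158 kg ≈ 348 lbs here; the page shows 349 lbs, likely converted from an unrounded value
```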

Map of flight path and driving directions from Kitadaitōjima to Baishan

See the map of the shortest flight path between Kitadaito Airport (KTD) and Changbaishan Airport (NBS).

Airport information

Origin: Kitadaito Airport
City: Kitadaitōjima
Country: Japan
IATA Code: KTD
ICAO Code: RORK
Coordinates: 25°56′40″N, 131°19′37″E
Destination: Changbaishan Airport
City: Baishan
Country: China
IATA Code: NBS
ICAO Code: ZYBS
Coordinates: 42°4′0″N, 127°36′7″E