
How far is Nakashibetsu from Xilinhot?

The distance between Xilinhot (Xilinhot Airport) and Nakashibetsu (Nakashibetsu Airport) is 1444 miles / 2324 kilometers / 1255 nautical miles.

The driving distance from Xilinhot (XIL) to Nakashibetsu (SHB) is 2762 miles / 4445 kilometers, and travel time by car is about 57 hours 29 minutes.

Xilinhot Airport – Nakashibetsu Airport

  • 1444 miles
  • 2324 kilometers
  • 1255 nautical miles


Distance from Xilinhot to Nakashibetsu

There are several ways to calculate the distance from Xilinhot to Nakashibetsu. Here are two standard methods:

Vincenty's formula (applied above)
  • 1443.923 miles
  • 2323.769 kilometers
  • 1254.735 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
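As a rough illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid, using the airport coordinates from the table below. It is a generic textbook implementation, not necessarily the exact code behind the figure above, and it skips edge cases such as nearly antipodal points.

```python
# Minimal sketch of Vincenty's inverse formula on the WGS-84 ellipsoid.
# Generic textbook version; not necessarily the calculator's own code.
import math

def vincenty_distance_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance in kilometers between two lat/lon points."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0  # meters -> kilometers

# XIL: 43°54′56″N 115°57′50″E; SHB: 43°34′38″N 144°57′36″E
print(vincenty_distance_km(43.91556, 115.96389, 43.57722, 144.96000))
# ≈ 2323.8 km, matching the Vincenty figure quoted above
```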

Haversine formula
  • 1440.001 miles
  • 2317.456 kilometers
  • 1251.326 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
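For comparison, a minimal sketch of the haversine formula, assuming a mean earth radius of 6371 km (the exact radius the calculator uses is not stated):

```python
# Minimal sketch of the great-circle (haversine) distance on a sphere.
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

print(haversine_km(43.91556, 115.96389, 43.57722, 144.96000))
# ≈ 2317.5 km, matching the haversine figure quoted above
```

The two results differ by about 6 km because the ellipsoidal model accounts for the earth's flattening while the spherical model does not.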

How long does it take to fly from Xilinhot to Nakashibetsu?

The estimated flight time from Xilinhot Airport to Nakashibetsu Airport is 3 hours and 14 minutes.
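The calculator's exact flight-time model is not published. A common back-of-the-envelope approach, sketched below with an assumed 500 mph cruise speed and a 30-minute takeoff/landing overhead, lands in the same ballpark as the quoted 3 hours 14 minutes; both parameters are assumptions for illustration only.

```python
# Rough flight-time estimate from great-circle distance.
# Cruise speed and overhead are illustrative assumptions, not the
# calculator's actual parameters.
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
    hours = distance_miles / cruise_mph + overhead_hours
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} hours {m} minutes"

print(estimate_flight_time(1444))  # ≈ 3 hours 23 minutes with these assumptions
```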

Flight carbon footprint between Xilinhot Airport (XIL) and Nakashibetsu Airport (SHB)

On average, flying from Xilinhot to Nakashibetsu generates about 176 kg of CO2 per passenger, which is equivalent to about 388 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
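The arithmetic behind the quoted figures can be checked directly. The per-kilometre rate below assumes a simple linear emissions model, which may not match the calculator's actual methodology:

```python
# Check the kg-to-lbs conversion and the implied per-km emission rate.
# The linear per-km model is an assumption for illustration only.
KG_PER_LB = 0.45359237

co2_kg = 176.0
distance_km = 2324.0

print(round(co2_kg / KG_PER_LB))           # ≈ 388 lbs, as quoted above
print(round(co2_kg / distance_km * 1000))  # ≈ 76 g CO2 per passenger-km
```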

Map of flight path and driving directions from Xilinhot to Nakashibetsu

See the map of the shortest flight path between Xilinhot Airport (XIL) and Nakashibetsu Airport (SHB).

Airport information

Origin: Xilinhot Airport
City: Xilinhot
Country: China
IATA Code: XIL
ICAO Code: ZBXH
Coordinates: 43°54′56″N, 115°57′50″E
Destination: Nakashibetsu Airport
City: Nakashibetsu
Country: Japan
IATA Code: SHB
ICAO Code: RJCN
Coordinates: 43°34′38″N, 144°57′36″E