How far is Nakashibetsu from Wuhai?
The distance between Wuhai (Wuhai Airport) and Nakashibetsu (Nakashibetsu Airport) is 1974 miles / 3177 kilometers / 1715 nautical miles.
The driving distance from Wuhai (WUA) to Nakashibetsu (SHB) is 3290 miles / 5295 kilometers, and travel time by car is about 66 hours 56 minutes.
Wuhai Airport – Nakashibetsu Airport
Distance from Wuhai to Nakashibetsu
There are several ways to calculate the distance from Wuhai to Nakashibetsu. Here are two standard methods:
Vincenty's formula (applied above)
- 1973.824 miles
- 3176.562 kilometers
- 1715.206 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
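The iterative inverse Vincenty method can be sketched as follows. This is a standard textbook implementation on the WGS-84 ellipsoid (an assumption about which ellipsoid is used here), not necessarily the exact code behind the figure above:

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_km(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in kilometers."""
    a, f = 6378137.0, 1 / 298.257223563      # WGS-84 semi-major axis (m) and flattening
    b = (1 - f) * a
    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(200):                     # iterate lambda until convergence
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# WUA (39°47′36″N, 106°47′57″E) to SHB (43°34′38″N, 144°57′36″E), in decimal degrees
print(round(vincenty_km(39.79333, 106.79917, 43.57722, 144.96), 1))  # close to 3176.6 km
```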
Haversine formula
- 1968.830 miles
- 3168.525 kilometers
- 1710.867 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
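The haversine calculation is compact enough to show in full. This is a minimal sketch using the airport coordinates from the table below, assuming a mean earth radius of 6371 km (the site's exact radius constant may differ slightly):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# Convert the airports' degree-minute-second coordinates to decimal degrees
wua = (39 + 47/60 + 36/3600, 106 + 47/60 + 57/3600)  # 39°47′36″N, 106°47′57″E
shb = (43 + 34/60 + 38/3600, 144 + 57/60 + 36/3600)  # 43°34′38″N, 144°57′36″E
print(round(haversine_km(*wua, *shb), 1))  # close to 3168.5 km
```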
How long does it take to fly from Wuhai to Nakashibetsu?
The estimated flight time from Wuhai Airport to Nakashibetsu Airport is 4 hours and 14 minutes.
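A common rough rule of thumb for such estimates is a fixed taxi/climb/descent overhead plus cruise time at an average speed. The 500 mph speed and 30-minute overhead below are assumptions for illustration, not this page's formula, so the result lands near but not exactly on the 4 h 14 min figure above:

```python
def rough_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Crude flight-time estimate: fixed taxi/climb overhead plus cruise time.
    The 500 mph and 30 min values are illustrative assumptions."""
    return overhead_min + distance_miles / cruise_mph * 60

m = rough_flight_minutes(1974)
print(f"{int(m // 60)} h {round(m % 60)} min")  # 4 h 27 min under these assumptions
```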
What is the time difference between Wuhai and Nakashibetsu?
The time difference between Wuhai and Nakashibetsu is 1 hour: Nakashibetsu (Japan Standard Time, UTC+9) is 1 hour ahead of Wuhai (China Standard Time, UTC+8).
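The offset can be checked with Python's standard `zoneinfo` module. Since neither China nor Japan observes daylight saving time, the gap is constant year-round:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+, needs an available tz database

now = datetime.now(timezone.utc)
wuhai = now.astimezone(ZoneInfo("Asia/Shanghai"))      # China Standard Time, UTC+8
nakashibetsu = now.astimezone(ZoneInfo("Asia/Tokyo"))  # Japan Standard Time, UTC+9
diff = nakashibetsu.utcoffset() - wuhai.utcoffset()
print(diff)  # 1:00:00 — Nakashibetsu is 1 hour ahead
```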
Flight carbon footprint between Wuhai Airport (WUA) and Nakashibetsu Airport (SHB)
On average, flying from Wuhai to Nakashibetsu generates about 215 kg of CO2 per passenger (roughly 474 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
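The kilogram-to-pound conversion behind the 474 lb figure uses the standard factor 1 kg ≈ 2.20462 lb:

```python
KG_TO_LB = 2.20462  # standard kilogram-to-pound conversion factor

co2_kg = 215                 # estimated CO2 per passenger, from the text above
co2_lb = co2_kg * KG_TO_LB
print(round(co2_lb))         # 474
```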
Map of flight path and driving directions from Wuhai to Nakashibetsu
See the map of the shortest flight path between Wuhai Airport (WUA) and Nakashibetsu Airport (SHB).
Airport information
| Origin | Wuhai Airport |
|---|---|
| City | Wuhai |
| Country | China |
| IATA Code | WUA |
| ICAO Code | ZBUH |
| Coordinates | 39°47′36″N, 106°47′57″E |
| Destination | Nakashibetsu Airport |
|---|---|
| City | Nakashibetsu |
| Country | Japan |
| IATA Code | SHB |
| ICAO Code | RJCN |
| Coordinates | 43°34′38″N, 144°57′36″E |