How far is Baoshan from Niigata?
The distance between Niigata (Niigata Airport) and Baoshan (Baoshan Yunrui Airport) is 2498 miles / 4020 kilometers / 2170 nautical miles.
The driving distance from Niigata (KIJ) to Baoshan (BSD) is 3367 miles / 5418 kilometers, and travel time by car is about 75 hours 32 minutes.
Niigata Airport – Baoshan Yunrui Airport
Distance from Niigata to Baoshan
There are several ways to calculate the distance from Niigata to Baoshan. Here are two standard methods:
Vincenty's formula (applied above)
- 2497.623 miles
- 4019.534 kilometers
- 2170.375 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
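For readers who want to reproduce the figure, here is a minimal sketch of Vincenty's inverse method in Python, using the standard WGS-84 ellipsoid constants (which ellipsoid this page actually uses is an assumption):

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty on the WGS-84 ellipsoid; returns distance in statute miles."""
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (meters)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):  # iterate lambda until it converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0         # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (
        cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
                * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma) / 1609.344  # meters -> miles

# KIJ (37°57′21″N, 139°7′15″E) and BSD (25°3′11″N, 99°10′5″E), from the tables below
kij = (37 + 57/60 + 21/3600, 139 + 7/60 + 15/3600)
bsd = (25 + 3/60 + 11/3600, 99 + 10/60 + 5/3600)
print(round(vincenty_miles(*kij, *bsd), 3))  # ≈ 2497.6 miles
```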
Haversine formula
- 2494.100 miles
- 4013.864 kilometers
- 2167.313 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
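The haversine computation is only a few lines. The sketch below assumes a mean Earth radius of 3958.8 miles (about 6371 km), a common choice but not necessarily the one used for the figure above:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
    """Great-circle distance on a sphere of the given mean Earth radius (miles)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_mi * math.asin(math.sqrt(a))

# Same coordinates as above (KIJ -> BSD)
print(round(haversine_miles(37.9558, 139.1208, 25.0531, 99.1681), 1))  # ≈ 2494 miles
```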
How long does it take to fly from Niigata to Baoshan?
The estimated flight time from Niigata Airport to Baoshan Yunrui Airport is 5 hours and 13 minutes.
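Estimates like this are typically derived from the great-circle distance, an assumed average speed, and a fixed allowance for taxi, climb, and descent. The exact parameters behind the 5 hours 13 minutes figure are not stated, so the values in this sketch are illustrative assumptions only:

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb allowance.
    The 500 mph and 30 min figures are assumptions, not this page's parameters."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    h, m = divmod(round(total_min), 60)
    return f"{h} hours {m} minutes"

print(estimate_flight_time(2497.6))  # ~5 hours 30 minutes under these assumptions
```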
What is the time difference between Niigata and Baoshan?
The time difference between Niigata and Baoshan is 1 hour. Baoshan is 1 hour behind Niigata.
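Niigata is on Japan Standard Time (UTC+9) and Baoshan on China Standard Time (UTC+8), which you can verify with standard tooling:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# The same instant reads one hour earlier on the clock in Baoshan.
now_niigata = datetime.now(ZoneInfo("Asia/Tokyo"))
now_baoshan = now_niigata.astimezone(ZoneInfo("Asia/Shanghai"))
print(now_niigata.strftime("%H:%M"), "in Niigata =",
      now_baoshan.strftime("%H:%M"), "in Baoshan")
```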
Flight carbon footprint between Niigata Airport (KIJ) and Baoshan Yunrui Airport (BSD)
On average, flying from Niigata to Baoshan generates about 275 kg of CO2 per passenger, and 275 kilograms equals 606 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
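The emission model behind the estimate is not published, but the quoted numbers imply a simple linear per-kilometer factor; a minimal sketch under that assumption:

```python
# Factor implied by this page's figures (275 kg over 4020 km); published
# per-passenger-km factors vary by aircraft type and load factor.
KG_CO2_PER_PAX_KM = 275 / 4020

def flight_co2_kg(distance_km, factor=KG_CO2_PER_PAX_KM):
    """Per-passenger CO2 from jet-fuel burn, modeled as linear in distance."""
    return distance_km * factor

kg = flight_co2_kg(4020)
print(f"{kg:.0f} kg CO2 ≈ {kg * 2.20462:.0f} lbs")  # ~275 kg ≈ 606 lbs
```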
Map of flight path and driving directions from Niigata to Baoshan
See the map of the shortest flight path between Niigata Airport (KIJ) and Baoshan Yunrui Airport (BSD).
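To plot the great-circle route yourself, sample intermediate points along the geodesic. This sketch uses pyproj's Geod.npts, one common way to do it (the mapping library behind this page is unknown):

```python
from pyproj import Geod

geod = Geod(ellps="WGS84")
# Geod.npts takes lon/lat order and returns evenly spaced points
# between (but excluding) the two endpoints.
waypoints = geod.npts(139.1208, 37.9558,   # KIJ (lon, lat)
                      99.1681, 25.0531,    # BSD (lon, lat)
                      npts=20)
for lon, lat in waypoints:
    print(f"{lat:.3f}N, {lon:.3f}E")
```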
Airport information
| Origin | Niigata Airport |
| --- | --- |
| City: | Niigata |
| Country: | Japan |
| IATA Code: | KIJ |
| ICAO Code: | RJSN |
| Coordinates: | 37°57′21″N, 139°7′15″E |
| Destination | Baoshan Yunrui Airport |
| --- | --- |
| City: | Baoshan |
| Country: | China |
| IATA Code: | BSD |
| ICAO Code: | ZPBS |
| Coordinates: | 25°3′11″N, 99°10′5″E |