
How far is Wakkanai from Shangrao?

The distance between Shangrao (Shangrao Sanqingshan Airport) and Wakkanai (Wakkanai Airport) is 1754 miles / 2823 kilometers / 1524 nautical miles.

The driving distance from Shangrao (SQD) to Wakkanai (WKJ) is 3456 miles / 5562 kilometers, and travel time by car is about 69 hours 59 minutes.

Shangrao Sanqingshan Airport – Wakkanai Airport

  • 1754 miles
  • 2823 kilometers
  • 1524 nautical miles


Distance from Shangrao to Wakkanai

There are several ways to calculate the distance from Shangrao to Wakkanai. Here are two standard methods:

Vincenty's formula (applied above)
  • 1754.255 miles
  • 2823.199 kilometers
  • 1524.406 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
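As a sketch of how this works, here is a minimal implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The airport coordinates are taken from the airport information table below; the function and variable names are illustrative, not from any particular library.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# SQD and WKJ coordinates converted from the DMS values in the table below
sqd = (28 + 22/60 + 46/3600, 117 + 57/60 + 51/3600)
wkj = (45 + 24/60 + 15/3600, 141 + 48/60 + 3/3600)
km = vincenty_distance(sqd[0], sqd[1], wkj[0], wkj[1]) / 1000
print(f"{km:.3f} km")   # ≈ 2823.2 km, matching the figure above
```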

Haversine formula
  • 1753.556 miles
  • 2822.075 kilometers
  • 1523.799 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
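The haversine calculation is short enough to show in full. This sketch uses a mean Earth radius of 6371 km (a common convention, though the exact radius used by any given calculator may differ slightly):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere with the given mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# SQD and WKJ coordinates in decimal degrees (from the airport table below)
d = haversine_km(28.3794, 117.9642, 45.4042, 141.8008)
print(f"{d:.1f} km")   # ≈ 2822 km
```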

How long does it take to fly from Shangrao to Wakkanai?

The estimated flight time from Shangrao Sanqingshan Airport to Wakkanai Airport is 3 hours and 49 minutes.
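As a rough cross-check (not necessarily the formula this calculator uses), a common rule of thumb is to add about 30 minutes of taxi, climb, and descent to the cruise time at an assumed average speed of around 500 mph; for this route it lands within roughly a quarter hour of the estimate above:

```python
distance_miles = 1754   # great-circle distance from above
cruise_mph = 500        # assumed average cruise speed
overhead_min = 30       # assumed taxi/climb/descent allowance

total_min = round(distance_miles / cruise_mph * 60) + overhead_min
print(f"{total_min // 60} h {total_min % 60} min")   # → 4 h 0 min
```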

Flight carbon footprint between Shangrao Sanqingshan Airport (SQD) and Wakkanai Airport (WKJ)

On average, flying from Shangrao to Wakkanai generates about 197 kg of CO2 per passenger, which is roughly 434 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Shangrao to Wakkanai

See the map of the shortest flight path between Shangrao Sanqingshan Airport (SQD) and Wakkanai Airport (WKJ).

Airport information

Origin Shangrao Sanqingshan Airport
City: Shangrao
Country: China
IATA Code: SQD
ICAO Code: ZSSR
Coordinates: 28°22′46″N, 117°57′51″E
Destination Wakkanai Airport
City: Wakkanai
Country: Japan
IATA Code: WKJ
ICAO Code: RJCW
Coordinates: 45°24′15″N, 141°48′3″E