
How far is Saga from Shanghai?

The distance between Shanghai (Shanghai Hongqiao International Airport) and Saga (Saga Airport) is 542 miles / 873 kilometers / 471 nautical miles.

The driving distance from Shanghai (SHA) to Saga (HSG) is 1854 miles / 2983 kilometers, and travel time by car is about 37 hours 40 minutes.


Distance from Shanghai to Saga

There are several ways to calculate the distance from Shanghai to Saga. Here are two standard methods:

Vincenty's formula (applied above)
  • 542.183 miles
  • 872.558 kilometers
  • 471.144 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
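For readers who want to reproduce the ellipsoidal figure, here is a minimal sketch using the pyproj library. Note that pyproj's Geod implements Karney's geodesic algorithm rather than Vincenty's, but both model the earth as the WGS-84 ellipsoid and agree to well under a meter at this range. The decimal coordinates are converted from the DMS values listed under "Airport information" below.

```python
# Ellipsoidal (WGS-84) distance between SHA and HSG using pyproj.
# Karney's algorithm, not Vincenty's, but the results are comparable.
from pyproj import Geod

geod = Geod(ellps="WGS84")

# (lat, lon) in decimal degrees, from the airport coordinates below.
sha = (31.19778, 121.33583)
hsg = (33.14944, 130.30194)

# inv() takes lon/lat order and returns forward azimuth, back azimuth,
# and distance in meters.
_, _, meters = geod.inv(sha[1], sha[0], hsg[1], hsg[0])

print(f"{meters / 1609.344:.3f} miles")          # ~542 miles
print(f"{meters / 1000:.3f} kilometers")         # ~873 km
print(f"{meters / 1852:.3f} nautical miles")     # ~471 NM
```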

Haversine formula
  • 541.223 miles
  • 871.015 kilometers
  • 470.310 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest path between two points along the surface of a sphere).
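The haversine formula is simple enough to implement directly. A minimal sketch, assuming the conventional mean earth radius of 6,371 km:

```python
# Great-circle (haversine) distance, assuming a 6,371 km mean radius.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

# SHA to HSG, using the coordinates from "Airport information" below.
km = haversine_km(31.19778, 121.33583, 33.14944, 130.30194)
print(f"{km:.3f} km")             # ~871 km
print(f"{km / 1.609344:.3f} mi")  # ~541 miles
```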

How long does it take to fly from Shanghai to Saga?

The estimated flight time from Shanghai Hongqiao International Airport to Saga Airport is 1 hour and 31 minutes.
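The calculator's exact timing model isn't published. A common rough approach is distance over an assumed average cruise speed plus a fixed allowance for taxi, climb, and descent; the sketch below uses assumed parameters and lands in the same ballpark as the estimate above.

```python
# A rough flight-time model, not the calculator's actual formula.
# Both the cruise speed and the fixed overhead are assumptions.
def estimated_flight_minutes(miles, cruise_mph=500, overhead_min=30):
    return overhead_min + miles / cruise_mph * 60

minutes = estimated_flight_minutes(542)
print(f"~{int(minutes // 60)} h {int(minutes % 60)} min")  # ~1 h 35 min
```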

Flight carbon footprint between Shanghai Hongqiao International Airport (SHA) and Saga Airport (HSG)

On average, flying from Shanghai to Saga generates about 105 kg (231 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
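One way such a figure can be approximated is distance multiplied by a per-passenger-mile emission factor. The factor below is simply back-calculated from the numbers above; it is an assumption, not the calculator's published methodology, and real factors vary by aircraft type and load factor.

```python
# Rough per-passenger CO2 estimate from distance alone.
KG_PER_PASSENGER_MILE = 0.194  # assumption, back-calculated from above
KG_TO_LBS = 2.20462

miles = 542
kg = miles * KG_PER_PASSENGER_MILE
print(f"{kg:.0f} kg CO2 (~{kg * KG_TO_LBS:.0f} lbs)")  # ~105 kg (~232 lbs)
```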

Map of flight path and driving directions from Shanghai to Saga

See the map of the shortest flight path between Shanghai Hongqiao International Airport (SHA) and Saga Airport (HSG).

Airport information

Origin: Shanghai Hongqiao International Airport
City: Shanghai
Country: China
IATA Code: SHA
ICAO Code: ZSSS
Coordinates: 31°11′52″N, 121°20′9″E
Destination: Saga Airport
City: Saga
Country: Japan
IATA Code: HSG
ICAO Code: RJFS
Coordinates: 33°8′58″N, 130°18′7″E
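The coordinates above are given in degrees, minutes, and seconds. A small helper for converting them to the decimal degrees used in the earlier sketches:

```python
# Convert DMS coordinates to decimal degrees.
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(31, 11, 52, "N"))  # ≈ 31.19778 (SHA latitude)
print(dms_to_decimal(121, 20, 9, "E"))  # ≈ 121.33583 (SHA longitude)
```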