
How far is Saga from Hangzhou?

The distance between Hangzhou (Hangzhou Xiaoshan International Airport) and Saga (Saga Airport) is 615 miles / 990 kilometers / 534 nautical miles.

The driving distance from Hangzhou (HGH) to Saga (HSG) is 1918 miles / 3086 kilometers, and travel time by car is about 39 hours 3 minutes.

Hangzhou Xiaoshan International Airport – Saga Airport

  • 615 miles
  • 990 kilometers
  • 534 nautical miles


Distance from Hangzhou to Saga

There are several ways to calculate the distance from Hangzhou to Saga. Here are two standard methods:

Vincenty's formula (applied above)
  • 614.863 miles
  • 989.526 kilometers
  • 534.301 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
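As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are the decimal-degree equivalents of the airport coordinates listed below; the tolerance and iteration cap are arbitrary choices for this sketch.

```python
import math

def vincenty_inverse_km(lat1, lon1, lat2, lon2,
                        a=6378137.0, f=1 / 298.257223563,
                        tol=1e-12, max_iter=200):
    """Distance in km on the WGS-84 ellipsoid via Vincenty's inverse formula."""
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = math.sin(U1), math.cos(U1), math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16.0 * cos2_alpha * (4.0 + f * (4.0 - 3.0 * cos2_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m +
                                     C * cos_sigma * (-1.0 + 2.0 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:                         # converged
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384.0 * (4096.0 + u2 * (-768.0 + u2 * (320.0 - 175.0 * u2)))
    B = u2 / 1024.0 * (256.0 + u2 * (-128.0 + u2 * (74.0 - 47.0 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4.0 * (
        cos_sigma * (-1.0 + 2.0 * cos_2sigma_m ** 2) -
        B / 6.0 * cos_2sigma_m * (-3.0 + 4.0 * sin_sigma ** 2) *
        (-3.0 + 4.0 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# HGH and HSG in decimal degrees (converted from the coordinates listed below)
km = vincenty_inverse_km(30.2294, 120.4339, 33.1494, 130.3019)
print(f"{km / 1.609344:.3f} mi / {km:.3f} km / {km / 1.852:.3f} NM")
```

With these inputs the result should land close to the 989.5 km quoted above. Note that this simple iteration can fail to converge for near-antipodal points, which is why production geodesy libraries typically use Karney's algorithm instead.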

Haversine formula
  • 613.929 miles
  • 988.023 kilometers
  • 533.490 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
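For comparison, a haversine sketch is shown below. It needs only the standard library and a mean Earth radius; 6371 km is the commonly used value and is an assumption here.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a spherical Earth (haversine formula)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2.0 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(30.2294, 120.4339, 33.1494, 130.3019)   # HGH -> HSG
print(f"{km / 1.609344:.3f} mi / {km:.3f} km / {km / 1.852:.3f} NM")
```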

How long does it take to fly from Hangzhou to Saga?

The estimated flight time from Hangzhou Xiaoshan International Airport to Saga Airport is 1 hour and 39 minutes.
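The page does not state how the flight time is derived. A common rough model adds a fixed taxi/climb/descent allowance to cruise time at an assumed average speed; the constants below are assumptions for illustration, so the result only approximates the 1 hour 39 minutes quoted above.

```python
GREAT_CIRCLE_MILES = 615      # distance from the calculation above
AVG_SPEED_MPH = 500           # assumed average speed (assumption)
OVERHEAD_MIN = 30             # assumed taxi/climb/descent allowance (assumption)

total_min = OVERHEAD_MIN + GREAT_CIRCLE_MILES / AVG_SPEED_MPH * 60
print(f"estimated flight time: {int(total_min // 60)} h {int(total_min % 60)} min")
# -> roughly 1 h 43 min with these assumed constants
```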

Flight carbon footprint between Hangzhou Xiaoshan International Airport (HGH) and Saga Airport (HSG)

On average, flying from Hangzhou to Saga generates about 115 kg of CO2 per passenger, which is equivalent to about 253 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
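As a quick check on the numbers, the sketch below converts the quoted 115 kg to pounds and applies the factor of roughly 3.16 kg of CO2 produced per kg of jet fuel burned that such estimates are commonly based on; the implied fuel burn per passenger is derived here purely for illustration.

```python
KG_TO_LBS = 2.20462
CO2_PER_KG_FUEL = 3.16        # kg of CO2 per kg of jet fuel burned (combustion only)

co2_kg = 115                  # per-passenger figure quoted above
print(f"{co2_kg} kg CO2 ≈ {co2_kg * KG_TO_LBS:.1f} lbs")                  # ≈ 253.5 lbs
print(f"implied fuel burn ≈ {co2_kg / CO2_PER_KG_FUEL:.0f} kg per passenger")
```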

Map of flight path and driving directions from Hangzhou to Saga

See the map of the shortest flight path between Hangzhou Xiaoshan International Airport (HGH) and Saga Airport (HSG).

Airport information

Origin: Hangzhou Xiaoshan International Airport
City: Hangzhou
Country: China
IATA Code: HGH
ICAO Code: ZSHC
Coordinates: 30°13′46″N, 120°26′2″E
Destination: Saga Airport
City: Saga
Country: Japan
IATA Code: HSG
ICAO Code: RJFS
Coordinates: 33°8′58″N, 130°18′7″E