
How far is Yanji from Saga?

The distance between Saga (Saga Airport) and Yanji (Yanji Chaoyangchuan International Airport) is 673 miles / 1083 kilometers / 585 nautical miles.

The driving distance from Saga (HSG) to Yanji (YNJ) is 988 miles / 1590 kilometers, and travel time by car is about 23 hours 43 minutes.


Distance from Saga to Yanji

There are several ways to calculate the distance from Saga to Yanji. Here are two standard methods:

Vincenty's formula (applied above)
  • 672.891 miles
  • 1082.913 kilometers
  • 584.726 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
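For reference, here is a minimal Python sketch of Vincenty's inverse method. The page does not say which ellipsoid or convergence settings it uses, so the WGS-84 parameters and tolerance below are assumptions:

```python
import math

# WGS-84 ellipsoid parameters (assumed; the page does not name its ellipsoid)
A_AXIS = 6378137.0                 # semi-major axis in metres
F = 1 / 298.257223563              # flattening
B_AXIS = (1 - F) * A_AXIS          # semi-minor axis in metres

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance in metres between two points given in degrees."""
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                                      # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero on equatorial lines where cos2_alpha == 0
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < tol:
            break                                           # converged

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (2 * cos_2sm ** 2 - 1)
        - B / 6 * cos_2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
    return B_AXIS * A * (sigma - delta_sigma)

# HSG and YNJ coordinates from the airport table below, in decimal degrees
metres = vincenty_inverse(33.149444, 130.301944, 42.882778, 129.450833)
print(f"{metres / 1609.344:.3f} mi, {metres / 1000:.3f} km")  # ≈ 672.9 mi, 1082.9 km
```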

Haversine formula
  • 674.076 miles
  • 1084.820 kilometers
  • 585.756 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
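A minimal haversine sketch, assuming the commonly used mean earth radius of 6371 km (which reproduces the kilometre figure above to within rounding):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Same HSG and YNJ coordinates as above
print(f"{haversine_km(33.149444, 130.301944, 42.882778, 129.450833):.3f} km")  # ≈ 1084.8
```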

How long does it take to fly from Saga to Yanji?

The estimated flight time from Saga Airport to Yanji Chaoyangchuan International Airport is 1 hour and 46 minutes.
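Estimates like this are usually produced by dividing the distance by an assumed average speed and adding a fixed allowance for taxi, climb, and descent. The parameters below are assumptions, not the page's stated method; note they yield roughly 1 hour 51 minutes rather than the 1 hour 46 minutes quoted above, so the page likely uses slightly different values:

```python
# Both parameters are assumptions; the page does not state its exact method.
CRUISE_MPH = 500       # assumed average block speed for a commercial airliner
ALLOWANCE_MIN = 30     # assumed fixed allowance for taxi, climb, and descent

distance_miles = 673
total_min = distance_miles / CRUISE_MPH * 60 + ALLOWANCE_MIN
print(f"~{int(total_min // 60)} h {round(total_min % 60)} min")  # ~1 h 51 min
```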

Flight carbon footprint between Saga Airport (HSG) and Yanji Chaoyangchuan International Airport (YNJ)

On average, flying from Saga to Yanji generates about 122 kg of CO2 per passenger, which is equivalent to about 268 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
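A figure like this can be approximated by multiplying the flight distance by an average per-passenger emission factor. The factor below is back-calculated from the page's own numbers (122 kg over 1083 km) and is an illustrative assumption, not a published coefficient:

```python
# Emission factor back-calculated from the page's figures (122 kg / 1083 km);
# treat it as an illustrative assumption, not a published coefficient.
KG_CO2_PER_PAX_KM = 122 / 1083

distance_km = 1083
kg = distance_km * KG_CO2_PER_PAX_KM
print(f"~{kg:.0f} kg CO2 per passenger (~{kg * 2.20462:.0f} lb)")  # ~122 kg (~269 lb)
```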

Map of flight path and driving directions from Saga to Yanji

See the map of the shortest flight path between Saga Airport (HSG) and Yanji Chaoyangchuan International Airport (YNJ).

Airport information

Origin: Saga Airport
City: Saga
Country: Japan
IATA Code: HSG
ICAO Code: RJFS
Coordinates: 33°8′58″N, 130°18′7″E

Destination: Yanji Chaoyangchuan International Airport
City: Yanji
Country: China
IATA Code: YNJ
ICAO Code: ZYYJ
Coordinates: 42°52′58″N, 129°27′3″E
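The coordinates above are given in degrees, minutes, and seconds, while formulas such as Vincenty's and the haversine expect decimal degrees. A minimal sketch of that conversion:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Saga Airport (HSG): 33°8′58″N, 130°18′7″E
print(dms_to_decimal(33, 8, 58, "N"), dms_to_decimal(130, 18, 7, "E"))
# Yanji Chaoyangchuan (YNJ): 42°52′58″N, 129°27′3″E
print(dms_to_decimal(42, 52, 58, "N"), dms_to_decimal(129, 27, 3, "E"))
```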