How far is Saga from Kabul?

The distance between Kabul (Kabul International Airport) and Saga (Saga Airport) is 3458 miles / 5565 kilometers / 3005 nautical miles.

The driving distance from Kabul (KBL) to Saga (HSG) is 4638 miles / 7464 kilometers, and travel time by car is about 90 hours 9 minutes.

Kabul International Airport – Saga Airport

Distance: 3458 miles / 5565 kilometers / 3005 nautical miles
Flight time: 7 h 2 min
Time difference: 4 h 30 min
CO2 emission: 390 kg

Distance from Kabul to Saga

There are several ways to calculate the distance from Kabul to Saga. Here are two standard methods:

Vincenty's formula (applied above)
  • 3458.139 miles
  • 5565.336 kilometers
  • 3005.041 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
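For readers who want to reproduce this number, below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name, convergence tolerance, and iteration cap are illustrative choices, not taken from this page.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2,
                      a=6378137.0, f=1 / 298.257223563,
                      tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns meters."""
    b = (1 - f) * a
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos2_alpha is 0 only for purely equatorial lines; guard the division
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m
            * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# KBL and HSG coordinates from the airport information section below
print(vincenty_distance(34.565833, 69.212222, 33.149444, 130.301944) / 1000)
# ≈ 5565 km, in line with the figure quoted above
```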

Haversine formula
  • 3450.717 miles
  • 5553.391 kilometers
  • 2998.591 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
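As a sketch, the haversine formula fits in a few lines of Python. The 6371 km mean Earth radius is a common convention; a slightly different radius choice explains any small deviation from the figures above.

```python
import math

def haversine(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in kilometers on a sphere of radius r (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# KBL and HSG coordinates from the airport information section below
print(haversine(34.565833, 69.212222, 33.149444, 130.301944))
# ≈ 5553 km, close to the 5553.391 km quoted above
```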

How long does it take to fly from Kabul to Saga?

The estimated flight time from Kabul International Airport to Saga Airport is 7 hours and 2 minutes.
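The page does not state the speed assumption behind this estimate, but the two figures it gives imply an average of roughly 490 mph, as this small check shows:

```python
distance_mi = 3458           # great-circle distance from this page
flight_time_h = 7 + 2 / 60   # the 7 h 2 min estimate above
print(distance_mi / flight_time_h)  # ≈ 492 mph implied average speed
```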

Flight carbon footprint between Kabul International Airport (KBL) and Saga Airport (HSG)

On average, flying from Kabul to Saga generates about 390 kg of CO2 per passenger; 390 kilograms is equal to 859 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
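The kilogram-to-pound conversion is easy to verify; the constant below is the exact definition of the avoirdupois pound:

```python
co2_kg = 390                 # per-passenger estimate from this page
KG_PER_LB = 0.45359237       # exact kilograms per pound
print(co2_kg / KG_PER_LB)    # ≈ 859.8 lb, matching the ~859 lb above
```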

Map of flight path and driving directions from Kabul to Saga

See the map of the shortest flight path between Kabul International Airport (KBL) and Saga Airport (HSG).

Airport information

Origin: Kabul International Airport
City: Kabul
Country: Afghanistan
IATA Code: KBL
ICAO Code: OAKB
Coordinates: 34°33′57″N, 69°12′44″E
Destination: Saga Airport
City: Saga
Country: Japan
IATA Code: HSG
ICAO Code: RJFS
Coordinates: 33°8′58″N, 130°18′7″E
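The coordinates above are given in degrees-minutes-seconds; the decimal values used in the earlier code sketches can be derived with a small, illustrative helper:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Airport coordinates from the listing above
kbl = (dms_to_decimal(34, 33, 57, "N"), dms_to_decimal(69, 12, 44, "E"))
hsg = (dms_to_decimal(33, 8, 58, "N"), dms_to_decimal(130, 18, 7, "E"))
print(kbl)  # ≈ (34.565833, 69.212222)
print(hsg)  # ≈ (33.149444, 130.301944)
```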