
How far is Jiagedaqi from Shirahama?

The distance between Shirahama (Nanki–Shirahama Airport) and Jiagedaqi (Jiagedaqi Airport) is 1287 miles / 2071 kilometers / 1118 nautical miles.

The driving distance from Shirahama (SHM) to Jiagedaqi (JGD) is 1967 miles / 3166 kilometers, and travel time by car is about 40 hours 41 minutes.

Nanki–Shirahama Airport – Jiagedaqi Airport

1287 miles / 2071 kilometers / 1118 nautical miles


Distance from Shirahama to Jiagedaqi

There are several ways to calculate the distance from Shirahama to Jiagedaqi. Here are two standard methods:

Vincenty's formula (applied above)
  • 1286.980 miles
  • 2071.193 kilometers
  • 1118.355 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
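As a rough illustration, an ellipsoidal distance of this kind can be reproduced in Python with pyproj's Geod class (this uses Karney's geodesic algorithm rather than Vincenty's original iteration, but both work on the WGS-84 ellipsoid and agree here to well under a metre); the coordinates are the airport positions listed further down, converted to decimal degrees.

from pyproj import Geod

# WGS-84 ellipsoid, the reference model typically assumed for Vincenty-style calculations
geod = Geod(ellps="WGS84")

# Airport coordinates in decimal degrees (from the airport information section)
shm_lat, shm_lon = 33.6619, 135.3639   # Nanki-Shirahama Airport (SHM)
jgd_lat, jgd_lon = 50.3714, 124.1175   # Jiagedaqi Airport (JGD)

# inv() returns forward azimuth, back azimuth and distance in metres
_, _, dist_m = geod.inv(shm_lon, shm_lat, jgd_lon, jgd_lat)

print(f"{dist_m / 1000:.1f} km")              # roughly 2071 km
print(f"{dist_m / 1609.344:.1f} miles")       # roughly 1287 miles
print(f"{dist_m / 1852:.1f} nautical miles")  # roughly 1118 NM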

Haversine formula
  • 1287.444 miles
  • 2071.941 kilometers
  • 1118.759 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
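For comparison, here is a minimal haversine sketch in Python, assuming a mean Earth radius of 6,371 km and the same airport coordinates as above; it reproduces the great-circle figure to within about a kilometre (the small difference comes from rounding the coordinates and the choice of Earth radius).

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a spherical Earth, in kilometres."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# Shirahama (SHM) to Jiagedaqi (JGD)
d_km = haversine_km(33.6619, 135.3639, 50.3714, 124.1175)
print(f"{d_km:.1f} km / {d_km / 1.609344:.1f} miles / {d_km / 1.852:.1f} NM")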

How long does it take to fly from Shirahama to Jiagedaqi?

The estimated flight time from Nanki–Shirahama Airport to Jiagedaqi Airport is 2 hours and 56 minutes.

Flight carbon footprint between Nanki–Shirahama Airport (SHM) and Jiagedaqi Airport (JGD)

On average, flying from Shirahama to Jiagedaqi generates about 166 kg of CO2 per passenger, which is equivalent to 366 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Shirahama to Jiagedaqi

See the map of the shortest flight path between Nanki–Shirahama Airport (SHM) and Jiagedaqi Airport (JGD).

Airport information

Origin Nanki–Shirahama Airport
City: Shirahama
Country: Japan
IATA Code: SHM
ICAO Code: RJBD
Coordinates: 33°39′43″N, 135°21′50″E
Destination Jiagedaqi Airport
City: Jiagedaqi
Country: China
IATA Code: JGD
ICAO Code: ZYJD
Coordinates: 50°22′17″N, 124°7′3″E