
How far is Nanjing from Hiroshima?

The distance between Hiroshima (Hiroshima Airport) and Nanjing (Nanjing Lukou International Airport) is 836 miles / 1345 kilometers / 726 nautical miles.

The driving distance from Hiroshima (HIJ) to Nanjing (NKG) is 1924 miles / 3097 kilometers, and travel time by car is about 39 hours 18 minutes.

Hiroshima Airport – Nanjing Lukou International Airport

836 miles · 1345 kilometers · 726 nautical miles


Distance from Hiroshima to Nanjing

There are several ways to calculate the distance from Hiroshima to Nanjing. Here are two standard methods:

Vincenty's formula (applied above)
  • 835.562 miles
  • 1344.707 kilometers
  • 726.083 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
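Vincenty's inverse method iterates on the geodesic between the two points until the longitude difference converges. The sketch below is a standard textbook implementation on the WGS-84 ellipsoid, using the airport coordinates listed further down converted to decimal degrees; it is not necessarily the exact code behind the figures above.

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns distance in km."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    u1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
    u2 = atan((1 - f) * tan(radians(lat2)))
    big_l = radians(lon2 - lon1)
    su1, cu1, su2, cu2 = sin(u1), cos(u1), sin(u2), cos(u2)
    lam = big_l
    for _ in range(200):                     # iterate until lambda converges
        sl, cl = sin(lam), cos(lam)
        sin_sigma = sqrt((cu2 * sl) ** 2 + (cu1 * su2 - su1 * cu2 * cl) ** 2)
        cos_sigma = su1 * su2 + cu1 * cu2 * cl
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cu1 * cu2 * sl / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * su1 * su2 / cos2_alpha
        c = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = big_l + (1 - c) * f * sin_alpha * (
            sigma + c * sin_sigma * (cos_2sm + c * cos_sigma * (-1 + 2 * cos_2sm ** 2))
        )
        if abs(lam - lam_prev) < 1e-12:
            break
    u_sq = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    big_a = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    big_b = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    d_sigma = big_b * sin_sigma * (
        cos_2sm + big_b / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - big_b / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)
        )
    )
    return b * big_a * (sigma - d_sigma) / 1000.0

# HIJ (34°26′9″N, 132°55′8″E) and NKG (31°44′31″N, 118°51′43″E) in decimal degrees
print(round(vincenty_km(34.4358, 132.9189, 31.7419, 118.8619), 1))
```

The iteration typically converges in a handful of passes for points this close together.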

Haversine formula
  • 833.990 miles
  • 1342.177 kilometers
  • 724.718 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
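The haversine calculation is compact enough to show in full. This sketch uses a mean Earth radius of 6371 km and the airport coordinates listed below, converted to decimal degrees:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km on a sphere of mean radius r."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * r * asin(sqrt(a))

# HIJ (34°26′9″N, 132°55′8″E) and NKG (31°44′31″N, 118°51′43″E) in decimal degrees
km = haversine_km(34.4358, 132.9189, 31.7419, 118.8619)
print(round(km, 1))  # ≈ 1342 km, matching the figure above
```

Because it assumes a sphere, the result differs from Vincenty's ellipsoidal figure by a few kilometers.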

How long does it take to fly from Hiroshima to Nanjing?

The estimated flight time from Hiroshima Airport to Nanjing Lukou International Airport is 2 hours and 4 minutes.
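The site's exact formula isn't given. A common rule of thumb adds a fixed overhead for taxi, climb, and descent to cruise time at a typical jet speed; the 500 mph cruise speed and 30-minute overhead below are illustrative assumptions, not the calculator's published parameters:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb flight time: cruise at a typical jet speed plus a fixed overhead."""
    return overhead_min + distance_miles / cruise_mph * 60

mins = estimated_flight_minutes(836)
print(f"{int(mins // 60)} h {int(mins % 60)} min")  # ≈ 2 h 10 min, close to the 2 h 4 min above
```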

Flight carbon footprint between Hiroshima Airport (HIJ) and Nanjing Lukou International Airport (NKG)

On average, flying from Hiroshima to Nanjing generates about 138 kg of CO2 per passenger, which is equal to 304 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
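Estimates like this are typically distance times a per-passenger-kilometer emission factor. The factor below (~0.103 kg CO2 per passenger-km) is simply back-derived from the figures above, not the calculator's published methodology:

```python
KG_PER_PAX_KM = 138 / 1345   # factor implied by the figures above (~0.103 kg CO2/pax-km)
LBS_PER_KG = 2.20462

def co2_kg(distance_km, factor=KG_PER_PAX_KM):
    """Estimated per-passenger CO2 in kg for a flight of the given length."""
    return distance_km * factor

kg = co2_kg(1345)
print(round(kg), round(kg * LBS_PER_KG))  # 138 kg ≈ 304 lbs
```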

Map of flight path and driving directions from Hiroshima to Nanjing

See the map of the shortest flight path between Hiroshima Airport (HIJ) and Nanjing Lukou International Airport (NKG).

Airport information

Origin: Hiroshima Airport
City: Hiroshima
Country: Japan
IATA Code: HIJ
ICAO Code: RJOA
Coordinates: 34°26′9″N, 132°55′8″E
Destination: Nanjing Lukou International Airport
City: Nanjing
Country: China
IATA Code: NKG
ICAO Code: ZSNJ
Coordinates: 31°44′31″N, 118°51′43″E
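The coordinates above are in degrees-minutes-seconds, but the distance formulas need decimal degrees. A small sketch of the conversion (the regex assumes the exact `°`/`′`/`″` notation used on this page):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 34°26′9″N to signed decimal degrees."""
    m = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    deg, mnt, sec, hemi = int(m.group(1)), int(m.group(2)), int(m.group(3)), m.group(4)
    sign = -1 if hemi in "SW" else 1
    return sign * (deg + mnt / 60 + sec / 3600)

print(round(dms_to_decimal("34°26′9″N"), 4))   # 34.4358
print(round(dms_to_decimal("132°55′8″E"), 4))  # 132.9189
```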