
How far is Hamilton Island from Nanjing?

The distance between Nanjing (Nanjing Lukou International Airport) and Hamilton Island (Great Barrier Reef Airport) is 4100 miles / 6599 kilometers / 3563 nautical miles.

Nanjing Lukou International Airport – Great Barrier Reef Airport
  • 4100 miles
  • 6599 kilometers
  • 3563 nautical miles


Distance from Nanjing to Hamilton Island

There are several ways to calculate the distance from Nanjing to Hamilton Island. Here are two standard methods:

Vincenty's formula (applied above)
  • 4100.365 miles
  • 6598.898 kilometers
  • 3563.120 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
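
The calculator does not publish its implementation, but Vincenty's inverse method is well documented. Below is a minimal Python sketch of it on the WGS-84 ellipsoid; the function name, iteration limit, and tolerance are illustrative choices, and the method can fail to converge for nearly antipodal points.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344   # meters -> statute miles

# NKG and HTI in decimal degrees (converted from the coordinates listed
# under "Airport information" below).
print(round(vincenty_miles(31.742, 118.862, -20.358, 148.952), 1))  # ≈ 4100.4
```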

Haversine formula
  • 4114.275 miles
  • 6621.284 kilometers
  • 3575.207 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface of a sphere).
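
A minimal Python sketch of the haversine formula; the 3958.8-mile Earth radius is an assumed mean value (6371 km), so the result may differ slightly from the figure above depending on the radius the calculator uses.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    R = 3958.8                 # assumed mean Earth radius in statute miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# NKG to HTI, same decimal coordinates as before.
print(round(haversine_miles(31.742, 118.862, -20.358, 148.952), 1))  # ≈ 4114
```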

How long does it take to fly from Nanjing to Hamilton Island?

The estimated flight time from Nanjing Lukou International Airport to Great Barrier Reef Airport is 8 hours and 15 minutes.
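
The page does not state how this estimate is derived. A common rule of thumb divides the great-circle distance by an average block speed of roughly 500 mph; a minimal sketch under that assumption (the speed constant is illustrative, which is why the result only approximates the quoted time):

```python
def estimate_flight_time(distance_miles, avg_speed_mph=500):
    # avg_speed_mph is an assumed average block speed, not the
    # calculator's published constant.
    total_minutes = round(distance_miles / avg_speed_mph * 60)
    hours, minutes = divmod(total_minutes, 60)
    return f"{hours} hours and {minutes} minutes"

print(estimate_flight_time(4100.365))  # "8 hours and 12 minutes"
```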

Flight carbon footprint between Nanjing Lukou International Airport (NKG) and Great Barrier Reef Airport (HTI)

On average, flying from Nanjing to Hamilton Island generates about 469 kg of CO2 per passenger; 469 kilograms equals 1,034 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
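
The pound figure is just a unit conversion of the kilogram estimate, since one pound is defined as exactly 0.45359237 kg:

```python
KG_PER_LB = 0.45359237        # exact definition of the avoirdupois pound

co2_kg = 469                  # per-passenger estimate quoted above
print(round(co2_kg / KG_PER_LB))  # -> 1034 lbs
```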

Map of flight path from Nanjing to Hamilton Island

See the map of the shortest flight path between Nanjing Lukou International Airport (NKG) and Great Barrier Reef Airport (HTI).

Airport information

Origin: Nanjing Lukou International Airport
City: Nanjing
Country: China
IATA Code: NKG
ICAO Code: ZSNJ
Coordinates: 31°44′31″N, 118°51′43″E
Destination: Great Barrier Reef Airport
City: Hamilton Island
Country: Australia
IATA Code: HTI
ICAO Code: YBHM
Coordinates: 20°21′29″S, 148°57′7″E
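
To use these coordinates with the distance formulas above, convert them from degrees/minutes/seconds to decimal degrees; a small sketch (the helper name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # South and West hemispheres take negative decimal values.
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(round(dms_to_decimal(31, 44, 31, "N"), 3))  # NKG latitude ->  31.742
print(round(dms_to_decimal(20, 21, 29, "S"), 3))  # HTI latitude -> -20.358
```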