How far is Bar Harbor, ME, from Nanjing?

The distance between Nanjing (Nanjing Lukou International Airport) and Bar Harbor (Hancock County–Bar Harbor Airport) is 7168 miles / 11535 kilometers / 6229 nautical miles.

Distance from Nanjing to Bar Harbor

There are several ways to calculate the distance from Nanjing to Bar Harbor. Here are two standard methods:

Vincenty's formula (applied above)
  • 7167.671 miles
  • 11535.249 kilometers
  • 6228.536 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
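For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The calculator's exact constants and coordinate precision are not published, so treat the result as approximate; the coordinates are the decimal form of the airport table at the bottom of the page.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles.

    Note: the iteration can fail to converge for nearly antipodal points;
    NKG-BHB is far from that case.
    """
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344   # meters -> statute miles

# NKG and BHB coordinates in decimal degrees (west longitude negative)
print(round(vincenty_miles(31.7419, 118.8619, 44.4500, -68.3614), 3))
# ≈ 7167.7 (within about a mile of the figure above, given rounded coordinates)
```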

Haversine formula
  • 7152.819 miles
  • 11511.347 kilometers
  • 6215.630 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
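A haversine implementation is much shorter. The sketch below assumes the conventional mean Earth radius of 6,371 km (the radius the calculator uses is not stated) and the same decimal coordinates as above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# NKG -> BHB, coordinates from the airport table below
km = haversine_km(31.7419, 118.8619, 44.4500, -68.3614)
print(round(km, 1), round(km / 1.609344, 1), round(km / 1.852, 1))
# ≈ 11511 km, 7152 mi, 6216 nmi (tiny offsets come from rounded coordinates)
```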

How long does it take to fly from Nanjing to Bar Harbor?

The estimated flight time from Nanjing Lukou International Airport to Hancock County–Bar Harbor Airport is 14 hours and 4 minutes.
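The page does not say how this estimate is derived. One simple model is great-circle distance divided by an average block speed; back-solving from the quoted 14 hours 4 minutes over 7,167.7 miles implies an average of roughly 510 mph, which the sketch below adopts purely as an assumption.

```python
# Simple distance/speed model; 510 mph is an assumption back-solved from the
# figures quoted above, not the site's published methodology.
def flight_time(distance_miles: float, avg_speed_mph: float = 510.0) -> str:
    hours = distance_miles / avg_speed_mph
    h, m = divmod(round(hours * 60), 60)
    return f"{h} hours and {m} minutes"

print(flight_time(7167.671))  # -> "14 hours and 3 minutes", within a minute
```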

Flight carbon footprint between Nanjing Lukou International Airport (NKG) and Hancock County–Bar Harbor Airport (BHB)

On average, flying from Nanjing to Bar Harbor generates about 879 kg of CO2 per passenger; 879 kilograms equals about 1,938 pounds (lbs). These figures are estimates that include only the CO2 generated by burning jet fuel.
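As a sanity check on the unit conversion, and to see the per-mile emission rate implied by the figures above (an inference from this page's numbers, not the site's published methodology):

```python
# Pound conversion and implied emission rate, derived only from the figures above.
KG_PER_LB = 0.45359237

co2_kg = 879.0
distance_miles = 7167.671

print(round(co2_kg / KG_PER_LB))           # ≈ 1938 lbs
print(round(co2_kg / distance_miles, 3))   # ≈ 0.123 kg CO2 per passenger-mile
```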

Map of flight path from Nanjing to Bar Harbor

See the map of the shortest flight path between Nanjing Lukou International Airport (NKG) and Hancock County–Bar Harbor Airport (BHB).

Airport information

Origin: Nanjing Lukou International Airport
City: Nanjing
Country: China
IATA Code: NKG
ICAO Code: ZSNJ
Coordinates: 31°44′31″N, 118°51′43″E

Destination: Hancock County–Bar Harbor Airport
City: Bar Harbor, ME
Country: United States
IATA Code: BHB
ICAO Code: KBHB
Coordinates: 44°27′0″N, 68°21′41″W
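To connect this table with the distance sketches above, here is a small converter from the DMS coordinates listed to decimal degrees; the helper name is illustrative, not part of the site.

```python
# Convert degrees/minutes/seconds plus hemisphere to signed decimal degrees.
def dms_to_decimal(deg: int, minutes: int, seconds: int, hemisphere: str) -> float:
    value = deg + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

nkg = (dms_to_decimal(31, 44, 31, "N"), dms_to_decimal(118, 51, 43, "E"))
bhb = (dms_to_decimal(44, 27, 0, "N"), dms_to_decimal(68, 21, 41, "W"))
print(nkg)  # ≈ (31.7419, 118.8619)
print(bhb)  # ≈ (44.45, -68.3614)
```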