
How far is Windsor Locks, CT, from Nanjing?

The distance between Nanjing (Nanjing Lukou International Airport) and Windsor Locks (Bradley International Airport) is 7308 miles / 11761 kilometers / 6350 nautical miles.


Distance from Nanjing to Windsor Locks

There are several ways to calculate the distance from Nanjing to Windsor Locks. Here are two standard methods:

Vincenty's formula (applied above)
  • 7307.945 miles
  • 11760.997 kilometers
  • 6350.430 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the Earth's surface using an ellipsoidal model of the planet, which accounts for the Earth's flattening and is generally more accurate than a spherical model.
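
For illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. It is a textbook implementation, not necessarily the exact code behind this calculator, and it omits the special handling needed for coincident or nearly antipodal points. The coordinates are the NKG and BDL coordinates listed in the airport information section below.

    from math import atan, atan2, cos, radians, sin, sqrt, tan

    def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Vincenty inverse formula on WGS-84; no handling of the
        coincident-point or near-antipodal failure cases."""
        a, f = 6378137.0, 1 / 298.257223563      # WGS-84 semi-major axis (m), flattening
        b = (1 - f) * a                          # semi-minor axis
        U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
        U2 = atan((1 - f) * tan(radians(lat2)))
        L = radians(lon2 - lon1)
        sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = sin(lam), cos(lam)
            sin_sigma = sqrt((cosU2 * sin_lam) ** 2
                             + (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sig_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sig_m + C * cos_sigma * (-1 + 2 * cos_2sig_m ** 2)))
            if abs(lam - lam_prev) < tol:        # converged
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sig_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sig_m ** 2)
            - B / 6 * cos_2sig_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sig_m ** 2)))
        return b * A * (sigma - d_sigma) / 1000.0  # metres -> kilometres

    # NKG (31°44′31″N, 118°51′43″E) to BDL (41°56′20″N, 72°40′59″W)
    print(f"{vincenty_km(31.7419, 118.8619, 41.9389, -72.6831):.0f} km")  # ≈ 11761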

Haversine formula
  • 7293.255 miles
  • 11737.356 kilometers
  • 6337.665 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical Earth, giving the great-circle distance (the shortest path between two points on the sphere's surface).
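
A compact Python sketch of the haversine computation follows. The 6371 km mean Earth radius is an assumption (the radius this calculator uses is not stated), but with the airport coordinates below it reproduces the haversine figures above to within rounding.

    from math import asin, cos, radians, sin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two lat/lon points on a sphere."""
        phi1, lam1, phi2, lam2 = map(radians, (lat1, lon1, lat2, lon2))
        a = (sin((phi2 - phi1) / 2) ** 2
             + cos(phi1) * cos(phi2) * sin((lam2 - lam1) / 2) ** 2)
        return 2 * radius_km * asin(sqrt(a))

    km = haversine_km(31.7419, 118.8619, 41.9389, -72.6831)  # NKG -> BDL
    print(f"{km:.0f} km ≈ {km * 0.621371:.0f} mi ≈ {km / 1.852:.0f} nmi")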

How long does it take to fly from Nanjing to Windsor Locks?

The estimated flight time from Nanjing Lukou International Airport to Bradley International Airport is 14 hours and 20 minutes.
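
The page does not state how this estimate is derived; a plausible sketch is the great-circle distance divided by an assumed average block speed. The roughly 510 mph used below is an assumption chosen to reproduce the 14 hours and 20 minutes quoted above, not a published parameter of this calculator.

    def flight_time(distance_miles, avg_speed_mph=510):
        """Rough flight-time estimate: distance over an assumed average speed."""
        hours = distance_miles / avg_speed_mph
        h, m = int(hours), round((hours - int(hours)) * 60)
        if m == 60:                    # guard against rounding up to a full hour
            h, m = h + 1, 0
        return f"{h} hours and {m} minutes"

    print(flight_time(7308))           # -> 14 hours and 20 minutes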

Flight carbon footprint between Nanjing Lukou International Airport (NKG) and Bradley International Airport (BDL)

On average, flying from Nanjing to Windsor Locks generates about 898 kg of CO2 per passenger, equivalent to roughly 1981 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
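
The arithmetic behind that figure can be sketched as follows. The per-passenger-mile emission factor is inferred from this page's own numbers (898 kg over 7308 miles, about 0.123 kg per mile), not an official coefficient, and the kg-to-lb conversion uses the standard 2.20462 factor; the page's 1981 lbs likely reflects an unrounded kilogram value.

    KG_PER_PASSENGER_MILE = 898 / 7308       # ≈ 0.1229, inferred from this route's figures

    distance_miles = 7308
    co2_kg = distance_miles * KG_PER_PASSENGER_MILE   # 898 kg by construction
    co2_lbs = co2_kg * 2.20462                        # standard kg-to-lb factor
    print(f"{co2_kg:.0f} kg CO2 ≈ {co2_lbs:.0f} lbs") # -> 898 kg CO2 ≈ 1980 lbs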

Map of flight path from Nanjing to Windsor Locks

See the map of the shortest flight path between Nanjing Lukou International Airport (NKG) and Bradley International Airport (BDL).

Airport information

Origin: Nanjing Lukou International Airport
City: Nanjing
Country: China
IATA Code: NKG
ICAO Code: ZSNJ
Coordinates: 31°44′31″N, 118°51′43″E
Destination: Bradley International Airport
City: Windsor Locks, CT
Country: United States
IATA Code: BDL
ICAO Code: KBDL
Coordinates: 41°56′20″N, 72°40′59″W
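
The distance formulas above take decimal degrees, while the coordinates here are listed in degrees, minutes, and seconds. A small conversion sketch:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert DMS coordinates to decimal degrees; S and W are negative."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    print(dms_to_decimal(31, 44, 31, "N"))   # NKG latitude  ->  31.7419...
    print(dms_to_decimal(72, 40, 59, "W"))   # BDL longitude -> -72.6831...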