
How far is Tunxi from Johannesburg?

The distance between Johannesburg (Lanseria International Airport) and Tunxi (Huangshan Tunxi International Airport) is 7099 miles / 11425 kilometers / 6169 nautical miles.

Lanseria International Airport – Huangshan Tunxi International Airport

7099 Miles
11425 Kilometers
6169 Nautical miles


Distance from Johannesburg to Tunxi

There are several ways to calculate the distance from Johannesburg to Tunxi. Here are two standard methods:

Vincenty's formula (applied above)
  • 7099.156 miles
  • 11424.985 kilometers
  • 6168.998 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.

Haversine formula
  • 7102.337 miles
  • 11430.103 kilometers
  • 6171.762 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
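The haversine formula described above can be sketched in a few lines of Python. This is a minimal implementation assuming a mean spherical Earth radius of 6,371 km; the decimal coordinates are converted from the DMS values listed in the airport information section below.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# Decimal-degree equivalents of the coordinates in the airport table
hla = (-25.938333, 27.925833)   # Lanseria (HLA)
txn = (29.733056, 118.255833)   # Huangshan Tunxi (TXN)

km = haversine_km(*hla, *txn)
print(f"{km:.0f} km, {km / 1.609344:.0f} mi, {km / 1.852:.0f} nmi")
# → 11430 km, 7102 mi, 6172 nmi
```

With the 6,371 km radius this reproduces the haversine figures quoted above; the Vincenty result differs slightly because it uses an ellipsoidal Earth model rather than a sphere.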

How long does it take to fly from Johannesburg to Tunxi?

The estimated flight time from Lanseria International Airport to Huangshan Tunxi International Airport is 13 hours and 56 minutes.
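The page does not state how this estimate is derived. A back-of-the-envelope model that roughly reproduces it divides the 7099-mile distance by an average block speed; the ~510 mph speed below is my assumption for illustration, not the site's published method.

```python
distance_mi = 7099        # great-circle distance from above
avg_speed_mph = 510       # assumed average block speed (hypothetical)

hours = distance_mi / avg_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"about {h} h {m} min")  # ≈ 13 h 55 min under these assumptions
```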

Flight carbon footprint between Lanseria International Airport (HLA) and Huangshan Tunxi International Airport (TXN)

On average, flying from Johannesburg to Tunxi generates about 869 kg of CO2 per passenger; 869 kilograms equals 1,915 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
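The kilogram-to-pound conversion above can be checked directly; the sketch below uses the exact legal definition of the pound (0.45359237 kg), under which 869 kg rounds to 1,916 lb, so the page's 1,915 lb figure appears to be truncated rather than rounded.

```python
co2_kg = 869                 # per-passenger estimate from the page
KG_PER_LB = 0.45359237       # exact definition of the avoirdupois pound

co2_lb = co2_kg / KG_PER_LB
print(f"{co2_lb:.1f} lb")    # ≈ 1915.8 lb
```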

Map of flight path from Johannesburg to Tunxi

See the map of the shortest flight path between Lanseria International Airport (HLA) and Huangshan Tunxi International Airport (TXN).

Airport information

Origin: Lanseria International Airport
City: Johannesburg
Country: South Africa
IATA Code: HLA
ICAO Code: FALA
Coordinates: 25°56′18″S, 27°55′33″E
Destination: Huangshan Tunxi International Airport
City: Tunxi
Country: China
IATA Code: TXN
ICAO Code: ZSTX
Coordinates: 29°43′59″N, 118°15′21″E
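The coordinates above are given in degrees, minutes, and seconds (DMS). A small sketch of the conversion to the signed decimal degrees used in distance formulas, assuming the exact `°`/`′`/`″` notation shown in this table:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 25°56′18″S to signed decimal degrees."""
    m = re.fullmatch(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    deg, minutes, seconds, hemi = m.groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value   # south/west are negative

print(dms_to_decimal("25°56′18″S"))   # ≈ -25.9383 (Lanseria latitude)
print(dms_to_decimal("118°15′21″E"))  # ≈ 118.2558 (Tunxi longitude)
```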