
How far is Xi'an from Johannesburg?

The distance between Johannesburg (Lanseria International Airport) and Xi'an (Xi'an Xianyang International Airport) is 6725 miles / 10823 kilometers / 5844 nautical miles.
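These three figures are the same distance expressed in different units. As a quick sketch, the conversions follow from the exact definitions 1 mile = 1.609344 km and 1 nautical mile = 1.852 km:

```python
# Convert the quoted statute-mile distance into kilometres and nautical miles.
MILES_TO_KM = 1.609344   # exact definition of the international mile
KM_PER_NMI = 1.852       # exact definition of the nautical mile

miles = 6725.061
km = miles * MILES_TO_KM
nmi = km / KM_PER_NMI

print(f"{km:.3f} km, {nmi:.3f} nmi")  # ≈ 10822.937 km, 5843.9 nmi
```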

Lanseria International Airport – Xi'an Xianyang International Airport


Distance from Johannesburg to Xi'an

There are several ways to calculate the distance from Johannesburg to Xi'an. Here are two standard methods:

Vincenty's formula (applied above)
  • 6725.061 miles
  • 10822.937 kilometers
  • 5843.919 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
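A self-contained Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid is below. The airport coordinates are the decimal form of those listed in the airport information section; the helper name `vincenty_inverse_m` is ours, not the calculator's:

```python
import math

def vincenty_inverse_m(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):  # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
            B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# HLA -> XIY in decimal degrees (south latitude is negative)
km = vincenty_inverse_m(-25.938333, 27.925833, 34.446944, 108.751944) / 1000
print(f"{km:.1f} km")  # close to the 10822.937 km quoted above
```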

Haversine formula
  • 6731.247 miles
  • 10832.892 kilometers
  • 5849.294 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
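The haversine formula is much shorter than Vincenty's. A minimal sketch, assuming the commonly used mean Earth radius of 6371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# HLA -> XIY in decimal degrees (south latitude is negative)
d = haversine_km(-25.938333, 27.925833, 34.446944, 108.751944)
print(f"{d:.1f} km")  # close to the ~10833 km quoted above
```

The small gap between this result and Vincenty's comes entirely from the spherical approximation.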

How long does it take to fly from Johannesburg to Xi'an?

The estimated flight time from Lanseria International Airport to Xi'an Xianyang International Airport is 13 hours and 13 minutes.
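A 13 h 13 min block time over 6725 miles implies an average speed of roughly 510 mph. A hedged sketch of this kind of estimate follows; the 510 mph average is our assumption for illustration, not a figure published by the calculator:

```python
def estimated_flight_time(distance_miles, avg_speed_mph=510):
    # avg_speed_mph is an assumed long-haul average block speed,
    # not a value taken from the calculator.
    total_minutes = round(distance_miles / avg_speed_mph * 60)
    hours, minutes = divmod(total_minutes, 60)
    return f"{hours} hours and {minutes} minutes"

print(estimated_flight_time(6725))  # → 13 hours and 11 minutes at the assumed speed
```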

Flight carbon footprint between Lanseria International Airport (HLA) and Xi'an Xianyang International Airport (XIY)

On average, flying from Johannesburg to Xi'an generates about 816 kg of CO2 per passenger; 816 kilograms equals about 1,800 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
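The kilogram-to-pound conversion in the paragraph above is easy to check (1 kg ≈ 2.2046 lb):

```python
KG_TO_LB = 2.2046226218  # pounds per kilogram
co2_kg = 816
print(round(co2_kg * KG_TO_LB))  # → 1799, i.e. about 1,800 lbs
```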

Map of flight path from Johannesburg to Xi'an

See the map of the shortest flight path between Lanseria International Airport (HLA) and Xi'an Xianyang International Airport (XIY).

Airport information

Origin Lanseria International Airport
City: Johannesburg
Country: South Africa
IATA Code: HLA
ICAO Code: FALA
Coordinates: 25°56′18″S, 27°55′33″E
Destination Xi'an Xianyang International Airport
City: Xi'an
Country: China
IATA Code: XIY
ICAO Code: ZLXY
Coordinates: 34°26′49″N, 108°45′7″E
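The coordinates above are given in degrees, minutes, and seconds; the distance formulas earlier in the page take decimal degrees. The conversion is straightforward (southern latitudes and western longitudes become negative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Lanseria (HLA): 25°56′18″S, 27°55′33″E
print(dms_to_decimal(25, 56, 18, "S"), dms_to_decimal(27, 55, 33, "E"))
# Xi'an Xianyang (XIY): 34°26′49″N, 108°45′7″E
print(dms_to_decimal(34, 26, 49, "N"), dms_to_decimal(108, 45, 7, "E"))
```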