
How far is Zhanjiang from Sakon Nakhon?

The distance between Sakon Nakhon (Sakon Nakhon Airport) and Zhanjiang (Zhanjiang Airport) is 492 miles / 793 kilometers / 428 nautical miles.

The driving distance from Sakon Nakhon (SNO) to Zhanjiang (ZHA) is 772 miles / 1242 kilometers, and travel time by car is about 15 hours 20 minutes.

Sakon Nakhon Airport – Zhanjiang Airport

  • 492 miles
  • 793 kilometers
  • 428 nautical miles
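The three figures above are one distance expressed in three units. Because the international mile (exactly 1,609.344 m) and the nautical mile (exactly 1,852 m) are both defined in metres, converting is a one-liner; a minimal sketch using the unrounded Vincenty distance quoted further down the page:

```python
KM_PER_MILE = 1.609344  # international mile, defined as exactly 1609.344 m
KM_PER_NM = 1.852       # nautical mile, defined as exactly 1852 m

km = 792.555  # unrounded Vincenty distance from this page
print(km / KM_PER_MILE)  # ≈ 492.47 miles
print(km / KM_PER_NM)    # ≈ 427.95 nautical miles
```

Rounded to whole units, these reproduce the 492 mi / 793 km / 428 nm summary above.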


Distance from Sakon Nakhon to Zhanjiang

There are several ways to calculate the distance from Sakon Nakhon to Zhanjiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 492.471 miles
  • 792.555 kilometers
  • 427.945 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
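As a sketch of how such an ellipsoidal calculation works, here is a minimal Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The WGS-84 constants, iteration limit, and convergence tolerance are my assumptions, not necessarily what this site uses:

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty, 1975)."""
    a = 6378137.0          # WGS-84 semi-major axis (m) -- assumed parameters
    f = 1 / 298.257223563  # WGS-84 flattening
    b = (1 - f) * a

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)  # equatorial-line special case
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# SNO and ZHA coordinates from the airport table below, in decimal degrees
sno = (17 + 11/60 + 42/3600, 104 + 7/60 + 8/3600)
zha = (21 + 12/60 + 51/3600, 110 + 21/60 + 28/3600)
print(vincenty_inverse(*sno, *zha) / 1000)  # distance in km, ≈ 792.6
```

Run on the SNO and ZHA coordinates, this reproduces the ~792.555 km figure quoted above to within rounding.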

Haversine formula
  • 492.675 miles
  • 792.883 kilometers
  • 428.122 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
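A minimal sketch of that great-circle calculation in Python, assuming the commonly used mean Earth radius of 6,371 km (the exact radius this site uses is not stated):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance assuming a spherical Earth of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# SNO -> ZHA, coordinates from the airport table below, in decimal degrees
d = haversine_km(17 + 11/60 + 42/3600, 104 + 7/60 + 8/3600,
                 21 + 12/60 + 51/3600, 110 + 21/60 + 28/3600)
print(d)  # ≈ 793 km
```

With this radius the result lands within a kilometre of the 792.883 km figure quoted above; the small gap from the Vincenty result reflects the spherical versus ellipsoidal Earth models.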

How long does it take to fly from Sakon Nakhon to Zhanjiang?

The estimated flight time from Sakon Nakhon Airport to Zhanjiang Airport is 1 hour and 25 minutes.

Flight carbon footprint between Sakon Nakhon Airport (SNO) and Zhanjiang Airport (ZHA)

On average, flying from Sakon Nakhon to Zhanjiang generates about 98 kg (roughly 215 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Sakon Nakhon to Zhanjiang

See the map of the shortest flight path between Sakon Nakhon Airport (SNO) and Zhanjiang Airport (ZHA).

Airport information

Origin Sakon Nakhon Airport
City: Sakon Nakhon
Country: Thailand
IATA Code: SNO
ICAO Code: VTUI
Coordinates: 17°11′42″N, 104°7′8″E
Destination Zhanjiang Airport
City: Zhanjiang
Country: China
IATA Code: ZHA
ICAO Code: ZGZJ
Coordinates: 21°12′51″N, 110°21′28″E