
How far is Long Bawan from Nakhon Si Thammarat?

The distance between Nakhon Si Thammarat (Nakhon Si Thammarat Airport) and Long Bawan (Juvai Semaring Airport) is 1129 miles / 1816 kilometers / 981 nautical miles.

Nakhon Si Thammarat Airport – Juvai Semaring Airport

  • 1129 miles
  • 1816 kilometers
  • 981 nautical miles


Distance from Nakhon Si Thammarat to Long Bawan

There are several ways to calculate the distance from Nakhon Si Thammarat to Long Bawan. Here are two standard methods:

Vincenty's formula (applied above)
  • 1128.586 miles
  • 1816.284 kilometers
  • 980.715 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
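
As a rough illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are decimal-degree conversions of the airport listings below, and the convergence tolerance and iteration limit are arbitrary choices, so the result may differ slightly from the figure quoted above.

    import math

    A = 6378137.0             # WGS-84 semi-major axis (metres)
    F = 1 / 298.257223563     # WGS-84 flattening
    B = (1 - F) * A           # semi-minor axis

    def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Ellipsoidal distance in kilometres (Vincenty's inverse formula)."""
        U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))   # reduced latitudes
        U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0                                       # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha  # safe: route is not equatorial
            C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * F * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
        big_a = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        big_b = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4 * (
            cos_sigma * (2 * cos_2sm ** 2 - 1) -
            big_b / 6 * cos_2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
        return B * big_a * (sigma - d_sigma) / 1000

    # NST (8°32′22″N, 99°56′40″E) -> LBW (3°52′1″N, 115°40′58″E) in decimal degrees
    print(round(vincenty_km(8.5394, 99.9444, 3.8669, 115.6828), 3), "km")   # ≈ 1816 km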

Haversine formula
  • 1127.888 miles
  • 1815.160 kilometers
  • 980.108 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
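
For comparison, a minimal Python sketch of the haversine formula, assuming a mean Earth radius of 6,371 km (the exact radius used by the calculator above isn't stated):

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a spherical Earth (mean radius assumed)."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
        return 2 * radius_km * math.asin(math.sqrt(a))

    # NST to LBW, coordinates converted from the airport listings below
    km = haversine_km(8.5394, 99.9444, 3.8669, 115.6828)
    print(round(km, 3), "km")                         # ≈ 1815 km
    print(round(km * 0.621371, 3), "miles")
    print(round(km * 0.539957, 3), "nautical miles")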

How long does it take to fly from Nakhon Si Thammarat to Long Bawan?

The estimated flight time from Nakhon Si Thammarat Airport to Juvai Semaring Airport is 2 hours and 38 minutes.
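
The assumptions behind that estimate aren't published. A common back-of-the-envelope approach is to divide the distance by a typical average speed and add a fixed allowance for taxi, climb, and descent; the cruise speed and overhead below are illustrative guesses, so the sketch lands close to, but not exactly on, the 2 hours 38 minutes quoted above.

    def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        """Rough estimate: cruise time at an assumed average speed plus a fixed overhead."""
        total_min = distance_miles / cruise_mph * 60 + overhead_min
        hours, minutes = divmod(round(total_min), 60)
        return f"{hours} h {minutes} min"

    # cruise_mph and overhead_min are illustrative guesses, not the calculator's actual inputs
    print(estimated_flight_time(1129))    # ≈ 2 h 45 min with these assumptions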

Flight carbon footprint between Nakhon Si Thammarat Airport (NST) and Juvai Semaring Airport (LBW)

On average, flying from Nakhon Si Thammarat to Long Bawan generates about 158 kg of CO2 per passenger, which is equivalent to 349 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
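
The emissions factor used by the calculator isn't published; the sketch below simply scales the distance by an assumed per-passenger-kilometre factor chosen to land near the figure above.

    KG_CO2_PER_PAX_KM = 0.087     # assumed factor, back-solved from the figures above

    def flight_co2_kg(distance_km, factor=KG_CO2_PER_PAX_KM):
        """Per-passenger CO2 from burning jet fuel, assuming a flat emissions factor."""
        return distance_km * factor

    kg = flight_co2_kg(1816)
    print(round(kg), "kg CO2")                # ≈ 158 kg with this assumed factor
    print(round(kg * 2.20462), "lbs CO2")     # small rounding differences vs. the 349 lbs above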

Map of flight path from Nakhon Si Thammarat to Long Bawan

See the map of the shortest flight path between Nakhon Si Thammarat Airport (NST) and Juvai Semaring Airport (LBW).

Airport information

Origin: Nakhon Si Thammarat Airport
City: Nakhon Si Thammarat
Country: Thailand
IATA Code: NST
ICAO Code: VTSF
Coordinates: 8°32′22″N, 99°56′40″E
Destination: Juvai Semaring Airport
City: Long Bawan
Country: Indonesia
IATA Code: LBW
ICAO Code: WRLB
Coordinates: 3°52′1″N, 115°40′58″E
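
The coordinates above are given in degrees, minutes and seconds; the decimal-degree values used in the code sketches earlier can be obtained with a small helper like this:

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
        value = deg + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    # NST: 8°32′22″N, 99°56′40″E   LBW: 3°52′1″N, 115°40′58″E
    nst = (dms_to_decimal(8, 32, 22, "N"), dms_to_decimal(99, 56, 40, "E"))
    lbw = (dms_to_decimal(3, 52, 1, "N"), dms_to_decimal(115, 40, 58, "E"))
    print(nst, lbw)   # ≈ (8.5394, 99.9444) and (3.8669, 115.6828)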