
How far is Bijie from Sakon Nakhon?

The distance between Sakon Nakhon (Sakon Nakhon Airport) and Bijie (Bijie Feixiong Airport) is 698 miles / 1124 kilometers / 607 nautical miles.

The driving distance from Sakon Nakhon (SNO) to Bijie (BFJ) is 1059 miles / 1705 kilometers, and travel time by car is about 20 hours 49 minutes.

Sakon Nakhon Airport – Bijie Feixiong Airport

  • 698 miles
  • 1124 kilometers
  • 607 nautical miles


Distance from Sakon Nakhon to Bijie

There are several ways to calculate the distance from Sakon Nakhon to Bijie. Here are two standard methods:

Vincenty's formula (applied above)
  • 698.409 miles
  • 1123.980 kilometers
  • 606.901 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
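For reference, here is a sketch of Vincenty's inverse method in Python, using the standard textbook formulation on the WGS-84 ellipsoid (the function name and iteration limit are our choices, and the airport coordinates are taken from the table below):

```python
import math

def vincenty_inverse_m(lat1, lon1, lat2, lon2):
    """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0                      # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563              # WGS-84 flattening
    b = (1 - f) * a                    # semi-minor axis (m)
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):               # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                 # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# SNO 17°11′42″N 104°7′8″E → BFJ 27°16′1″N 105°28′19″E (decimal degrees)
dist_m = vincenty_inverse_m(17 + 11 / 60 + 42 / 3600, 104 + 7 / 60 + 8 / 3600,
                            27 + 16 / 60 + 1 / 3600, 105 + 28 / 60 + 19 / 3600)
print(f"{dist_m / 1000:.3f} km")
```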

Haversine formula
  • 701.246 miles
  • 1128.546 kilometers
  • 609.366 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
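As a concrete illustration, the haversine figure above can be reproduced from the airport coordinates listed below, using a mean Earth radius of 6371 km (the function name is ours):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points, spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# SNO 17°11′42″N 104°7′8″E → BFJ 27°16′1″N 105°28′19″E (decimal degrees)
sno = (17 + 11 / 60 + 42 / 3600, 104 + 7 / 60 + 8 / 3600)
bfj = (27 + 16 / 60 + 1 / 3600, 105 + 28 / 60 + 19 / 3600)
d_km = haversine_km(*sno, *bfj)
print(f"{d_km:.1f} km")  # ≈ 1128.5 km, matching the figure above
```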

How long does it take to fly from Sakon Nakhon to Bijie?

The estimated flight time from Sakon Nakhon Airport to Bijie Feixiong Airport is 1 hour and 49 minutes.
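The page does not state its formula, but a common rough model divides the great-circle distance by an assumed average cruise speed and adds a fixed allowance for taxi, climb, and descent (both parameters below are assumptions, not the site's exact method), which lands in the same ballpark as the quoted figure:

```python
CRUISE_MPH = 500          # assumed average block speed for a short-haul jet
OVERHEAD_MIN = 30         # assumed allowance for taxi, climb, and descent
distance_mi = 698.409     # Vincenty distance from above

total_min = distance_mi / CRUISE_MPH * 60 + OVERHEAD_MIN
print(f"{int(total_min // 60)} h {round(total_min % 60)} min")  # ≈ 1 h 54 min
```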

Flight carbon footprint between Sakon Nakhon Airport (SNO) and Bijie Feixiong Airport (BFJ)

On average, flying from Sakon Nakhon to Bijie generates about 124 kg of CO2 per passenger, roughly 274 pounds (lb). These figures are estimates and include only the CO2 generated by burning jet fuel.
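For the unit conversion, 1 kg ≈ 2.20462 lb. Converting the rounded 124 kg gives about 273 lb, so the quoted pound figure presumably comes from the unrounded per-passenger estimate:

```python
KG_TO_LB = 2.20462262     # pounds per kilogram
co2_kg = 124              # rounded per-passenger estimate from above
co2_lb = co2_kg * KG_TO_LB
print(f"{co2_lb:.1f} lb")  # ≈ 273.4 lb
```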

Map of flight path and driving directions from Sakon Nakhon to Bijie

See the map of the shortest flight path between Sakon Nakhon Airport (SNO) and Bijie Feixiong Airport (BFJ).

Airport information

Origin: Sakon Nakhon Airport
City: Sakon Nakhon
Country: Thailand
IATA Code: SNO
ICAO Code: VTUI
Coordinates: 17°11′42″N, 104°7′8″E
Destination: Bijie Feixiong Airport
City: Bijie
Country: China
IATA Code: BFJ
ICAO Code: ZUBJ
Coordinates: 27°16′1″N, 105°28′19″E