
How far is Senai from Nashville, TN?

The distance between Nashville (Nashville International Airport) and Senai (Senai International Airport) is 9749 miles / 15689 kilometers / 8471 nautical miles.

Nashville International Airport – Senai International Airport

  • Distance: 9749 miles / 15689 kilometers / 8471 nautical miles
  • Flight time: 18 h 57 min
  • CO2 emission: 1 262 kg


Distance from Nashville to Senai

There are several ways to calculate the distance from Nashville to Senai. Here are two standard methods:

Vincenty's formula (applied above)
  • 9748.758 miles
  • 15689.105 kilometers
  • 8471.439 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
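If you want to reproduce the figure yourself, the sketch below is a minimal Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed further down the page. It is an illustration rather than the calculator's own code, so the last decimals may differ slightly depending on how the coordinates are rounded.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in statute miles."""
    a = 6378137.0                 # WGS-84 semi-major axis (metres)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                  if cos_sq_alpha else 0.0)
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (2 * cos2sm ** 2 - 1)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (2 * cos2sm ** 2 - 1)
        - B / 6 * cos2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos2sm ** 2 - 3)))
    metres = b * A * (sigma - delta_sigma)
    return metres / 1609.344      # metres per statute mile

# BNA and JHB coordinates from the airport information below (decimal degrees)
print(round(vincenty_miles(36.1244, -86.6781, 1.6411, 103.6697), 1))
# Expect a value close to the 9748.758 miles quoted above.
```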

Haversine formula
  • 9743.692 miles
  • 15680.953 kilometers
  • 8467.037 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (the great-circle distance, i.e. the shortest distance between two points along the surface of the sphere).
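For comparison, here is a short Python sketch of the haversine formula using the commonly quoted mean earth radius of 6 371 km. The coordinates again come from the airport information below; the result should land within a few kilometres of the figure above, with the exact value depending on the radius constant chosen.

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean earth radius; the calculator's exact constant is not stated

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points given in decimal degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# BNA and JHB coordinates from the airport information below (decimal degrees)
print(round(haversine_km(36.1244, -86.6781, 1.6411, 103.6697), 1))
# Expect a value within a few kilometres of the 15680.953 km quoted above.
```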

How long does it take to fly from Nashville to Senai?

The estimated flight time from Nashville International Airport to Senai International Airport is 18 hours and 57 minutes.
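The page does not say how this estimate is derived, but dividing the quoted distance by the quoted time gives the average speed the calculator implicitly assumes, which makes a useful sanity check. The snippet below is plain arithmetic on the figures above, not the site's own model.

```python
# Implied average speed from the quoted figures (a sanity check only).
distance_miles = 9749
flight_time_hours = 18 + 57 / 60            # 18 h 57 min

avg_speed_mph = distance_miles / flight_time_hours
print(f"Implied average speed: {avg_speed_mph:.0f} mph")   # about 514 mph
```

An implied average in the low 500s mph is consistent with typical long-haul cruise speeds once climb and descent are averaged in.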

Flight carbon footprint between Nashville International Airport (BNA) and Senai International Airport (JHB)

On average, flying from Nashville to Senai generates about 1 262 kg of CO2 per passenger, which is roughly 2 783 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
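As a quick unit check, the pound figure follows from the standard conversion factor of about 2.20462 lb per kg; converting the rounded 1 262 kg directly gives a slightly lower number, so the page presumably converts the unrounded kilogram value.

```python
# Kilogram-to-pound conversion for the quoted CO2 estimate.
co2_kg = 1262
KG_TO_LB = 2.20462

print(round(co2_kg * KG_TO_LB))   # 2782, close to the 2 783 lb quoted above
```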

Map of flight path from Nashville to Senai

See the map of the shortest flight path between Nashville International Airport (BNA) and Senai International Airport (JHB).

Airport information

Origin: Nashville International Airport
City: Nashville, TN
Country: United States
IATA Code: BNA
ICAO Code: KBNA
Coordinates: 36°7′28″N, 86°40′41″W
Destination: Senai International Airport
City: Senai
Country: Malaysia
IATA Code: JHB
ICAO Code: WMKJ
Coordinates: 1°38′28″N, 103°40′11″E