How far is Jambi from Nairobi?
The distance between Nairobi (Jomo Kenyatta International Airport) and Jambi (Sultan Thaha Syaifuddin Airport) is 4613 miles / 7424 kilometers / 4009 nautical miles.
Jomo Kenyatta International Airport – Sultan Thaha Syaifuddin Airport
Distance from Nairobi to Jambi
There are several ways to calculate the distance from Nairobi to Jambi. Here are two standard methods:
Vincenty's formula (applied above)
- 4613.126 miles
- 7424.107 kilometers
- 4008.697 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
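For readers who want to reproduce the figure, here is a minimal sketch of Vincenty's inverse formula on the WGS-84 ellipsoid (Python; the function name, tolerance, and iteration cap are illustrative choices, not taken from this page):

```python
# Minimal sketch of Vincenty's inverse formula on the WGS-84 ellipsoid.
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance in kilometres between two lat/lon points given in degrees."""
    a = 6378137.0              # WGS-84 semi-major axis (metres)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); zero for geodesics along the equator (cos2_alpha == 0)
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0   # metres -> kilometres

# NBO and DJB in decimal degrees (converted from the airport table below)
print(vincenty_distance(-1.3192, 36.9278, -1.6378, 103.6439))  # ~7424 km
```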
Haversine formula
- 4607.961 miles
- 7415.795 kilometers
- 4004.209 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
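The spherical calculation is much shorter; a sketch, assuming the commonly used mean earth radius of 6,371 km:

```python
# Minimal sketch of the haversine (great-circle) formula on a spherical earth.
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres; radius_km is the assumed mean earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

print(haversine_distance(-1.3192, 36.9278, -1.6378, 103.6439))  # roughly 7416 km
```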
How long does it take to fly from Nairobi to Jambi?
The estimated flight time from Jomo Kenyatta International Airport to Sultan Thaha Syaifuddin Airport is 9 hours and 14 minutes.
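Such estimates are typically based on an average cruise speed; a sketch assuming roughly 500 mph, which reproduces the 9 hour 14 minute figure (the speed is an assumption, not stated on this page):

```python
# Rough flight-time estimate: distance divided by an assumed average cruise speed.
distance_miles = 4613.126
avg_speed_mph = 500          # assumed average speed, not taken from this page
hours = distance_miles / avg_speed_mph
print(f"{int(hours)} h {round((hours % 1) * 60)} min")  # -> 9 h 14 min
```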
What is the time difference between Nairobi and Jambi?
The time difference between Nairobi and Jambi is 4 hours. Jambi is 4 hours ahead of Nairobi.
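The offset can be checked against the IANA time-zone database; a sketch, assuming Africa/Nairobi (EAT, UTC+3) and Asia/Jakarta (WIB, UTC+7, the zone covering Jambi):

```python
# Compare the UTC offsets of the two cities using the IANA time-zone database.
from datetime import datetime
from zoneinfo import ZoneInfo

now = datetime.now(ZoneInfo("UTC"))
nairobi_offset = now.astimezone(ZoneInfo("Africa/Nairobi")).utcoffset()
jambi_offset = now.astimezone(ZoneInfo("Asia/Jakarta")).utcoffset()  # Jambi is on WIB
print((jambi_offset - nairobi_offset).total_seconds() / 3600)  # -> 4.0
```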
Flight carbon footprint between Jomo Kenyatta International Airport (NBO) and Sultan Thaha Syaifuddin Airport (DJB)
On average, flying from Nairobi to Jambi generates about 534 kg of CO2 per passenger, which is equivalent to roughly 1,177 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
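The pounds figure follows directly from the unit conversion (the 534 kg estimate itself is taken from the page, not derived here):

```python
# Convert the quoted CO2 estimate from kilograms to pounds.
co2_kg = 534
KG_TO_LB = 2.20462
print(round(co2_kg * KG_TO_LB))  # -> 1177 lb
```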
Map of flight path from Nairobi to Jambi
See the map of the shortest flight path between Jomo Kenyatta International Airport (NBO) and Sultan Thaha Syaifuddin Airport (DJB).
Airport information
| Origin | Jomo Kenyatta International Airport |
| --- | --- |
| City: | Nairobi |
| Country: | Kenya |
| IATA Code: | NBO |
| ICAO Code: | HKJK |
| Coordinates: | 1°19′9″S, 36°55′40″E |
| Destination | Sultan Thaha Syaifuddin Airport |
| --- | --- |
| City: | Jambi |
| Country: | Indonesia |
| IATA Code: | DJB |
| ICAO Code: | WIPA |
| Coordinates: | 1°38′16″S, 103°38′38″E |
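The coordinates above are given in degrees, minutes, and seconds; a small sketch converting them to the decimal degrees used by the distance formulas earlier on this page (the parsing format is an assumption):

```python
# Convert a degrees/minutes/seconds coordinate such as 1°19′9″S to decimal degrees.
import re

def dms_to_decimal(dms: str) -> float:
    deg, minute, sec, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    return -value if hemi in "SW" else value  # south and west are negative

print(dms_to_decimal("1°19′9″S"), dms_to_decimal("36°55′40″E"))    # NBO ≈ -1.3192, 36.9278
print(dms_to_decimal("1°38′16″S"), dms_to_decimal("103°38′38″E"))  # DJB ≈ -1.6378, 103.6439
```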