How far is Jambi from Chiclayo?
The distance between Chiclayo (Chiclayo International Airport) and Jambi (Sultan Thaha Syaifuddin Airport) is 11806 miles / 19000 kilometers / 10259 nautical miles.
Chiclayo International Airport – Sultan Thaha Syaifuddin Airport
Distance from Chiclayo to Jambi
There are several ways to calculate the distance from Chiclayo to Jambi. Here are two standard methods:
Vincenty's formula (applied above)
- 11806.159 miles
- 19000.171 kilometers
- 10259.272 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
Haversine formula
- 11807.351 miles
- 19002.089 kilometers
- 10260.307 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance – the shortest path between the two points along the surface.
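The haversine calculation above can be sketched in a few lines of Python. This uses the airport coordinates from the Airport information section and an assumed mean earth radius of 6,371 km (the page does not state which radius it uses, so the result may differ from the figure above by a fraction of a mile):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in statute miles, assuming a spherical earth."""
    R_KM = 6371.0  # assumed mean earth radius in kilometers
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    c = 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))
    return R_KM * c / 1.609344  # convert km to statute miles

# Airport coordinates in decimal degrees (south and west are negative):
cix = (-(6 + 47/60 + 14/3600), -(79 + 49/60 + 41/3600))   # Chiclayo (CIX)
djb = (-(1 + 38/60 + 16/3600),  (103 + 38/60 + 38/3600))  # Jambi (DJB)

print(round(haversine_miles(*cix, *djb), 3))
```

The result lands within a mile or two of the 11807.351-mile figure above; the small gap comes from rounding in the coordinates and the choice of earth radius.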
How long does it take to fly from Chiclayo to Jambi?
The estimated flight time from Chiclayo International Airport to Sultan Thaha Syaifuddin Airport is 22 hours and 51 minutes.
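An estimate like this is just distance divided by an assumed average speed. The page does not state the constant it uses; a sketch with an assumed average speed of about 517 mph comes out close to the 22 h 51 min figure:

```python
# Rough flight-time estimate: distance / assumed average speed.
# 517 mph is an assumption for illustration; the page's own constant is not stated.
distance_miles = 11806.159   # Vincenty distance from above
avg_speed_mph = 517          # assumed average speed, including cruise
hours_total = distance_miles / avg_speed_mph
hours = int(hours_total)
minutes = round((hours_total - hours) * 60)
print(f"{hours} h {minutes} min")
```

Such estimates ignore taxiing, climb, descent, and winds, so actual block times vary.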
What is the time difference between Chiclayo and Jambi?
The time difference between Chiclayo and Jambi is 12 hours. Jambi is 12 hours ahead of Chiclayo.
Flight carbon footprint between Chiclayo International Airport (CIX) and Sultan Thaha Syaifuddin Airport (DJB)
On average, flying from Chiclayo to Jambi generates about 1,594 kg (3,514 lbs) of CO2 per passenger. The figures are estimates and include only the CO2 generated by burning jet fuel.
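The pounds figure is a straight unit conversion from the 1,594 kg estimate, at roughly 2.20462 lbs per kilogram:

```python
# Convert the per-passenger CO2 estimate from kilograms to pounds.
LBS_PER_KG = 2.20462
co2_kg = 1594
co2_lbs = co2_kg * LBS_PER_KG
print(round(co2_lbs))  # → 3514
```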
Map of flight path from Chiclayo to Jambi
See the map of the shortest flight path between Chiclayo International Airport (CIX) and Sultan Thaha Syaifuddin Airport (DJB).
Airport information
| Origin | Chiclayo International Airport |
| --- | --- |
| City: | Chiclayo |
| Country: | Peru |
| IATA Code: | CIX |
| ICAO Code: | SPHI |
| Coordinates: | 6°47′14″S, 79°49′41″W |
| Destination | Sultan Thaha Syaifuddin Airport |
| --- | --- |
| City: | Jambi |
| Country: | Indonesia |
| IATA Code: | DJB |
| ICAO Code: | WIPA |
| Coordinates: | 1°38′16″S, 103°38′38″E |
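The coordinates above are given in degrees, minutes, and seconds. To use them in distance formulas they must be converted to signed decimal degrees (south and west negative). A small parser, assuming the exact `D°M′S″H` format used in these tables:

```python
import re

def dms_to_decimal(s):
    """Parse a coordinate like '6°47′14″S' into signed decimal degrees."""
    deg, mins, secs, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", s).groups()
    value = int(deg) + int(mins) / 60 + int(secs) / 3600
    return -value if hemi in "SW" else value  # S and W hemispheres are negative

print(round(dms_to_decimal("6°47′14″S"), 6))    # Chiclayo latitude  → -6.787222
print(round(dms_to_decimal("103°38′38″E"), 6))  # Jambi longitude    → 103.643889
```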