How far is Jambi from Hong Kong?
The distance between Hong Kong (Hong Kong International Airport) and Jambi (Sultan Thaha Syaifuddin Airport) is 1786 miles / 2875 kilometers / 1552 nautical miles.
Hong Kong International Airport – Sultan Thaha Syaifuddin Airport
Distance from Hong Kong to Jambi
There are several ways to calculate the distance from Hong Kong to Jambi. Here are two standard methods:
Vincenty's formula (applied above)
- 1786.158 miles
- 2874.543 kilometers
- 1552.129 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
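For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed in the tables below. The ellipsoid constants, convergence tolerance, and iteration cap are assumptions for illustration and may not match the calculator used above exactly.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0            # semi-major axis (m), WGS-84
    f = 1 / 298.257223563    # flattening, WGS-84
    b = (1 - f) * a          # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha) if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2) -
        B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# HKG and DJB coordinates from the airport tables below (degrees)
hkg = (22 + 18/60 + 32/3600, 113 + 54/60 + 54/3600)
djb = (-(1 + 38/60 + 16/3600), 103 + 38/60 + 38/3600)
d_m = vincenty_distance(*hkg, *djb)
print(f"{d_m / 1609.344:.1f} mi, {d_m / 1000:.1f} km, {d_m / 1852:.1f} NM")
```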
Haversine formula
- 1793.639 miles
- 2886.582 kilometers
- 1558.630 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
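A short Python sketch of the haversine formula, assuming a mean Earth radius of 6371 km (the radius used by the calculator above is not stated):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Same HKG / DJB coordinates as above, in decimal degrees
d_km = haversine_distance(22.3089, 113.9150, -1.6378, 103.6439)
print(f"{d_km:.1f} km ({d_km / 1.609344:.1f} mi, {d_km / 1.852:.1f} NM)")
```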
How long does it take to fly from Hong Kong to Jambi?
The estimated flight time from Hong Kong International Airport to Sultan Thaha Syaifuddin Airport is 3 hours and 52 minutes.
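The exact model behind this estimate is not stated. A common back-of-envelope approach, shown here purely as an illustrative assumption, adds a fixed taxi/climb/descent allowance to cruise time at an assumed average speed:

```python
def estimate_flight_time(distance_miles, cruise_mph=500, buffer_minutes=30):
    """Rough flight-time model: fixed buffer plus distance at an assumed
    average cruise speed. Both parameters are illustrative assumptions."""
    total_minutes = buffer_minutes + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_minutes), 60)
    return f"{hours} h {minutes:02d} min"

# About 4 h 04 min with these assumed parameters, in the same ballpark as
# the 3 h 52 min quoted above.
print(estimate_flight_time(1786.158))
```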
What is the time difference between Hong Kong and Jambi?
The time difference between Hong Kong and Jambi is 1 hour. Jambi is 1 hour behind Hong Kong.
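The one-hour offset can be checked with Python's zoneinfo. Jambi observes Western Indonesia Time, which the tz database exposes as Asia/Jakarta; the zone names are the only assumption here.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

now = datetime.now(timezone.utc)
hk = now.astimezone(ZoneInfo("Asia/Hong_Kong"))    # UTC+8
jambi = now.astimezone(ZoneInfo("Asia/Jakarta"))   # UTC+7 (Western Indonesia Time)

diff_hours = (hk.utcoffset() - jambi.utcoffset()).total_seconds() / 3600
print(f"Hong Kong: {hk:%H:%M}, Jambi: {jambi:%H:%M}, difference: {diff_hours:+.0f} h")
```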
Flight carbon footprint between Hong Kong International Airport (HKG) and Sultan Thaha Syaifuddin Airport (DJB)
On average, flying from Hong Kong to Jambi generates about 199 kg (439 lbs) of CO2 per passenger. The figures are estimates and include only the CO2 generated by burning jet fuel.
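The unit conversion and the per-kilometre intensity implied by these figures can be checked directly; the site's actual emission model is not given, so the factor below is only what the stated numbers imply:

```python
KG_TO_LB = 2.20462

distance_km = 2874.5   # Vincenty distance from above
co2_kg = 199           # per-passenger estimate quoted above

# Implied per-passenger emission factor (illustration only; the model
# behind the 199 kg figure is not described on this page).
factor_kg_per_km = co2_kg / distance_km
print(f"~{factor_kg_per_km:.3f} kg CO2 per passenger-km")
print(f"{co2_kg} kg = {co2_kg * KG_TO_LB:.0f} lbs")
```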
Map of flight path from Hong Kong to Jambi
See the map of the shortest flight path between Hong Kong International Airport (HKG) and Sultan Thaha Syaifuddin Airport (DJB).
Airport information
| Origin | Hong Kong International Airport |
| --- | --- |
| City: | Hong Kong |
| Country: | Hong Kong |
| IATA Code: | HKG |
| ICAO Code: | VHHH |
| Coordinates: | 22°18′32″N, 113°54′54″E |
| Destination | Sultan Thaha Syaifuddin Airport |
| --- | --- |
| City: | Jambi |
| Country: | Indonesia |
| IATA Code: | DJB |
| ICAO Code: | WIPA |
| Coordinates: | 1°38′16″S, 103°38′38″E |