How far is Singapore from Tanjung Pandan?
The distance between Tanjung Pandan (H.A.S. Hanandjoeddin International Airport) and Singapore (Singapore Changi Airport) is 383 miles / 617 kilometers / 333 nautical miles.
H.A.S. Hanandjoeddin International Airport – Singapore Changi Airport
Distance from Tanjung Pandan to Singapore
There are several ways to calculate the distance from Tanjung Pandan to Singapore. Here are two standard methods:
Vincenty's formula (applied above)
- 383.195 miles
- 616.692 kilometers
- 332.987 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
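As a rough sketch of how such a figure is computed, here is the standard iterative inverse Vincenty method on the WGS-84 ellipsoid. This is an illustration, not necessarily the exact implementation used for the number above; the convergence tolerance and iteration cap are assumptions.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in kilometers.

    Coordinates are decimal degrees. The iteration can fail to converge
    for nearly antipodal points, in which case a ValueError is raised.
    """
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (meters)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); cos2_alpha == 0 on equatorial lines
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    else:
        raise ValueError("Vincenty iteration failed to converge")

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# TJQ -> SIN, using the coordinates from the airport tables below
print(round(vincenty_km(-2.745556, 107.754722, 1.35, 103.993889), 3))
```

Run with the two airports' coordinates, this reproduces the roughly 616.7 km figure quoted above.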
Haversine formula
- 384.158 miles
- 618.243 kilometers
- 333.824 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
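The haversine calculation is much simpler and fits in a few lines. The sketch below assumes a mean Earth radius of 6371 km; a slightly different radius (or slightly different airport coordinates) explains small deviations from the figure above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere (haversine formula).

    Coordinates are decimal degrees; the 6371 km mean Earth radius is an
    assumption -- other conventional radii give slightly different results.
    """
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# TJQ -> SIN, using the coordinates from the airport tables below
print(round(haversine_km(-2.745556, 107.754722, 1.35, 103.993889), 1))  # ≈ 618 km
```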
How long does it take to fly from Tanjung Pandan to Singapore?
The estimated flight time from H.A.S. Hanandjoeddin International Airport to Singapore Changi Airport is 1 hour and 13 minutes.
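Such estimates typically follow a rule of thumb: cruise distance divided by an assumed average speed, plus a fixed allowance for taxi, climb, and descent. The 500 mph speed and 30-minute overhead below are illustrative assumptions, not the site's exact model, so the result differs slightly from the 1 h 13 min quoted above.

```python
def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent
    overhead. Both parameters are assumptions for illustration."""
    return overhead_min + 60 * distance_miles / cruise_mph

minutes = flight_time_minutes(383.195)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # roughly 1 h 16 min
```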
What is the time difference between Tanjung Pandan and Singapore?
Singapore observes Singapore Standard Time (UTC+8), while Tanjung Pandan observes Western Indonesia Time (UTC+7), so Singapore is 1 hour ahead of Tanjung Pandan.
Flight carbon footprint between H.A.S. Hanandjoeddin International Airport (TJQ) and Singapore Changi Airport (SIN)
On average, flying from Tanjung Pandan to Singapore generates about 81 kg of CO2 per passenger, which is roughly 179 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
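The pound conversion is a simple check using the exact definition of the pound (1 lb = 0.45359237 kg):

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

print(round(kg_to_lb(81)))  # 179
```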
Map of flight path from Tanjung Pandan to Singapore
See the map of the shortest flight path between H.A.S. Hanandjoeddin International Airport (TJQ) and Singapore Changi Airport (SIN).
Airport information
| Origin | H.A.S. Hanandjoeddin International Airport |
| --- | --- |
| City | Tanjung Pandan |
| Country | Indonesia |
| IATA Code | TJQ |
| ICAO Code | WIOD |
| Coordinates | 2°44′44″S, 107°45′17″E |
| Destination | Singapore Changi Airport |
| --- | --- |
| City | Singapore |
| Country | Singapore |
| IATA Code | SIN |
| ICAO Code | WSSS |
| Coordinates | 1°21′0″N, 103°59′38″E |
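The distance formulas above take decimal degrees, while the tables list coordinates in degrees/minutes/seconds. A small parser bridges the two; the regular expression below assumes the exact `D°M′S″H` notation used in these tables.

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Parse a coordinate like 2°44′44″S into signed decimal degrees.

    South and West hemispheres are negative, matching the usual convention.
    """
    m = re.fullmatch(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms.strip())
    if not m:
        raise ValueError(f"unrecognized DMS string: {dms!r}")
    deg, minutes, seconds, hemi = m.groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

# Coordinates from the tables above
print(round(dms_to_decimal("2°44′44″S"), 6))    # -2.745556
print(round(dms_to_decimal("103°59′38″E"), 6))  # 103.993889
```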