
How far is Batam from Trincomalee?

The distance between Trincomalee (China Bay Airport) and Batam (Hang Nadim Airport) is 1660 miles / 2671 kilometers / 1442 nautical miles.

China Bay Airport – Hang Nadim Airport

Distance: 1660 miles / 2671 kilometers / 1442 nautical miles
Flight time: 3 h 38 min
Time difference: 1 h 30 min
CO2 emission: 190 kg


Distance from Trincomalee to Batam

There are several ways to calculate the distance from Trincomalee to Batam. Here are two standard methods:

Vincenty's formula (applied above)
  • 1659.963 miles
  • 2671.452 kilometers
  • 1442.469 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
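As a sketch, Vincenty's inverse method on the WGS-84 ellipsoid can be implemented as below. This is an illustrative implementation, not the calculator's own code; the coordinates in the usage line are the two airports' positions converted to decimal degrees.

```python
import math

def vincenty(lat1, lon1, lat2, lon2):
    """Geodesic distance in meters on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)

    # Reduced latitudes and longitude difference
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    # Iterate on the longitude difference on the auxiliary sphere
    lam = L
    for _ in range(200):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    # Final ellipsoidal correction terms
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# TRR (8°32′18″N, 81°10′54″E) to BTH (1°7′15″N, 104°7′8″E)
d_m = vincenty(8.538333, 81.181667, 1.120833, 104.118889)
print(f"{d_m / 1000:.3f} km")   # ≈ 2671 km
```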

Haversine formula
  • 1659.109 miles
  • 2670.077 kilometers
  • 1441.726 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
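The haversine calculation is much simpler, since it treats the earth as a sphere. A minimal sketch, assuming a mean earth radius of 6371 km:

```python
import math

def haversine(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance in kilometers on a sphere of radius r_km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r_km * math.asin(math.sqrt(a))

# TRR (8°32′18″N, 81°10′54″E) to BTH (1°7′15″N, 104°7′8″E)
print(f"{haversine(8.538333, 81.181667, 1.120833, 104.118889):.1f} km")  # ≈ 2670 km
```

The spherical assumption is why this result (2670.077 km) differs from the Vincenty figure (2671.452 km) by a little over a kilometer.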

How long does it take to fly from Trincomalee to Batam?

The estimated flight time from China Bay Airport to Hang Nadim Airport is 3 hours and 38 minutes.
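Such an estimate can be reproduced with a simple rule of thumb: distance divided by an assumed average block speed, plus a fixed allowance for taxi, climb and descent. The 460-knot speed and 30-minute allowance below are illustrative assumptions, not the calculator's published method:

```python
def flight_time(distance_nm, cruise_kts=460, overhead_h=0.5):
    """Rough block time: cruise at an average speed plus a fixed overhead."""
    hours = overhead_h + distance_nm / cruise_kts
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} h {m} min"

print(flight_time(1442))  # → 3 h 38 min
```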

Flight carbon footprint between China Bay Airport (TRR) and Hang Nadim Airport (BTH)

On average, flying from Trincomalee to Batam generates about 190 kg of CO2 per passenger (190 kilograms equals 419 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
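The unit conversion is straightforward, and dividing the total by the route distance gives a per-kilometer intensity. The conversion factor is standard; the intensity line is just derived from the page's own figures:

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 190
distance_km = 2671

print(f"{co2_kg} kg = {round(co2_kg * KG_TO_LB)} lb")        # → 190 kg = 419 lb
print(f"≈ {co2_kg / distance_km:.3f} kg CO2 per km flown")   # per-passenger intensity
```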

Map of flight path from Trincomalee to Batam

See the map of the shortest flight path between China Bay Airport (TRR) and Hang Nadim Airport (BTH).

Airport information

Origin: China Bay Airport
City: Trincomalee
Country: Sri Lanka
IATA Code: TRR
ICAO Code: VCCT
Coordinates: 8°32′18″N, 81°10′54″E
Destination: Hang Nadim Airport
City: Batam
Country: Indonesia
IATA Code: BTH
ICAO Code: WIDD
Coordinates: 1°7′15″N, 104°7′8″E
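The coordinates above are given in degrees, minutes and seconds; the distance formulas earlier on the page need decimal degrees. A small conversion sketch (the regex assumes the exact `°`/`′`/`″` notation used here):

```python
import re

def dms_to_decimal(dms):
    """Convert a coordinate like 8°32′18″N to signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    # South and west hemispheres are negative by convention
    return -value if hemi in "SW" else value

print(round(dms_to_decimal("8°32′18″N"), 6))   # → 8.538333
print(round(dms_to_decimal("81°10′54″E"), 6))  # → 81.181667
```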