Air Miles Calculator

How far is Long Bawan from Toronto?

The distance between Toronto (Toronto Pearson International Airport) and Long Bawan (Juvai Semaring Airport) is 9023 miles / 14521 kilometers / 7841 nautical miles.

Toronto Pearson International Airport – Juvai Semaring Airport

Distance: 9023 miles / 14521 kilometers / 7841 nautical miles
Flight time: 17 h 35 min
CO2 emission: 1 151 kg


Distance from Toronto to Long Bawan

There are several ways to calculate the distance from Toronto to Long Bawan. Here are two standard methods:

Vincenty's formula (applied above)
  • 9022.989 miles
  • 14521.094 kilometers
  • 7840.763 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
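The iteration behind this figure can be sketched in plain Python. This is a minimal implementation of Vincenty's inverse method on the WGS-84 ellipsoid; the coordinate values in the example call are the airport coordinates listed below, converted to decimal degrees.

```python
import math

# WGS-84 ellipsoid parameters
A_WGS84 = 6378137.0            # semi-major axis, metres
F_WGS84 = 1 / 298.257223563    # flattening

def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Distance in kilometres via Vincenty's inverse method (ellipsoidal earth)."""
    a, f = A_WGS84, F_WGS84
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                     # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); zero for equatorial geodesics (cos2_alpha == 0)
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0   # metres -> km

# YYZ and LBW coordinates from the airport information below (decimal degrees)
print(vincenty_km(43.67694, -79.63056, 3.86694, 115.68278))
```

The iteration converges quickly for routes like this one; it can fail to converge only for nearly antipodal point pairs.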

Haversine formula
  • 9016.445 miles
  • 14510.562 kilometers
  • 7835.077 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
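The haversine computation is much shorter. The sketch below assumes the commonly used mean earth radius of 6371 km; the example call uses the airport coordinates listed below in decimal degrees.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere (mean earth radius assumed)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)
    h = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# YYZ and LBW coordinates from the airport information below (decimal degrees)
print(haversine_km(43.67694, -79.63056, 3.86694, 115.68278))
```

The spherical result lands about 10 km short of the Vincenty figure on this route, which is typical of the two models' disagreement at intercontinental distances.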

How long does it take to fly from Toronto to Long Bawan?

The estimated flight time from Toronto Pearson International Airport to Juvai Semaring Airport is 17 hours and 35 minutes.
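The site's exact timing model isn't stated. A common rule of thumb divides the great-circle distance by a typical airliner cruise speed and adds a fixed allowance for taxi, climb, and descent; the cruise speed and overhead below are assumptions, not the site's parameters.

```python
def estimate_flight_time(distance_km, cruise_kmh=840.0, overhead_min=30):
    """Rough flight-time estimate: cruise time over the great-circle
    distance plus a fixed taxi/climb/descent allowance.
    Both parameters are assumptions, not the site's model."""
    total_min = round(overhead_min + distance_km / cruise_kmh * 60)
    hours, minutes = divmod(total_min, 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(14521))   # in the same ballpark as the quoted 17 h 35 min
```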

Flight carbon footprint between Toronto Pearson International Airport (YYZ) and Juvai Semaring Airport (LBW)

On average, flying from Toronto to Long Bawan generates about 1 151 kg (2 537 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
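As a sanity check on the kilograms-to-pounds figure, assuming the standard conversion factor 1 kg ≈ 2.20462 lb (the page appears to truncate rather than round):

```python
def kg_to_lbs(kg):
    """Convert kilograms to pounds, truncating to whole pounds
    (the page's 1 151 kg -> 2 537 lbs suggests truncation)."""
    return int(kg * 2.20462)

print(kg_to_lbs(1151))   # 2537
```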

Map of flight path from Toronto to Long Bawan

See the map of the shortest flight path between Toronto Pearson International Airport (YYZ) and Juvai Semaring Airport (LBW).

Airport information

Origin Toronto Pearson International Airport
City: Toronto
Country: Canada
IATA Code: YYZ
ICAO Code: CYYZ
Coordinates: 43°40′37″N, 79°37′50″W
Destination Juvai Semaring Airport
City: Long Bawan
Country: Indonesia
IATA Code: LBW
ICAO Code: WRLB
Coordinates: 3°52′1″N, 115°40′58″E