
How far is Bathurst from Cleveland, OH?

The distance between Cleveland (Cleveland Hopkins International Airport) and Bathurst (Bathurst Airport) is 9598 miles / 15446 kilometers / 8340 nautical miles.

Cleveland Hopkins International Airport – Bathurst Airport

Distance: 9598 miles / 15446 kilometers / 8340 nautical miles
Flight time: 18 h 40 min
CO2 emission: 1 239 kg


Distance from Cleveland to Bathurst

There are several ways to calculate the distance from Cleveland to Bathurst. Here are two standard methods:

Vincenty's formula (applied above)
  • 9597.715 miles
  • 15446.025 kilometers
  • 8340.186 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
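For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The site's exact implementation is not published, so the convergence tolerance, the iteration cap, and the decimal airport coordinates (converted from the DMS values listed under Airport information below) are illustrative choices.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    # WGS-84 ellipsoid parameters
    a = 6378137.0                # semi-major axis (meters)
    f = 1 / 298.257223563        # flattening
    b = (1 - f) * a

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):         # iterate until lambda converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0           # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
        * (-3 + 4 * cos2SigmaM ** 2)))
    meters = b * A * (sigma - deltaSigma)
    return meters / 1609.344     # meters -> statute miles

# CLE (41°24′42″N, 81°50′59″W) to BHS (33°24′33″S, 149°39′7″E)
print(vincenty_miles(41.411667, -81.849722, -33.409167, 149.651944))
```

Note that the iteration can fail to converge for nearly antipodal point pairs; CLE–BHS is far from that case, so the loop terminates quickly here.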

Haversine formula
  • 9599.243 miles
  • 15448.484 kilometers
  • 8341.514 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
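A minimal Python sketch of the haversine formula follows. The mean Earth radius used (6371.0088 km, the IUGG mean radius) is an assumption, since the page does not state which radius it uses, so the result may differ slightly from the figures above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    R = 6371.0088  # mean Earth radius in kilometers (assumed value)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # haversine of the central angle between the two points
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * R * math.asin(math.sqrt(h))

km = haversine_km(41.411667, -81.849722, -33.409167, 149.651944)
print(km, km / 1.609344, km / 1.852)  # kilometers, miles, nautical miles
```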

How long does it take to fly from Cleveland to Bathurst?

The estimated flight time from Cleveland Hopkins International Airport to Bathurst Airport is 18 hours and 40 minutes.
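The page does not state how the flight time is derived. One simple back-of-the-envelope approach divides the great-circle distance by an assumed average block speed; the ~514 mph figure below is reverse-engineered to reproduce the 18 h 40 min quoted above, not the site's documented method.

```python
# Hypothetical flight-time estimate: distance / assumed average block speed.
distance_miles = 9598
avg_speed_mph = 514                      # assumed average block speed
hours = distance_miles / avg_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} h {m} min")                  # -> 18 h 40 min
```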

Flight carbon footprint between Cleveland Hopkins International Airport (CLE) and Bathurst Airport (BHS)

On average, flying from Cleveland to Bathurst generates about 1 239 kg of CO2 per passenger; 1 239 kilograms is equal to 2 731 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
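As a quick sanity check on the unit conversion, using the standard factor of 2.20462 lb per kg (the implied per-mile rate is an illustrative derived figure, not one the page provides):

```python
co2_kg = 1239
co2_lbs = co2_kg * 2.20462         # kilograms -> pounds
per_mile = co2_kg / 9598           # implied kg CO2 per passenger-mile
print(int(co2_lbs), round(per_mile, 3))  # -> 2731 (truncated, as quoted), 0.129
```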

Map of flight path from Cleveland to Bathurst

See the map of the shortest flight path between Cleveland Hopkins International Airport (CLE) and Bathurst Airport (BHS).

Airport information

Origin: Cleveland Hopkins International Airport
City: Cleveland, OH
Country: United States
IATA Code: CLE
ICAO Code: KCLE
Coordinates: 41°24′42″N, 81°50′59″W
Destination: Bathurst Airport
City: Bathurst
Country: Australia
IATA Code: BHS
ICAO Code: YBTH
Coordinates: 33°24′33″S, 149°39′7″E