
How far is Lismore from Columbus, OH?

The distance between Columbus (John Glenn Columbus International Airport) and Lismore (Lismore Airport) is 9198 miles / 14803 kilometers / 7993 nautical miles.

John Glenn Columbus International Airport – Lismore Airport

Distance: 9198 miles / 14803 kilometers / 7993 nautical miles
Flight time: 17 h 54 min
CO2 emission: 1 177 kg


Distance from Columbus to Lismore

There are several ways to calculate the distance from Columbus to Lismore. Here are two standard methods:

Vincenty's formula (applied above)
  • 9197.976 miles
  • 14802.707 kilometers
  • 7992.822 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
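As a rough cross-check, here is a minimal sketch of an ellipsoidal-Earth calculation in Python using the third-party geopy library (an assumption; the page does not say how its Vincenty figure is computed). geopy's geodesic() uses Karney's method on the WGS-84 ellipsoid, which should agree with Vincenty's formula to well under a metre on a route like this, so the output should land very close to the 9197.976-mile figure above.

```python
# Ellipsoidal-Earth distance check with geopy (assumed third-party library).
from geopy.distance import geodesic

cmh = (39.99778, -82.89167)    # John Glenn Columbus Intl (39°59′52″N, 82°53′30″W)
lsy = (-28.83028, 153.25972)   # Lismore Airport (28°49′49″S, 153°15′35″E)

d = geodesic(cmh, lsy)         # WGS-84 ellipsoid by default
print(f"{d.miles:.3f} miles / {d.km:.3f} km / {d.nautical:.3f} NM")
```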

Haversine formula
  • 9198.951 miles
  • 14804.276 kilometers
  • 7993.670 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
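The haversine formula is short enough to write out directly. The sketch below assumes a mean Earth radius of 6371 km (the page does not state which radius it uses), so the result may differ from the quoted figures by a mile or so.

```python
from math import radians, sin, cos, atan2, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a spherical Earth (haversine formula)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return radius_km * 2 * atan2(sqrt(a), sqrt(1 - a))

km = haversine_km(39.99778, -82.89167, -28.83028, 153.25972)  # CMH -> LSY
print(f"{km:.1f} km  |  {km / 1.609344:.1f} mi  |  {km / 1.852:.1f} NM")
```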

How long does it take to fly from Columbus to Lismore?

The estimated flight time from John Glenn Columbus International Airport to Lismore Airport is 17 hours and 54 minutes.
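The page does not say how that figure is derived. A common back-of-the-envelope approach is distance divided by an assumed average speed, plus a fixed allowance for taxi, climb and descent; the speed and allowance below are purely illustrative assumptions and will not reproduce the quoted time exactly.

```python
# Hypothetical block-time estimate (illustrative assumptions, not the
# calculator's actual inputs).
DISTANCE_MI = 9198
CRUISE_MPH = 530        # assumed average ground speed
OVERHEAD_MIN = 30       # assumed allowance for taxi, climb and descent

total_min = DISTANCE_MI / CRUISE_MPH * 60 + OVERHEAD_MIN
hours, minutes = divmod(round(total_min), 60)
print(f"Estimated flight time: {hours} h {minutes} min")
```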

Flight carbon footprint between John Glenn Columbus International Airport (CMH) and Lismore Airport (LSY)

On average, flying from Columbus to Lismore generates about 1 177 kg of CO2 per passenger; 1 177 kilograms equals 2 596 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
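A quick unit check on those figures, using the exact kilogram-to-pound definition; the per-mile rate is derived here from the page's own numbers for illustration and is not stated by the source.

```python
# Unit check on the quoted CO2 figures.
CO2_KG = 1177
DISTANCE_MI = 9198
KG_PER_LB = 0.45359237          # exact definition of the pound

print(f"{CO2_KG} kg = {CO2_KG / KG_PER_LB:.0f} lbs")   # ~2595 lbs; the page's
# 2 596 lbs presumably comes from an unrounded CO2 estimate
print(f"Implied rate: {CO2_KG / DISTANCE_MI:.3f} kg CO2 per mile")  # ~0.128
```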

Map of flight path from Columbus to Lismore

See the map of the shortest flight path between John Glenn Columbus International Airport (CMH) and Lismore Airport (LSY).

Airport information

Origin: John Glenn Columbus International Airport
City: Columbus, OH
Country: United States
IATA Code: CMH
ICAO Code: KCMH
Coordinates: 39°59′52″N, 82°53′30″W
Destination: Lismore Airport
City: Lismore
Country: Australia
IATA Code: LSY
ICAO Code: YLIS
Coordinates: 28°49′49″S, 153°15′35″E
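To feed the coordinates above into the distance formulas earlier on the page, they first need to be converted from degrees/minutes/seconds to signed decimal degrees. A small sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

cmh = (dms_to_decimal(39, 59, 52, "N"), dms_to_decimal(82, 53, 30, "W"))
lsy = (dms_to_decimal(28, 49, 49, "S"), dms_to_decimal(153, 15, 35, "E"))
print(cmh)  # approx (39.9978, -82.8917)
print(lsy)  # approx (-28.8303, 153.2597)
```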