How far is Barcaldine from Lismore?
The distance between Lismore (Lismore Airport) and Barcaldine (Barcaldine Airport) is 612 miles / 985 kilometers / 532 nautical miles.
The driving distance from Lismore (LSY) to Barcaldine (BCI) is 829 miles / 1334 kilometers, and travel time by car is about 16 hours 54 minutes.
Lismore Airport – Barcaldine Airport
Distance from Lismore to Barcaldine
There are several ways to calculate the distance from Lismore to Barcaldine. Here are two standard methods:
Vincenty's formula (applied above)
- 612.344 miles
- 985.472 kilometers
- 532.112 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
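If you want to reproduce an ellipsoidal-model calculation yourself, the sketch below uses the third-party geopy library (an assumption: it is not something this page relies on, and it must be installed separately). geopy's geodesic distance uses Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's formula, but for a route like this it gives essentially the same result.

```python
# Minimal sketch, assuming the geopy package is installed (pip install geopy).
# geodesic() computes an ellipsoidal-earth distance (WGS-84, Karney's algorithm).
from geopy.distance import geodesic

lismore = (-28.830278, 153.259722)     # LSY: 28°49′49″S, 153°15′35″E
barcaldine = (-23.565278, 145.306944)  # BCI: 23°33′55″S, 145°18′25″E

d = geodesic(lismore, barcaldine)
print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nm:.3f} NM")
```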
Haversine formula
- 612.423 miles
- 985.600 kilometers
- 532.181 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
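For comparison, here is a self-contained haversine calculation in Python using the airport coordinates listed in the tables below. The mean earth radius of 6,371 km is an assumption (any spherical-earth calculation has to pick a radius); with it, the sketch reproduces the great-circle figures above to within a fraction of a kilometre.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance (haversine formula) on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# LSY: 28°49′49″S, 153°15′35″E    BCI: 23°33′55″S, 145°18′25″E
km = haversine_km(-28.830278, 153.259722, -23.565278, 145.306944)
print(f"{km:.1f} km / {km * 0.621371:.1f} mi / {km * 0.539957:.1f} NM")
```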
How long does it take to fly from Lismore to Barcaldine?
The estimated flight time from Lismore Airport to Barcaldine Airport is 1 hour and 39 minutes.
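As a rough cross-check (the page does not state its estimation method, so the average speed here is purely an assumption): covering about 612 miles at an average block speed of roughly 370 mph, which allows for climb, descent and slower segments, takes 612 ÷ 370 ≈ 1.65 hours, i.e. about 1 hour 39 minutes.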
What is the time difference between Lismore and Barcaldine?
The time difference between Lismore and Barcaldine is 1 hour: Barcaldine is 1 hour behind Lismore. (Lismore, in New South Wales, observes daylight saving time, while Barcaldine, in Queensland, does not, so the two towns share the same time outside the daylight saving period.)
Flight carbon footprint between Lismore Airport (LSY) and Barcaldine Airport (BCI)
On average, flying from Lismore to Barcaldine generates about 114 kg of CO2 per passenger (roughly 252 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
Map of flight path and driving directions from Lismore to Barcaldine
See the map of the shortest flight path between Lismore Airport (LSY) and Barcaldine Airport (BCI).
Airport information
| Origin | Lismore Airport |
|---|---|
| City: | Lismore |
| Country: | Australia |
| IATA Code: | LSY |
| ICAO Code: | YLIS |
| Coordinates: | 28°49′49″S, 153°15′35″E |

| Destination | Barcaldine Airport |
|---|---|
| City: | Barcaldine |
| Country: | Australia |
| IATA Code: | BCI |
| ICAO Code: | YBAR |
| Coordinates: | 23°33′55″S, 145°18′25″E |