
How far is Lismore from Baltimore, MD?

The distance between Baltimore (Baltimore–Washington International Airport) and Lismore (Lismore Airport) is 9527 miles / 15332 kilometers / 8279 nautical miles.

Baltimore–Washington International Airport – Lismore Airport

Distance: 9527 miles / 15332 kilometers / 8279 nautical miles
Flight time: 18 h 32 min
CO2 emission: 1 228 kg


Distance from Baltimore to Lismore

There are several ways to calculate the distance from Baltimore to Lismore. Here are two standard methods:

Vincenty's formula (applied above)
  • 9527.078 miles
  • 15332.346 kilometers
  • 8278.804 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
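The page only reports the results, but the calculation can be reproduced. Below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid; the airport coordinates are taken from the airport information section further down, while the convergence tolerance and unit constants are assumptions (the calculator's exact constants are not published), so the last decimals may differ slightly.

```python
# Minimal sketch of Vincenty's inverse formula on the WGS-84 ellipsoid.
# Coordinates come from the airport information section; tolerance and
# conversion constants are assumptions, not the site's published values.
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance in statute miles between two lat/lon points (degrees)."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                               * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344   # meters to statute miles

# BWI and LSY in decimal degrees (see the airport information section below).
print(vincenty_miles(39.1753, -76.6683, -28.8303, 153.2597))  # ~9527 miles
```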

Haversine formula
  • 9526.953 miles
  • 15332.144 kilometers
  • 8278.695 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
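A minimal sketch of the haversine calculation is shown below. The mean Earth radius of 6371.0 km is an assumption (the site does not state which radius it uses), so the result may not match the figures above to the last digit.

```python
# Sketch of the haversine (great-circle) formula with an assumed mean
# Earth radius of 6371.0 km.
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers between two lat/lon points (degrees)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

km = haversine_km(39.1753, -76.6683, -28.8303, 153.2597)
print(km, km / 1.609344, km / 1.852)  # kilometers, statute miles, nautical miles
```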

How long does it take to fly from Baltimore to Lismore?

The estimated flight time from Baltimore–Washington International Airport to Lismore Airport is 18 hours and 32 minutes.
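The site does not publish its flight-time model. One set of assumed constants that reproduces the 18 h 32 min figure is a fixed 30-minute allowance for takeoff and landing plus a cruise speed of roughly 850 km/h; the sketch below uses those assumptions and should not be read as the calculator's actual formula.

```python
# Hedged flight-time estimate: assumed 30 min overhead plus cruise at ~850 km/h.
# Both constants are guesses that happen to reproduce the 18 h 32 min shown
# above for the 15332 km BWI-LSY distance.
def flight_time_minutes(distance_km, cruise_kmh=850.0, overhead_min=30.0):
    return overhead_min + distance_km / cruise_kmh * 60.0

minutes = flight_time_minutes(15332.346)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # 18 h 32 min
```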

Flight carbon footprint between Baltimore–Washington International Airport (BWI) and Lismore Airport (LSY)

On average, flying from Baltimore to Lismore generates about 1 228 kg of CO2 per passenger; 1 228 kilograms equals 2 707 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
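The emission model behind the 1 228 kg figure is not disclosed. As a rough sketch, that per-passenger figure works out to about 80 g of CO2 per passenger-kilometre over this route, and the pound value follows from the standard 2.20462 lb/kg conversion.

```python
# Unit conversions behind the CO2 figure. The 1 228 kg value comes from the
# page; the ~80 g per passenger-km rate is derived from it, not from any
# published emission model.
co2_kg = 1228.0
distance_km = 15332.346

print(co2_kg * 2.20462)             # ~2707 lbs, matching the text above
print(co2_kg / distance_km * 1000)  # ~80 g CO2 per passenger-kilometre
```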

Map of flight path from Baltimore to Lismore

See the map of the shortest flight path between Baltimore–Washington International Airport (BWI) and Lismore Airport (LSY).

Airport information

Origin: Baltimore–Washington International Airport
City: Baltimore, MD
Country: United States
IATA Code: BWI
ICAO Code: KBWI
Coordinates: 39°10′31″N, 76°40′5″W
Destination: Lismore Airport
City: Lismore
Country: Australia
IATA Code: LSY
ICAO Code: YLIS
Coordinates: 28°49′49″S, 153°15′35″E
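The coordinates above are given in degrees, minutes, and seconds; the decimal degrees used in the distance sketches earlier on this page follow from a simple conversion (south and west are negative):

```python
# Convert degrees/minutes/seconds to the signed decimal degrees used above.
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(39, 10, 31, "N"), dms_to_decimal(76, 40, 5, "W"))    # BWI
print(dms_to_decimal(28, 49, 49, "S"), dms_to_decimal(153, 15, 35, "E"))  # LSY
```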