
How far is St George from Luqa?

The distance between Luqa (Malta International Airport) and St George (St George Airport (Queensland)) is 9720 miles / 15643 kilometers / 8447 nautical miles.

Malta International Airport – St George Airport (Queensland)

Distance: 9720 miles / 15643 kilometers / 8447 nautical miles
Flight time: 18 h 54 min
CO2 emission: 1 258 kg


Distance from Luqa to St George

There are several ways to calculate the distance from Luqa to St George. Here are two standard methods:

Vincenty's formula (applied above)
  • 9720.263 miles
  • 15643.247 kilometers
  • 8446.678 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
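As a rough illustration of the method, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name and the decimal coordinates for MLA and SGO are our own; this is not the calculator's actual code.

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2):
        """Ellipsoidal distance in metres (Vincenty inverse, WGS-84).

        May fail to converge for nearly antipodal points.
        """
        a = 6378137.0               # WGS-84 semi-major axis (m)
        f = 1 / 298.257223563       # WGS-84 flattening
        b = (1 - f) * a             # semi-minor axis (m)

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitude
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):        # iterate until lambda converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0          # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                       if cos2_alpha else 0.0)   # cos2_alpha = 0 on the equator
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a * a - b * b) / (b * b)
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                              * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma)

    mla = (35.857222, 14.477500)    # 35°51′26″N, 14°28′39″E
    sgo = (-28.049444, 148.595000)  # 28°2′58″S, 148°35′42″E
    metres = vincenty_distance(*mla, *sgo)
    print(f"{metres / 1609.344:.3f} miles")  # close to the 9720.263 figure above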

Haversine formula
  • 9718.565 miles
  • 15640.515 kilometers
  • 8445.202 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
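A corresponding sketch of the haversine formula, assuming a mean earth radius of 6371 km (the radius the calculator uses is not stated):

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance in kilometres on a sphere of the given radius."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    # MLA -> SGO; with this radius the result lands near the 15640.515 km above
    print(haversine_km(35.857222, 14.4775, -28.049444, 148.595))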

How long does it take to fly from Luqa to St George?

The estimated flight time from Malta International Airport to St George Airport (Queensland) is 18 hours and 54 minutes.
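The page does not state the model behind this estimate. The sketch below simply divides the distance by an assumed average block speed of about 514 mph, the value implied by covering 9720 miles in 18 h 54 min:

    def flight_time(distance_miles, avg_speed_mph=514.3):
        """Crude block-time estimate; the average speed is an assumed value."""
        hours = distance_miles / avg_speed_mph
        h = int(hours)
        m = round((hours - h) * 60)
        return h, m

    h, m = flight_time(9720)
    print(f"{h} h {m} min")  # 18 h 54 min with the assumed speed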

Flight carbon footprint between Malta International Airport (MLA) and St George Airport (Queensland) (SGO)

On average, flying from Luqa to St George generates about 1 258 kg of CO2 per passenger; 1 258 kilograms equals 2 773 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
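The pound figure follows from the standard conversion 1 kg ≈ 2.20462 lb:

    co2_kg = 1258
    print(f"{co2_kg * 2.20462:.0f} lbs")  # 2773 lbs, as quoted above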

Map of flight path from Luqa to St George

See the map of the shortest flight path between Malta International Airport (MLA) and St George Airport (Queensland) (SGO).

Airport information

Origin: Malta International Airport
City: Luqa
Country: Malta
IATA Code: MLA
ICAO Code: LMML
Coordinates: 35°51′26″N, 14°28′39″E
Destination: St George Airport (Queensland)
City: St George
Country: Australia
IATA Code: SGO
ICAO Code: YSGE
Coordinates: 28°2′58″S, 148°35′42″E
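The coordinates above are given in degrees-minutes-seconds. A small helper (hypothetical, our own naming) converts them to the signed decimal degrees used by the distance sketches earlier:

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        """DMS plus hemisphere letter ('N'/'S'/'E'/'W') to signed decimal degrees."""
        sign = -1.0 if hemisphere in ("S", "W") else 1.0
        return sign * (deg + minutes / 60 + seconds / 3600)

    print(dms_to_decimal(35, 51, 26, "N"))   # 35.8572... (MLA latitude)
    print(dms_to_decimal(148, 35, 42, "E"))  # 148.595    (SGO longitude)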