
How far is Pisa from King Island, Tasmania?

The distance between King Island, Tasmania (King Island Airport) and Pisa (Pisa International Airport) is 10055 miles / 16182 kilometers / 8738 nautical miles.

King Island Airport – Pisa International Airport

Distance: 10055 miles / 16182 kilometers / 8738 nautical miles
Flight time: 19 h 32 min
CO2 emission: 1 310 kg


Distance from King Island, Tasmania to Pisa

There are several ways to calculate the distance from King Island, Tasmania to Pisa. Here are two standard methods:

Vincenty's formula (applied above)
  • 10054.944 miles
  • 16181.863 kilometers
  • 8737.507 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
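As an illustrative sketch (not the calculator's actual implementation), the standard inverse Vincenty iteration on the WGS-84 ellipsoid can be written in Python; the coordinates below are the KNS and PSA coordinates listed further down this page.

```python
import math

def vincenty_miles_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns (miles, km)."""
    a = 6378137.0                # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563        # WGS-84 flattening
    b = (1 - f) * a              # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):         # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344, meters / 1000.0

# KNS (39°52′38″S, 143°52′40″E) to PSA (43°41′2″N, 10°23′33″E)
miles, km = vincenty_miles_km(-39.87722, 143.87778, 43.68389, 10.3925)
```

For this pair of airports the result agrees with the Vincenty figures above to well under a mile.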

Haversine formula
  • 10057.373 miles
  • 16185.773 kilometers
  • 8739.618 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
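A minimal haversine sketch in Python, assuming the commonly used mean Earth radius of 6371 km (the radius the calculator uses is not stated) and the airport coordinates listed below:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere (mean Earth radius assumed)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# KNS (39°52′38″S, 143°52′40″E) to PSA (43°41′2″N, 10°23′33″E)
km = haversine_km(-39.87722, 143.87778, 43.68389, 10.3925)
miles = km / 1.609344
```

The spherical result comes out a few kilometers longer than the ellipsoidal Vincenty figure, as the table above shows.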

How long does it take to fly from King Island, Tasmania to Pisa?

The estimated flight time from King Island Airport to Pisa International Airport is 19 hours and 32 minutes.
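The page does not state the speed assumption behind that estimate. A back-of-the-envelope estimator like the sketch below, with an average block speed of about 515 mph back-computed from the figures above (10055 mi in 19 h 32 min), gives a similar time; the speed is an assumption, not a published constant.

```python
def flight_time(distance_miles, avg_speed_mph=515):
    """Rough block-time estimate: distance over an assumed average speed.

    515 mph is back-computed from this page's figures, not an official value.
    """
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

h, m = flight_time(10055)   # roughly 19 h 31-32 min
```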

Flight carbon footprint between King Island Airport (KNS) and Pisa International Airport (PSA)

On average, flying from King Island, Tasmania to Pisa generates about 1 310 kg (2 888 lbs) of CO2 per passenger. The figure is an estimate and includes only the CO2 generated by burning jet fuel.
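The pound figure is plain unit arithmetic, sketched below; 1 310 kg is this page's estimate, and the kilograms-per-pound factor is the exact definition of the pound.

```python
KG_PER_LB = 0.45359237          # exact definition of the avoirdupois pound

def kg_to_lbs(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

lbs = kg_to_lbs(1310)           # CO2 estimate from this page, in pounds
```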

Map of flight path from King Island, Tasmania to Pisa

See the map of the shortest flight path between King Island Airport (KNS) and Pisa International Airport (PSA).

Airport information

Origin: King Island Airport
City: King Island, Tasmania
Country: Australia
IATA Code: KNS
ICAO Code: YKII
Coordinates: 39°52′38″S, 143°52′40″E
Destination: Pisa International Airport
City: Pisa
Country: Italy
IATA Code: PSA
ICAO Code: LIRP
Coordinates: 43°41′2″N, 10°23′33″E