How far is Trieste from Wellington?
The distance between Wellington (Wellington International Airport) and Trieste (Trieste – Friuli Venezia Giulia Airport) is 11455 miles / 18435 kilometers / 9954 nautical miles.
Wellington International Airport – Trieste – Friuli Venezia Giulia Airport
Distance from Wellington to Trieste
There are several ways to calculate the distance from Wellington to Trieste. Here are two standard methods:
Vincenty's formula (applied above)
- 11455.241 miles
- 18435.424 kilometers
- 9954.332 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
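As a rough illustration, the ellipsoidal result can be reproduced with the third-party geopy library (an assumption; the page does not say what software it uses). Note that geopy's `geodesic()` implements Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's iteration, so it should agree with the figure above only to within a few metres.

```python
# Sketch: ellipsoidal distance between WLG and TRS using geopy.
# geodesic() uses Karney's algorithm on WGS-84, not Vincenty's iteration,
# but both model the earth as an ellipsoid.
from geopy.distance import geodesic

WLG = (-41.3269, 174.8047)  # Wellington International, decimal degrees
TRS = (45.8272, 13.4719)    # Trieste - Friuli Venezia Giulia, decimal degrees

d = geodesic(WLG, TRS)
print(f"{d.miles:.1f} mi / {d.km:.1f} km / {d.nautical:.1f} NM")
```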
Haversine formula
- 11454.782 miles
- 18434.685 kilometers
- 9953.934 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
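The haversine calculation is simple enough to reproduce directly. The sketch below assumes a mean earth radius of 6,371 km (the page does not state which radius it uses, so the result may differ slightly from the figure above) and uses the airport coordinates listed further down, converted to decimal degrees.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance (km) between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# WLG -> TRS, coordinates in decimal degrees
km = haversine_km(-41.3269, 174.8047, 45.8272, 13.4719)
print(f"{km:.0f} km ({km * 0.621371:.0f} mi, {km * 0.539957:.0f} NM)")
```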
How long does it take to fly from Wellington to Trieste?
The estimated flight time from Wellington International Airport to Trieste – Friuli Venezia Giulia Airport is 22 hours and 11 minutes.
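The page does not say how this estimate is derived. Back-calculating from the figures above, 22 hours 11 minutes over 11,455 miles corresponds to an average speed of roughly 516 mph (about 831 km/h), so a simple distance-divided-by-speed sketch reproduces it. The speed value below is an assumption inferred from those numbers, not something stated by the source.

```python
# Rough flight-time estimate: great-circle distance / assumed average speed.
def flight_time(distance_miles, avg_speed_mph=516.0):
    hours = distance_miles / avg_speed_mph
    h, m = divmod(round(hours * 60), 60)
    return f"{h} h {m:02d} min"

print(flight_time(11455.241))  # ~22 h 12 min with the assumed average speed
```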
What is the time difference between Wellington and Trieste?
The standard time difference between Wellington (NZST, UTC+12) and Trieste (CET, UTC+1) is 11 hours, with Trieste behind Wellington; daylight saving time in either location shifts this by an hour.
Flight carbon footprint between Wellington International Airport (WLG) and Trieste – Friuli Venezia Giulia Airport (TRS)
On average, flying from Wellington to Trieste generates about 1,536 kg of CO2 per passenger, which is roughly 3,385 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
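For illustration, the quoted figures imply a per-passenger emission factor of roughly 83 g of CO2 per kilometre (1,536 kg over 18,435 km). The sketch below simply applies that back-calculated factor; it is not the site's actual model, and real emissions depend on aircraft type, load factor and routing.

```python
KG_TO_LBS = 2.20462

def co2_per_passenger_kg(distance_km, grams_per_km=83.3):
    """Per-passenger CO2 (kg) using a back-calculated emission factor."""
    return distance_km * grams_per_km / 1000.0

co2_kg = co2_per_passenger_kg(18435.424)
print(f"{co2_kg:.0f} kg CO2 (~{co2_kg * KG_TO_LBS:.0f} lbs)")
```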
Map of flight path from Wellington to Trieste
See the map of the shortest flight path between Wellington International Airport (WLG) and Trieste – Friuli Venezia Giulia Airport (TRS).
Airport information
| Origin | Wellington International Airport |
| --- | --- |
| City: | Wellington |
| Country: | New Zealand |
| IATA Code: | WLG |
| ICAO Code: | NZWN |
| Coordinates: | 41°19′37″S, 174°48′17″E |
| Destination | Trieste – Friuli Venezia Giulia Airport |
| --- | --- |
| City: | Trieste |
| Country: | Italy |
| IATA Code: | TRS |
| ICAO Code: | LIPQ |
| Coordinates: | 45°49′38″N, 13°28′19″E |
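The coordinates in both tables are given in degrees, minutes and seconds, while the distance formulas above expect decimal degrees. A small conversion sketch (the helper name is ours, not from the source):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (degrees + minutes / 60.0 + seconds / 3600.0)

# WLG: 41°19′37″S, 174°48′17″E  ->  about -41.3269, 174.8047
print(dms_to_decimal(41, 19, 37, "S"), dms_to_decimal(174, 48, 17, "E"))
# TRS: 45°49′38″N, 13°28′19″E   ->  about 45.8272, 13.4719
print(dms_to_decimal(45, 49, 38, "N"), dms_to_decimal(13, 28, 19, "E"))
```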