How far is Trieste from Newcastle?
The distance between Newcastle (Newcastle Airport) and Trieste (Trieste – Friuli Venezia Giulia Airport) is 10067 miles / 16201 kilometers / 8748 nautical miles.
Newcastle Airport (NTL) – Trieste – Friuli Venezia Giulia Airport (TRS)
Distance from Newcastle to Trieste
There are several ways to calculate the distance from Newcastle to Trieste. Here are two standard methods:
Vincenty's formula (applied above)
- 10066.594 miles
- 16200.613 kilometers
- 8747.631 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
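Since the figure above comes from Vincenty's method, a minimal Python sketch of the inverse formula may help. The WGS-84 constants are standard; the convergence tolerance and the decimal coordinates (converted from the airport tables below) are our own choices, not values published by the source.

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Inverse Vincenty on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0                # semi-major axis (m)
    f = 1 / 298.257223563        # flattening
    b = (1 - f) * a              # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):         # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0           # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero for equatorial geodesics (cos2_alpha == 0)
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# NTL and TRS coordinates, converted to decimal degrees from the tables below
print(vincenty_distance_m(-32.7947, 151.8339, 45.8272, 13.4719) / 1609.344)
```

Run on these coordinates, the result should land within rounding distance of the 10066.594-mile figure above.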
Haversine formula
- 10068.004 miles
- 16202.881 kilometers
- 8748.856 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
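For comparison, here is a matching haversine sketch. The mean earth radius used (6,371.0088 km) is an assumption; a different radius choice shifts the result by a few miles, which is why the spherical figure differs slightly from the ellipsoidal one.

```python
import math

def haversine_distance_m(lat1, lon1, lat2, lon2, r=6371008.8):
    """Great-circle distance on a sphere of mean radius r, in metres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(h))

# Same NTL and TRS coordinates as above, result converted to miles
print(haversine_distance_m(-32.7947, 151.8339, 45.8272, 13.4719) / 1609.344)
```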
How long does it take to fly from Newcastle to Trieste?
The estimated flight time from Newcastle Airport to Trieste – Friuli Venezia Giulia Airport is 19 hours and 33 minutes.
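The source does not state how it derives this figure. A back-of-the-envelope estimate of distance divided by an assumed average block speed of about 515 mph (our assumption, not a published value) reproduces it:

```python
distance_miles = 10066.594     # Vincenty distance from above
avg_speed_mph = 515            # assumed average block speed, not from the source

total_min = round(distance_miles / avg_speed_mph * 60)
h, m = divmod(total_min, 60)
print(f"{h} h {m} min")        # -> 19 h 33 min
```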
What is the time difference between Newcastle and Trieste?
The time difference between Newcastle and Trieste is 10 hours, with Trieste 10 hours behind Newcastle. Because both places observe daylight saving time, the gap actually varies between 8 and 10 hours over the year; the 10-hour figure applies when Australian summer time overlaps European standard time.
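A quick check with Python's zoneinfo (3.9+) shows where the 10-hour figure comes from; the sample date is our own choice. Newcastle, NSW follows the Australia/Sydney zone and Trieste the Europe/Rome zone.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Mid-January: Newcastle is on AEDT (UTC+11), Trieste on CET (UTC+1)
when = datetime(2024, 1, 15, 12, 0)
newcastle = when.replace(tzinfo=ZoneInfo("Australia/Sydney"))
trieste = newcastle.astimezone(ZoneInfo("Europe/Rome"))

print(newcastle.utcoffset() - trieste.utcoffset())  # -> 10:00:00
```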
Flight carbon footprint between Newcastle Airport (NTL) and Trieste – Friuli Venezia Giulia Airport (TRS)
On average, flying from Newcastle to Trieste generates about 1,312 kg of CO2 per passenger; 1,312 kilograms equals 2,892 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
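The source does not publish its emission factor. An assumed long-haul factor of roughly 81 g of CO2 per passenger-kilometre (our illustrative value) happens to reproduce the stated figures:

```python
EMISSION_KG_PER_PAX_KM = 0.081   # assumed factor, not published by the source
DISTANCE_KM = 16200.613          # Vincenty distance from above
LBS_PER_KG = 2.20462262

co2_kg = round(EMISSION_KG_PER_PAX_KM * DISTANCE_KM)        # -> 1312
print(f"{co2_kg} kg CO2 = {co2_kg * LBS_PER_KG:.0f} lbs")   # -> 1312 kg CO2 = 2892 lbs
```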
Map of flight path from Newcastle to Trieste
See the map of the shortest flight path between Newcastle Airport (NTL) and Trieste – Friuli Venezia Giulia Airport (TRS).
Airport information
| Origin | Newcastle Airport |
| --- | --- |
| City | Newcastle |
| Country | Australia |
| IATA Code | NTL |
| ICAO Code | YWLM |
| Coordinates | 32°47′41″S, 151°50′2″E |
| Destination | Trieste – Friuli Venezia Giulia Airport |
| --- | --- |
| City | Trieste |
| Country | Italy |
| IATA Code | TRS |
| ICAO Code | LIPQ |
| Coordinates | 45°49′38″N, 13°28′19″E |