
How far is Terrace from Prishtina?

The distance between Prishtina (Pristina International Airport) and Terrace (Northwest Regional Airport Terrace-Kitimat) is 5512 miles / 8871 kilometers / 4790 nautical miles.

Distance from Prishtina to Terrace

There are several ways to calculate the distance from Prishtina to Terrace. Here are two standard methods:

Vincenty's formula (applied above)
  • 5511.952 miles
  • 8870.626 kilometers
  • 4789.755 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
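For reference, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The coordinates are converted to decimal degrees from the airport information listed below; the function name and convergence settings are illustrative, not the calculator's actual code.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0              # WGS-84 semi-major axis (metres)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L  # first guess: longitude difference on the auxiliary sphere
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                * (-3 + 4 * cos_2sigma_m ** 2)))
    metres = b * A * (sigma - delta_sigma)
    return metres / 1609.344   # metres per statute mile

# PRN -> YXT, coordinates from the airport information below
print(round(vincenty_miles(42.5728, 21.0356, 54.4683, -128.5758), 3))  # ≈ 5512 miles
```

The loop iteratively refines the longitude difference on an auxiliary sphere; for nearly antipodal points convergence can be slow or fail, which is why an iteration cap is included.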

Haversine formula
  • 5496.371 miles
  • 8845.551 kilometers
  • 4776.215 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
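The haversine version is much shorter, since it needs only a single spherical radius. A small sketch, assuming the commonly used mean Earth radius of 3,958.8 miles (6,371 km):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance between two points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_miles * math.asin(math.sqrt(h))

# PRN and YXT in decimal degrees (from the airport information below)
prn = (42.5728, 21.0356)    # 42°34′22″N, 21°2′8″E
yxt = (54.4683, -128.5758)  # 54°28′6″N, 128°34′33″W
print(round(haversine_miles(*prn, *yxt), 3))  # ≈ 5496 miles
```

The spherical assumption is why the haversine result comes out about 16 miles shorter than the ellipsoidal Vincenty figure on this route.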

How long does it take to fly from Prishtina to Terrace?

The estimated flight time from Pristina International Airport to Northwest Regional Airport Terrace-Kitimat is 10 hours and 56 minutes.
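The page does not state the speed assumptions behind this figure. A common rule of thumb is cruise time at an assumed average ground speed plus a fixed allowance for takeoff, climb and landing; the sketch below uses hypothetical parameters (500 mph, 30 minutes) and so lands slightly above the figure quoted here.

```python
def flight_time_hours(distance_miles, avg_speed_mph=500, overhead_hours=0.5):
    """Rule-of-thumb flight time: cruise at an assumed average ground speed
    plus a fixed allowance for takeoff and landing. Both parameters are
    assumptions, not the calculator's published inputs."""
    return distance_miles / avg_speed_mph + overhead_hours

hours = flight_time_hours(5511.952)
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # ≈ 11 h 31 min with these assumptions
```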

Flight carbon footprint between Pristina International Airport (PRN) and Northwest Regional Airport Terrace-Kitimat (YXT)

On average, flying from Prishtina to Terrace generates about 651 kg of CO2 per passenger, which is equivalent to roughly 1,436 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
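A simple linear per-distance estimate reproduces this figure. The emission factor below is back-derived from the 651 kg / 8,871 km numbers above and is an assumption, not a published constant of the calculator:

```python
def co2_kg(distance_km, kg_per_passenger_km=0.0734):
    """Linear estimate of per-passenger CO2 from jet-fuel burn.
    The factor is back-derived from the figures quoted above
    (651 kg over 8,871 km) and is an assumption."""
    return distance_km * kg_per_passenger_km

print(round(co2_kg(8871)), "kg CO2 per passenger")  # ≈ 651 kg
```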

Map of flight path from Prishtina to Terrace

See the map of the shortest flight path between Pristina International Airport (PRN) and Northwest Regional Airport Terrace-Kitimat (YXT).

Airport information

Origin: Pristina International Airport
City: Prishtina
Country: Kosovo
IATA Code: PRN
ICAO Code: BKPR
Coordinates: 42°34′22″N, 21°2′8″E

Destination: Northwest Regional Airport Terrace-Kitimat
City: Terrace
Country: Canada
IATA Code: YXT
ICAO Code: CYXT
Coordinates: 54°28′6″N, 128°34′33″W