How far is Nanaimo from Trincomalee?
The distance between Trincomalee (China Bay Airport) and Nanaimo (Nanaimo Airport) is 8186 miles / 13174 kilometers / 7113 nautical miles.
China Bay Airport – Nanaimo Airport
Distance from Trincomalee to Nanaimo
There are several ways to calculate the distance from Trincomalee to Nanaimo. Here are two standard methods:
Vincenty's formula (applied above)
- 8186.031 miles
- 13174.140 kilometers
- 7113.466 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
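To reproduce the ellipsoidal figure yourself, one option is the geopy library, whose geodesic distance uses Karney's algorithm on the WGS-84 ellipsoid (a successor to Vincenty's method, so it may differ from the value above by a tiny rounding amount). A minimal sketch, with the airport coordinates taken from the tables below and converted to decimal degrees:

```python
from geopy.distance import geodesic  # pip install geopy

# Airport coordinates from the tables below, in decimal degrees.
TRR = (8.538333, 81.181667)     # China Bay Airport, Trincomalee
YCD = (49.052222, -123.870000)  # Nanaimo Airport

d = geodesic(TRR, YCD)  # WGS-84 ellipsoid by default
print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} NM")
# ≈ 8186 mi, matching the Vincenty figure above to within rounding
```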
Haversine formula
- 8177.813 miles
- 13160.915 kilometers
- 7106.326 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
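The haversine computation is simple enough to implement directly. Here is a self-contained sketch assuming a mean earth radius of 3,958.8 miles (about 6,371 km); the spherical radius choice is why it lands a few miles away from the ellipsoidal result:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius=3958.8):
    """Great-circle distance in miles on a sphere of the given radius."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi, dlam = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius * asin(sqrt(a))

print(haversine_miles(8.538333, 81.181667, 49.052222, -123.87))
# ≈ 8178 miles, matching the haversine figure above
```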
How long does it take to fly from Trincomalee to Nanaimo?
The estimated flight time from China Bay Airport to Nanaimo Airport is 15 hours and 59 minutes.
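The page does not state its flight-time model, but back-solving from the figures gives an average block speed of roughly 512 mph (8,186 mi over about 15.98 h). A sketch under that assumption; the 512 mph speed is inferred, not stated in the source:

```python
def estimated_flight_time(distance_miles, avg_speed_mph=512):
    """Crude estimate; 512 mph is back-solved from the figures above."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} hours and {m} minutes"

print(estimated_flight_time(8186.031))  # "15 hours and 59 minutes"
```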
What is the time difference between Trincomalee and Nanaimo?
The time difference between Trincomalee and Nanaimo is 13 hours 30 minutes: Trincomalee (Sri Lanka Standard Time, UTC+5:30) is ahead of Nanaimo (Pacific Time, UTC-8). During Pacific daylight saving time the difference shrinks to 12 hours 30 minutes.
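Because Pacific Time observes daylight saving while Sri Lanka does not, the offset depends on the date. A sketch using Python's standard zoneinfo module (Python 3.9+) with the IANA zone names covering the two cities:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library since Python 3.9

trinco = ZoneInfo("Asia/Colombo")        # UTC+5:30, no DST
nanaimo = ZoneInfo("America/Vancouver")  # UTC-8 (PST) / UTC-7 (PDT)

now = datetime.now(tz=trinco)
offset = now.utcoffset() - now.astimezone(nanaimo).utcoffset()
print(offset)  # 13:30:00 in winter, 12:30:00 during Pacific daylight time
```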
Flight carbon footprint between China Bay Airport (TRR) and Nanaimo Airport (YCD)
On average, flying from Trincomalee to Nanaimo generates about 1,026 kg of CO2 per passenger; 1,026 kilograms equals 2,261 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
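The kilograms-to-pounds conversion, and the per-kilometer emission rate implied by the estimate, can be checked in a couple of lines (0.45359237 kg per lb is the exact definition of the pound; the per-km rate is derived here, not stated in the source):

```python
KG_PER_LB = 0.45359237  # exact, by definition

co2_kg = 1026
print(f"{co2_kg / KG_PER_LB:.0f} lb")          # ≈ 2262 lb (the text above rounds to 2,261)
print(f"{co2_kg / 13174.14 * 1000:.1f} g/km")  # ≈ 77.9 g CO2 per passenger-km implied
```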
Map of flight path from Trincomalee to Nanaimo
See the map of the shortest flight path between China Bay Airport (TRR) and Nanaimo Airport (YCD).
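To draw such a path yourself, one approach is spherical linear interpolation between the two airports. The sketch below generates evenly spaced waypoints along the great circle (spherical model), which can then be fed to any mapping library:

```python
from math import radians, degrees, sin, cos, atan2, sqrt, acos

def to_xyz(lat, lon):
    """Latitude/longitude in degrees to a unit vector on the sphere."""
    la, lo = radians(lat), radians(lon)
    return (cos(la) * cos(lo), cos(la) * sin(lo), sin(la))

def to_latlon(x, y, z):
    """Unit vector back to latitude/longitude in degrees."""
    return (degrees(atan2(z, sqrt(x * x + y * y))), degrees(atan2(y, x)))

def great_circle_points(p1, p2, n=50):
    """n+1 evenly spaced waypoints along the shortest path (spherical model)."""
    a, b = to_xyz(*p1), to_xyz(*p2)
    omega = acos(sum(u * v for u, v in zip(a, b)))  # central angle between points
    pts = []
    for i in range(n + 1):
        t = i / n
        f, g = sin((1 - t) * omega) / sin(omega), sin(t * omega) / sin(omega)
        pts.append(to_latlon(*(f * u + g * v for u, v in zip(a, b))))
    return pts

# Waypoints between China Bay Airport (TRR) and Nanaimo Airport (YCD).
waypoints = great_circle_points((8.538333, 81.181667), (49.052222, -123.87))
```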
Airport information
| Origin | China Bay Airport |
| --- | --- |
| City | Trincomalee |
| Country | Sri Lanka |
| IATA Code | TRR |
| ICAO Code | VCCT |
| Coordinates | 8°32′18″N, 81°10′54″E |
| Destination | Nanaimo Airport |
| --- | --- |
| City | Nanaimo |
| Country | Canada |
| IATA Code | YCD |
| ICAO Code | CYCD |
| Coordinates | 49°3′8″N, 123°52′12″W |
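The coordinates above are given in degrees-minutes-seconds; converting them to the decimal degrees used in the earlier snippets is mechanical. A sketch, assuming exactly the D°M′S″H format shown in the tables:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Parse a coordinate like 8°32′18″N into signed decimal degrees."""
    m = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    d, mnt, s, hemi = int(m[1]), int(m[2]), int(m[3]), m[4]
    value = d + mnt / 60 + s / 3600
    return -value if hemi in "SW" else value  # south/west are negative

print(dms_to_decimal("8°32′18″N"))    # ≈ 8.538333
print(dms_to_decimal("123°52′12″W"))  # -123.87
```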