How far is Charlotte Amalie from Regina?
The distance between Regina (Regina International Airport) and Charlotte Amalie (Charlotte Amalie Harbor Seaplane Base) is 3106 miles / 4999 kilometers / 2699 nautical miles.
Regina International Airport – Charlotte Amalie Harbor Seaplane Base
Distance from Regina to Charlotte Amalie
There are several ways to calculate the distance from Regina to Charlotte Amalie. Here are two standard methods:
Vincenty's formula (applied above)
- 3106.209 miles
- 4998.959 kilometers
- 2699.222 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
Haversine formula
- 3106.541 miles
- 4999.494 kilometers
- 2699.511 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
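The haversine calculation above can be sketched in a few lines of pure Python. This is a minimal illustration, not the site's own implementation: the mean Earth radius of 3,958.8 miles and the decimal coordinates for YQR and SPB (converted from the DMS values in the airport information tables) are assumptions of this sketch, and small differences in the assumed radius shift the result by a mile or two.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
    """Great-circle distance between two lat/lon points, assuming a
    spherical earth with the given mean radius (in miles)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_mi * asin(sqrt(a))

# Decimal-degree coordinates for YQR and SPB (assumed conversions
# of the DMS values given in the airport information section)
yqr = (50.4317, -104.6658)
spb = (18.3383, -64.9406)
print(round(haversine_miles(*yqr, *spb), 1))
```

A Vincenty (ellipsoidal) distance needs an iterative solver and is usually delegated to a geodesy library rather than hand-rolled.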
How long does it take to fly from Regina to Charlotte Amalie?
The estimated flight time from Regina International Airport to Charlotte Amalie Harbor Seaplane Base is 6 hours and 22 minutes.
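A flight-time estimate of this kind is just distance divided by an assumed average block speed. The 490 mph figure below is an assumption chosen to land near the quoted 6 h 22 min; it is not a published airline or site parameter.

```python
def flight_time(distance_mi, avg_speed_mph=490.0):
    """Rough block-time estimate: distance over an assumed average
    speed. 490 mph is an illustrative assumption, not a published
    figure."""
    hours = distance_mi / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    if m == 60:  # guard against 59.5+ minutes rounding up
        h, m = h + 1, 0
    return f"{h} h {m} min"

print(flight_time(3106.209))
```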
What is the time difference between Regina and Charlotte Amalie?
The time difference between Regina (Central Standard Time, UTC−6, observed year-round in Saskatchewan) and Charlotte Amalie (Atlantic Standard Time, UTC−4) is 2 hours: Charlotte Amalie is 2 hours ahead of Regina.
Flight carbon footprint between Regina International Airport (YQR) and Charlotte Amalie Harbor Seaplane Base (SPB)
On average, flying from Regina to Charlotte Amalie generates about 347 kg (765 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
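The kilogram-to-pound conversion used above is a fixed constant. A minimal sketch, using the exact definition of the avoirdupois pound:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def kg_to_lbs(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

# 347 kg of CO2 per passenger, as quoted above
print(round(kg_to_lbs(347)))
```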
Map of flight path from Regina to Charlotte Amalie
See the map of the shortest flight path between Regina International Airport (YQR) and Charlotte Amalie Harbor Seaplane Base (SPB).
Airport information
Origin | Regina International Airport
---|---
City: | Regina
Country: | Canada
IATA Code: | YQR
ICAO Code: | CYQR
Coordinates: | 50°25′54″N, 104°39′57″W
Destination | Charlotte Amalie Harbor Seaplane Base
---|---
City: | Charlotte Amalie
Country: | U.S. Virgin Islands
IATA Code: | SPB
ICAO Code: | VI22
Coordinates: | 18°20′18″N, 64°56′26″W
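The coordinates in the tables above are given in degrees/minutes/seconds. Converting them to the signed decimal degrees that distance formulas expect is straightforward; this helper is an illustration, not part of the source page:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter
    (N/S/E/W) to signed decimal degrees; S and W are negative."""
    value = deg + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# YQR: 50°25′54″N, 104°39′57″W
print(dms_to_decimal(50, 25, 54, "N"), dms_to_decimal(104, 39, 57, "W"))
```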