How far is São Jorge Island from Hebron, KY?
The distance between Hebron (Cincinnati/Northern Kentucky International Airport) and São Jorge Island (São Jorge Airport) is 2996 miles / 4821 kilometers / 2603 nautical miles.
Cincinnati/Northern Kentucky International Airport – São Jorge Airport
Distance from Hebron to São Jorge Island
There are several ways to calculate the distance from Hebron to São Jorge Island. Here are two standard methods:
Vincenty's formula (applied above)
- 2995.687 miles
- 4821.090 kilometers
- 2603.180 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
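Below is a minimal sketch of how this ellipsoidal figure can be reproduced in Python. It uses the geographiclib package as a stand-in for a hand-rolled Vincenty iteration (both solve the inverse geodesic problem on the WGS-84 ellipsoid), with the coordinates taken from the airport information table further down; it is an illustration, not the site's own implementation.

```python
# Ellipsoidal (Vincenty-style) distance via geographiclib (pip install geographiclib).
from geographiclib.geodesic import Geodesic

CVG = (39.048611, -84.667778)   # Cincinnati/Northern Kentucky Intl (39°2′55″N, 84°40′4″W)
SJZ = (38.665278, -28.175556)   # São Jorge Airport (38°39′55″N, 28°10′32″W)

g = Geodesic.WGS84.Inverse(CVG[0], CVG[1], SJZ[0], SJZ[1])
meters = g["s12"]                     # geodesic distance in meters
print(f"{meters / 1609.344:.0f} mi")  # ~2996 miles
print(f"{meters / 1000:.0f} km")      # ~4821 kilometers
print(f"{meters / 1852:.0f} NM")      # ~2603 nautical miles
```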
Haversine formula
- 2988.407 miles
- 4809.374 kilometers
- 2596.854 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
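For comparison, here is a short sketch of the haversine formula itself, using a sphere of mean radius 6371 km and the same airport coordinates as above.

```python
# Great-circle (haversine) distance on a spherical Earth of radius 6371 km.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in kilometers between two lat/lon points."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * r * asin(sqrt(a))

km = haversine_km(39.048611, -84.667778, 38.665278, -28.175556)
print(f"{km:.0f} km")             # ~4809 kilometers
print(f"{km / 1.609344:.0f} mi")  # ~2988 miles
```

The small gap between the two results (about 7 miles) comes entirely from the spherical versus ellipsoidal Earth model.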
How long does it take to fly from Hebron to São Jorge Island?
The estimated flight time from Cincinnati/Northern Kentucky International Airport to São Jorge Airport is 6 hours and 10 minutes.
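One plausible way to arrive at a figure like this is to divide the great-circle distance by an assumed cruise speed and add a fixed allowance for taxi, climb, and descent. The sketch below uses a 460-knot cruise speed and a 30-minute allowance; both numbers are illustrative assumptions, not the site's published method.

```python
# Rough flight-time estimate: distance / cruise speed + fixed overhead.
def estimate_flight_time(distance_nm, cruise_knots=460, overhead_min=30):
    """Return (hours, minutes) for a simple distance/speed estimate."""
    total_min = distance_nm / cruise_knots * 60 + overhead_min
    return int(total_min // 60), int(round(total_min % 60))

h, m = estimate_flight_time(2603)     # nautical miles from the figures above
print(f"about {h} h {m} min")         # roughly 6 h 10 min
```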
What is the time difference between Hebron and São Jorge Island?
São Jorge Island is 4 hours ahead of Hebron: Hebron is on Eastern Time (UTC−5, or UTC−4 in summer), while São Jorge Island is on Azores Time (UTC−1, or UTC+0 in summer).
Flight carbon footprint between Cincinnati/Northern Kentucky International Airport (CVG) and São Jorge Airport (SJZ)
On average, flying from Hebron to São Jorge Island generates about 334 kg of CO2 per passenger (334 kilograms is equal to 736 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
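A back-of-envelope version of such an estimate multiplies the flight distance by a per-passenger fuel-burn rate and by the standard combustion factor of 3.16 kg of CO2 per kg of jet fuel. In the sketch below, the fuel-burn rate of roughly 0.022 kg per passenger-kilometer is an illustrative assumption, not the site's published methodology.

```python
# Back-of-envelope per-passenger CO2 estimate for a flight.
CO2_PER_KG_FUEL = 3.16    # kg of CO2 released per kg of jet fuel burned
FUEL_PER_PAX_KM = 0.022   # assumed kg of fuel per passenger per km (illustrative)

def co2_per_passenger(distance_km):
    """Approximate kg of CO2 per passenger for a flight of distance_km."""
    return distance_km * FUEL_PER_PAX_KM * CO2_PER_KG_FUEL

kg = co2_per_passenger(4821)
print(f"~{kg:.0f} kg CO2 (~{kg * 2.20462:.0f} lbs)")  # roughly 335 kg, close to the ~334 kg above
```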
Map of flight path from Hebron to São Jorge Island
See the map of the shortest flight path between Cincinnati/Northern Kentucky International Airport (CVG) and São Jorge Airport (SJZ).
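To draw such a route yourself, you can sample intermediate points along the geodesic between the two airports. The sketch below reuses the geographiclib package from the earlier example; the number of waypoints is arbitrary.

```python
# Sample evenly spaced (lat, lon) waypoints along the shortest flight path.
from geographiclib.geodesic import Geodesic

def geodesic_waypoints(lat1, lon1, lat2, lon2, n=20):
    """Return n+1 (lat, lon) points evenly spaced along the geodesic."""
    line = Geodesic.WGS84.InverseLine(lat1, lon1, lat2, lon2)
    step = line.s13 / n  # total geodesic length split into n segments
    return [
        (p["lat2"], p["lon2"])
        for p in (line.Position(i * step) for i in range(n + 1))
    ]

for lat, lon in geodesic_waypoints(39.048611, -84.667778, 38.665278, -28.175556, n=4):
    print(f"{lat:8.4f}, {lon:9.4f}")
```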
Airport information
| Origin | Cincinnati/Northern Kentucky International Airport |
| --- | --- |
| City | Hebron, KY |
| Country | United States |
| IATA Code | CVG |
| ICAO Code | KCVG |
| Coordinates | 39°2′55″N, 84°40′4″W |
| Destination | São Jorge Airport |
| --- | --- |
| City | São Jorge Island |
| Country | Portugal |
| IATA Code | SJZ |
| ICAO Code | LPSJ |
| Coordinates | 38°39′55″N, 28°10′32″W |