How far is San Juan from Johannesburg?
The distance between Johannesburg (Lanseria International Airport) and San Juan (Fernando Luis Ribas Dominicci Airport) is 7009 miles / 11280 kilometers / 6091 nautical miles.
Lanseria International Airport – Fernando Luis Ribas Dominicci Airport
Distance from Johannesburg to San Juan
There are several ways to calculate the distance from Johannesburg to San Juan. Here are two standard methods:
Vincenty's formula (applied above)
- 7009.351 miles
- 11280.456 kilometers
- 6090.959 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
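For reference, the Vincenty inverse method can be sketched in Python as below. The WGS-84 ellipsoid constants are standard; the convergence tolerance and iteration cap are implementation choices, not values stated on this page.

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; distance in km."""
    a = 6378137.0                  # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563          # WGS-84 flattening
    b = (1 - f) * a                # semi-minor axis (m)

    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    sinU1, cosU1 = sin(U1), cos(U1)
    sinU2, cosU2 = sin(U2), cos(U2)

    lam = L
    for _ in range(200):           # iterate until the auxiliary longitude converges
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# HLA (25°56′18″S, 27°55′33″E) to SIG (18°27′24″N, 66°5′53″W)
print(vincenty_km(-25.938333, 27.925833, 18.456667, -66.098056))
```

With these coordinates the result lands near the 11280.456 km quoted above; small differences come from rounding the airport coordinates.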
Haversine formula
- 7008.832 miles
- 11279.621 kilometers
- 6090.508 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance – the shortest path between two points along the surface.
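The haversine calculation is compact enough to show directly. This sketch assumes a mean earth radius of 6371.0 km; the decimal coordinates are the HLA and SIG values from the tables below, converted from degrees-minutes-seconds.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance between two (lat, lon) points, in km."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    # haversine of the central angle between the two points
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * r_km * asin(sqrt(a))

# HLA (25°56′18″S, 27°55′33″E) to SIG (18°27′24″N, 66°5′53″W)
d_km = haversine_km(-25.938333, 27.925833, 18.456667, -66.098056)
print(round(d_km))  # roughly 11280 km
```

The spherical model differs from the ellipsoidal Vincenty result by under a kilometer on this route, which is why the two figures above are so close.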
How long does it take to fly from Johannesburg to San Juan?
The estimated flight time from Lanseria International Airport to Fernando Luis Ribas Dominicci Airport is 13 hours and 46 minutes.
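An estimate like this is typically distance divided by an average speed, plus a fixed allowance for taxi, climb, and descent. The 530 mph average speed and 30-minute overhead below are assumptions for illustration, not the page's actual model.

```python
# Rough flight-time estimate from great-circle distance.
# Cruise speed and overhead are assumed values, not the site's model.
def flight_time_hours(distance_miles, cruise_mph=530.0, overhead_h=0.5):
    return distance_miles / cruise_mph + overhead_h

t = flight_time_hours(7009.351)
h, m = int(t), round((t - int(t)) * 60)
print(f"{h} h {m} min")  # close to the quoted 13 h 46 min
```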
What is the time difference between Johannesburg and San Juan?
The time difference between Johannesburg and San Juan is 6 hours: Johannesburg (SAST, UTC+2) is 6 hours ahead of San Juan (AST, UTC−4). Neither city observes daylight saving time.
Flight carbon footprint between Lanseria International Airport (HLA) and Fernando Luis Ribas Dominicci Airport (SIG)
On average, flying from Johannesburg to San Juan generates about 856 kg of CO2 per passenger; 856 kilograms equals about 1,887 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
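The kilogram-to-pound conversion can be checked in one line, using the standard factor 1 kg ≈ 2.20462 lb:

```python
# Convert the per-passenger CO2 estimate from kilograms to pounds.
KG_TO_LB = 2.20462
co2_kg = 856
co2_lb = co2_kg * KG_TO_LB
print(round(co2_lb))  # about 1887 lb
```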
Map of flight path from Johannesburg to San Juan
See the map of the shortest flight path between Lanseria International Airport (HLA) and Fernando Luis Ribas Dominicci Airport (SIG).
Airport information
| Origin | Lanseria International Airport |
| --- | --- |
| City: | Johannesburg |
| Country: | South Africa |
| IATA Code: | HLA |
| ICAO Code: | FALA |
| Coordinates: | 25°56′18″S, 27°55′33″E |
| Destination | Fernando Luis Ribas Dominicci Airport |
| --- | --- |
| City: | San Juan |
| Country: | Puerto Rico |
| IATA Code: | SIG |
| ICAO Code: | TJIG |
| Coordinates: | 18°27′24″N, 66°5′53″W |
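The distance formulas above take decimal degrees, so the DMS coordinates in these tables need converting first. A small parser, assuming the °/′/″ notation used on this page (south and west are negative):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert e.g. '25°56′18″S' to signed decimal degrees."""
    deg, minute, sec, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    return -value if hemi in "SW" else value

print(dms_to_decimal("25°56′18″S"))  # about -25.9383
print(dms_to_decimal("66°5′53″W"))   # about -66.0981
```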