
How far is Spring Point from São José do Rio Preto?

The distance between São José do Rio Preto (São José do Rio Preto Airport) and Spring Point (Spring Point Airport) is 3404 miles / 5479 kilometers / 2958 nautical miles.

São José do Rio Preto Airport – Spring Point Airport

3404 miles
5479 kilometers
2958 nautical miles


Distance from São José do Rio Preto to Spring Point

There are several ways to calculate the distance from São José do Rio Preto to Spring Point. Here are two standard methods:

Vincenty's formula (applied above)
  • 3404.481 miles
  • 5478.981 kilometers
  • 2958.413 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
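As a rough sketch, Vincenty's inverse formula on the WGS-84 ellipsoid can be implemented in plain Python. The coordinates are the two airports' positions from the table below; the iteration limit and convergence threshold are arbitrary choices, not part of the published formula:

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Distance in meters between two points on the WGS-84 ellipsoid,
    via Vincenty's iterative inverse formula."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero on equatorial lines where cos2_alpha == 0
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# SJP (20°48′59″S, 49°24′23″W) to AXP (22°26′30″N, 73°58′15″W)
meters = vincenty_inverse(-20.816389, -49.406389, 22.441667, -73.970833)
print(round(meters / 1000, 3))  # about 5479 km
```

The result agrees with the 5478.981 km figure above to within a fraction of a kilometer.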

Haversine formula
  • 3416.834 miles
  • 5498.862 kilometers
  • 2969.148 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
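A minimal haversine sketch in Python, assuming the commonly used mean earth radius of 6371 km (the site's exact radius is not stated, but this value reproduces the figures above):

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km between two points on a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# SJP (20°48′59″S, 49°24′23″W) to AXP (22°26′30″N, 73°58′15″W)
km = haversine(-20.816389, -49.406389, 22.441667, -73.970833)
print(round(km, 3))  # about 5499 km
```

Because it assumes a perfect sphere, the haversine result differs from the ellipsoidal Vincenty distance by roughly 20 km on this route.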

How long does it take to fly from São José do Rio Preto to Spring Point?

The estimated flight time from São José do Rio Preto Airport to Spring Point Airport is 6 hours and 56 minutes.

Flight carbon footprint between São José do Rio Preto Airport (SJP) and Spring Point Airport (AXP)

On average, flying from São José do Rio Preto to Spring Point generates about 383 kg of CO2 per passenger, which is equivalent to 844 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
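The kilogram-to-pound conversion above can be checked with the exact definition of the pound (0.45359237 kg):

```python
KG_PER_LB = 0.45359237   # exact definition of the avoirdupois pound
kg = 383
lbs = kg / KG_PER_LB
print(round(lbs))        # 844
```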

Map of flight path from São José do Rio Preto to Spring Point

See the map of the shortest flight path between São José do Rio Preto Airport (SJP) and Spring Point Airport (AXP).

Airport information

Origin São José do Rio Preto Airport
City: São José do Rio Preto
Country: Brazil
IATA Code: SJP
ICAO Code: SBSR
Coordinates: 20°48′59″S, 49°24′23″W
Destination Spring Point Airport
City: Spring Point
Country: Bahamas
IATA Code: AXP
ICAO Code: MYAP
Coordinates: 22°26′30″N, 73°58′15″W
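The coordinates in the table are given in degrees/minutes/seconds. A small helper can convert them to the signed decimal degrees used by the distance formulas above:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees, minutes, seconds plus a hemisphere letter
    (N/S/E/W) to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# SJP latitude: 20°48′59″S
print(round(dms_to_decimal(20, 48, 59, "S"), 6))  # -20.816389
```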