
How far is Rosh Pina from Daytona Beach, FL?

The distance between Daytona Beach (Daytona Beach International Airport) and Rosh Pina (Rosh Pina Airport) is 6480 miles / 10428 kilometers / 5631 nautical miles.

Distance from Daytona Beach to Rosh Pina

There are several ways to calculate the distance from Daytona Beach to Rosh Pina. Here are two standard methods:

Vincenty's formula (applied above)
  • 6479.796 miles
  • 10428.220 kilometers
  • 5630.789 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
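
The page does not show how this figure is computed. The sketch below is a minimal Python implementation of the standard iterative Vincenty inverse solution, assuming the WGS-84 ellipsoid (the site does not state which ellipsoid it uses). With the DAB and RPN coordinates listed under "Airport information", it lands close to the ~10,428 km figure above.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse solution on the WGS-84 ellipsoid; returns kilometres."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                      # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:       # converged
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# DAB (29°10′47″N, 81°3′29″W) and RPN (32°58′51″N, 35°34′18″E) in decimal degrees
print(f"{vincenty_km(29.179722, -81.058056, 32.980833, 35.571667):.0f} km")
```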

Haversine formula
  • 6467.477 miles
  • 10408.395 kilometers
  • 5620.084 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
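
The spherical-earth version takes only a few lines. Here is a sketch assuming a mean Earth radius of 6,371 km (the radius the site uses is not stated); with the DAB and RPN coordinates it comes out near the ~10,408 km haversine figure above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Same DAB and RPN coordinates in decimal degrees
print(f"{haversine_km(29.179722, -81.058056, 32.980833, 35.571667):.0f} km")  # ~10,408 km
```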

How long does it take to fly from Daytona Beach to Rosh Pina?

The estimated flight time from Daytona Beach International Airport to Rosh Pina Airport is 12 hours and 46 minutes.
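
The page does not state how this estimate is derived. An estimate of this kind can be sketched by dividing the distance by an assumed average block speed; the speeds below are illustrative assumptions, not the site's actual parameters.

```python
def flight_time(distance_miles: float, avg_speed_mph: float) -> str:
    """Convert a distance and an assumed average speed into an h:mm estimate."""
    hours = distance_miles / avg_speed_mph
    h, m = int(hours), round((hours - int(hours)) * 60)
    return f"{h} h {m:02d} min"

print(flight_time(6480, 500))  # 12 h 58 min with a 500 mph assumption
print(flight_time(6480, 508))  # 12 h 45 min, close to the 12 h 46 min quoted above
```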

Flight carbon footprint between Daytona Beach International Airport (DAB) and Rosh Pina Airport (RPN)

On average, flying from Daytona Beach to Rosh Pina generates about 782 kg of CO2 per passenger, which is roughly 1,725 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
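
As a quick unit check (the conversion factor is the standard one; the page's pound figure presumably reflects an unrounded kilogram value):

```python
KG_TO_LB = 2.20462  # pounds per kilogram
print(f"{782 * KG_TO_LB:,.0f} lb")  # 1,724 lb; the page rounds to 1,725
```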

Map of flight path from Daytona Beach to Rosh Pina

See the map of the shortest flight path between Daytona Beach International Airport (DAB) and Rosh Pina Airport (RPN).

Airport information

Origin: Daytona Beach International Airport
City: Daytona Beach, FL
Country: United States
IATA Code: DAB
ICAO Code: KDAB
Coordinates: 29°10′47″N, 81°3′29″W
Destination: Rosh Pina Airport
City: Rosh Pina
Country: Israel
IATA Code: RPN
ICAO Code: LLIB
Coordinates: 32°58′51″N, 35°34′18″E