
How far is Hubli from Atlantic City, NJ?

The distance between Atlantic City (Atlantic City International Airport) and Hubli (Hubli Airport) is 8185 miles / 13172 kilometers / 7112 nautical miles.

Atlantic City International Airport – Hubli Airport

Distance: 8185 miles / 13172 kilometers / 7112 nautical miles
Flight time: 15 h 59 min
Time difference: 10 h 30 min
CO2 emission: 1 026 kg


Distance from Atlantic City to Hubli

There are several ways to calculate the distance from Atlantic City to Hubli. Here are two standard methods:

Vincenty's formula (applied above)
  • 8184.911 miles
  • 13172.337 kilometers
  • 7112.493 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
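As a sketch of the method, the inverse Vincenty algorithm on the WGS-84 ellipsoid can be implemented as below. This is the standard published iteration, not necessarily the exact code used for the figures above; the coordinates are the airport positions from this page converted to decimal degrees.

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in statute miles."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(200):                  # iterate lambda until it converges
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344              # metres to statute miles

# ACY (39°27′27″N, 74°34′37″W) to HBX (15°21′42″N, 75°5′5″E)
print(round(vincenty_miles(39.4575, -74.576944, 15.361667, 75.084722), 3))
```

The iteration normally converges in a handful of steps; the 200-step cap only guards against the rare antipodal cases where Vincenty's method fails to converge.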

Haversine formula
  • 8174.277 miles
  • 13155.223 kilometers
  • 7103.252 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
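The haversine calculation is short enough to show in full. This sketch assumes a mean earth radius of 3958.8 miles; the exact radius chosen shifts the result slightly, which is why published haversine figures can differ by a few miles.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in statute miles, assuming a spherical earth."""
    R = 3958.8  # assumed mean earth radius in miles
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * R * asin(sqrt(a))

# Airport coordinates in decimal degrees (west longitude negative)
acy = (39.4575, -74.576944)   # Atlantic City International (ACY)
hbx = (15.361667, 75.084722)  # Hubli (HBX)
print(round(haversine_miles(*acy, *hbx), 3))
```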

How long does it take to fly from Atlantic City to Hubli?

The estimated flight time from Atlantic City International Airport to Hubli Airport is 15 hours and 59 minutes.
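Estimates like this typically combine a fixed overhead for taxi, climb, and descent with cruise time at an assumed average speed. The 500 mph and 30 minute values below are illustrative assumptions, not the site's exact model, so the result differs somewhat from the 15 h 59 min shown above.

```python
def estimate_flight_time(distance_miles, avg_speed_mph=500, overhead_min=30):
    """Rough block-time estimate: fixed overhead plus cruise at an assumed
    average ground speed. Both parameters are illustrative assumptions."""
    total_min = overhead_min + distance_miles / avg_speed_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(8185))  # → 16 h 52 min
```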

Flight carbon footprint between Atlantic City International Airport (ACY) and Hubli Airport (HBX)

On average, flying from Atlantic City to Hubli generates about 1 026 kg (2 261 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
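Per-passenger estimates of this kind are usually distance times an average emission factor. The factor below (about 0.078 kg CO2 per passenger-kilometre) is simply what the page's 1 026 kg figure implies when divided by the 13 172 km distance; it is an assumption, not a published constant.

```python
def co2_per_passenger_kg(distance_km, factor_kg_per_pax_km=0.078):
    """CO2 from jet-fuel burn only. The default factor is an assumption
    implied by dividing 1 026 kg by 13 172 km for this route."""
    return distance_km * factor_kg_per_pax_km

kg = co2_per_passenger_kg(13172)
lbs = kg * 2.20462  # kilograms to pounds
print(round(kg), "kg ≈", round(lbs), "lbs")
```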

Map of flight path from Atlantic City to Hubli

See the map of the shortest flight path between Atlantic City International Airport (ACY) and Hubli Airport (HBX).

Airport information

Origin Atlantic City International Airport
City: Atlantic City, NJ
Country: United States
IATA Code: ACY
ICAO Code: KACY
Coordinates: 39°27′27″N, 74°34′37″W
Destination Hubli Airport
City: Hubli
Country: India
IATA Code: HBX
ICAO Code: VAHB
Coordinates: 15°21′42″N, 75°5′5″E
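The coordinates above are given in degrees, minutes, and seconds; distance formulas need them as signed decimal degrees. A minimal conversion:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees: south and west are negative."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# ACY: 39°27′27″N, 74°34′37″W → 39.4575, -74.576944…
print(dms_to_decimal(39, 27, 27, "N"), dms_to_decimal(74, 34, 37, "W"))
```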