
How far is Batna from Skopje?

The distance between Skopje (Skopje International Airport) and Batna (Mostépha Ben Boulaid Airport) is 928 miles / 1494 kilometers / 807 nautical miles.

Distance from Skopje to Batna

There are several ways to calculate the distance from Skopje to Batna. Here are two standard methods:

Vincenty's formula (applied above)
  • 928.445 miles
  • 1494.187 kilometers
  • 806.796 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
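
For reference, the ellipsoidal figure can be reproduced with the geopy library, whose geodesic distance is computed on the WGS-84 ellipsoid (geopy uses Karney's algorithm rather than Vincenty's, but the two agree to well under a metre over a route of this length). This is a minimal sketch using the airport coordinates listed further down the page:

  # Ellipsoidal (WGS-84) distance between SKP and BLJ using geopy.
  # geopy's geodesic() implements Karney's algorithm, which matches
  # Vincenty's formula to sub-metre precision on a route like this one.
  from geopy.distance import geodesic

  skp = (41.961389, 21.621389)  # Skopje International Airport, 41°57′41″N 21°37′17″E
  blj = (35.751944, 6.308333)   # Mostépha Ben Boulaid Airport, 35°45′7″N 6°18′30″E

  d = geodesic(skp, blj)
  print(f"{d.miles:.3f} miles, {d.km:.3f} km, {d.nautical:.3f} NM")
  # Output is close to the Vincenty figures above: ~928 mi / ~1494 km / ~807 NM.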

Haversine formula
  • 926.992 miles
  • 1491.850 kilometers
  • 805.534 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
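
A minimal Python implementation of the haversine formula, assuming a mean Earth radius of 6371 km (the page does not state which radius it uses), reproduces the spherical figures above:

  # Great-circle (haversine) distance on a sphere of mean radius 6371 km.
  from math import radians, sin, cos, asin, sqrt

  def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
      """Shortest surface distance between two points, assuming a spherical Earth."""
      phi1, phi2 = radians(lat1), radians(lat2)
      dphi = radians(lat2 - lat1)
      dlam = radians(lon2 - lon1)
      a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
      return 2 * radius_km * asin(sqrt(a))

  km = haversine_km(41.961389, 21.621389, 35.751944, 6.308333)  # SKP -> BLJ
  print(f"{km:.1f} km = {km * 0.621371:.1f} mi = {km / 1.852:.1f} NM")
  # Prints roughly 1491.9 km / 927.0 mi / 805.6 NM, in line with the figures above.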

How long does it take to fly from Skopje to Batna?

The estimated flight time from Skopje International Airport to Mostépha Ben Boulaid Airport is 2 hours and 15 minutes.
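
The page does not say how this estimate is derived. A common rule of thumb, shown here purely as an illustrative assumption and not as the site's actual method, is to divide the distance by a typical cruise speed and add a fixed allowance for taxi, climb, and descent:

  # Rough flight-time estimate: distance / cruise speed + fixed overhead.
  # The 500 mph cruise speed and 30-minute overhead are illustrative
  # assumptions, not parameters published by the site.
  distance_miles = 928.445
  cruise_mph = 500
  overhead_min = 30

  total_min = distance_miles / cruise_mph * 60 + overhead_min
  print(f"~{int(total_min // 60)} h {int(total_min % 60)} min")
  # Gives ~2 h 21 min with these assumptions, close to the 2 h 15 min quoted above.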

What is the time difference between Skopje and Batna?

There is no time difference between Skopje and Batna during standard time: both are at UTC+1. Skopje observes daylight saving time while Algeria does not, so Skopje runs one hour ahead of Batna in summer.
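
This can be checked with Python's zoneinfo module by comparing the UTC offsets of the relevant IANA zones (Europe/Skopje for Skopje, Africa/Algiers for Batna):

  # Compare the UTC offsets of Skopje and Batna on a given date.
  from datetime import datetime
  from zoneinfo import ZoneInfo

  def offsets(when):
      skopje = when.replace(tzinfo=ZoneInfo("Europe/Skopje")).utcoffset()
      batna = when.replace(tzinfo=ZoneInfo("Africa/Algiers")).utcoffset()
      return skopje, batna

  print(offsets(datetime(2024, 1, 15, 12, 0)))  # winter: both UTC+1
  print(offsets(datetime(2024, 7, 15, 12, 0)))  # summer: Skopje UTC+2, Batna UTC+1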

Flight carbon footprint between Skopje International Airport (SKP) and Mostépha Ben Boulaid Airport (BLJ)

On average, flying from Skopje to Batna generates about 146 kg of CO2 per passenger, which is roughly 321 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
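
The methodology behind the figure is not published on the page. Purely as an illustrative sketch, the number is consistent with applying a flat emission factor of roughly 0.098 kg of CO2 per passenger-kilometre, a value back-calculated here from the figures above rather than taken from the site:

  # Back-of-the-envelope CO2 estimate. The emission factor is back-calculated
  # from the page's own numbers (146 kg over ~1494 km) and is NOT the site's
  # published methodology.
  distance_km = 1494
  factor_kg_per_km = 0.098

  co2_kg = distance_km * factor_kg_per_km
  print(f"~{co2_kg:.0f} kg CO2 per passenger")  # ~146 kg, matching the estimate above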

Map of flight path from Skopje to Batna

See the map of the shortest flight path between Skopje International Airport (SKP) and Mostépha Ben Boulaid Airport (BLJ).
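
The shortest path drawn on such a map is the great circle between the two airports. A minimal sketch, assuming a spherical Earth, that generates intermediate waypoints along this path by spherical interpolation:

  # Intermediate waypoints along the great-circle path SKP -> BLJ,
  # computed by spherical linear interpolation on a unit sphere.
  from math import radians, degrees, sin, cos, acos, atan2, sqrt

  def great_circle_points(lat1, lon1, lat2, lon2, n=10):
      """Return n + 1 (lat, lon) waypoints from point 1 to point 2."""
      phi1, lam1, phi2, lam2 = map(radians, (lat1, lon1, lat2, lon2))
      # Angular distance between the endpoints.
      d = acos(sin(phi1) * sin(phi2) + cos(phi1) * cos(phi2) * cos(lam2 - lam1))
      points = []
      for i in range(n + 1):
          f = i / n
          a = sin((1 - f) * d) / sin(d)
          b = sin(f * d) / sin(d)
          x = a * cos(phi1) * cos(lam1) + b * cos(phi2) * cos(lam2)
          y = a * cos(phi1) * sin(lam1) + b * cos(phi2) * sin(lam2)
          z = a * sin(phi1) + b * sin(phi2)
          points.append((degrees(atan2(z, sqrt(x * x + y * y))), degrees(atan2(y, x))))
      return points

  for lat, lon in great_circle_points(41.961389, 21.621389, 35.751944, 6.308333, n=4):
      print(f"{lat:.4f}, {lon:.4f}")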

Airport information

Origin: Skopje International Airport
City: Skopje
Country: Macedonia
IATA Code: SKP
ICAO Code: LWSK
Coordinates: 41°57′41″N, 21°37′17″E
Destination: Mostépha Ben Boulaid Airport
City: Batna
Country: Algeria
IATA Code: BLJ
ICAO Code: DABT
Coordinates: 35°45′7″N, 6°18′30″E
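
The coordinates above are given in degrees, minutes, and seconds; the decimal values used in the earlier sketches follow from the usual conversion (degrees + minutes/60 + seconds/3600, negated for southern or western hemispheres):

  # Convert degrees-minutes-seconds coordinates to decimal degrees.
  def dms_to_decimal(degrees, minutes, seconds, hemisphere):
      sign = -1 if hemisphere in ("S", "W") else 1
      return sign * (degrees + minutes / 60 + seconds / 3600)

  # Skopje International Airport: 41°57′41″N, 21°37′17″E
  print(dms_to_decimal(41, 57, 41, "N"), dms_to_decimal(21, 37, 17, "E"))  # ≈ 41.9614, 21.6214
  # Mostépha Ben Boulaid Airport: 35°45′7″N, 6°18′30″E
  print(dms_to_decimal(35, 45, 7, "N"), dms_to_decimal(6, 18, 30, "E"))    # ≈ 35.7519, 6.3083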