
How far is Jijiga from Luanda?

The distance between Luanda (Luanda Quatro de Fevereiro Airport) and Jijiga (Jijiga Airport) is 2396 miles / 3856 kilometers / 2082 nautical miles.

The driving distance from Luanda (LAD) to Jijiga (JIJ) is 3667 miles / 5902 kilometers, and travel time by car is about 91 hours 18 minutes.

Luanda Quatro de Fevereiro Airport – Jijiga Airport

2396 Miles
3856 Kilometers
2082 Nautical miles


Distance from Luanda to Jijiga

There are several ways to calculate the distance from Luanda to Jijiga. Here are two standard methods:

Vincenty's formula (applied above)
  • 2396.234 miles
  • 3856.365 kilometers
  • 2082.271 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
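The iteration below is a minimal sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed further down in decimal degrees. The exact ellipsoid parameters and convergence tolerance used by this calculator are assumptions, so the result may differ from the figure above by a small fraction of a kilometer.

```python
from math import atan, atan2, radians, sin, cos, tan, sqrt

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)
    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1 = sin(U1), cos(U1)
    sinU2, cosU2 = sin(U2), cos(U2)
    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sig = sqrt((cosU2 * sin_lam) ** 2 +
                       (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sig == 0:
            return 0.0             # coincident points
        cos_sig = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sig = atan2(sin_sig, cos_sig)
        sin_alp = cosU1 * cosU2 * sin_lam / sin_sig
        cos2_alp = 1 - sin_alp ** 2
        cos_2sm = cos_sig - 2 * sinU1 * sinU2 / cos2_alp if cos2_alp else 0.0
        C = f / 16 * cos2_alp * (4 + f * (4 - 3 * cos2_alp))
        lam_new = L + (1 - C) * f * sin_alp * (
            sig + C * sin_sig * (cos_2sm + C * cos_sig * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam_new - lam) < tol:
            lam = lam_new
            break
        lam = lam_new
    u2 = cos2_alp * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sig = B * sin_sig * (cos_2sm + B / 4 * (
        cos_sig * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sig ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sig - d_sig) / 1000.0

# Luanda (LAD) and Jijiga (JIJ) in decimal degrees
print(round(vincenty_km(-8.8583, 13.2311, 9.3325, 42.9119), 1))
```

Unlike the haversine formula, Vincenty's method iterates until the longitude difference on the auxiliary sphere converges, which is why it needs a tolerance and an iteration cap.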

Haversine formula
  • 2397.852 miles
  • 3858.969 kilometers
  • 2083.677 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
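The haversine calculation fits in a few lines. This is a minimal sketch using a mean earth radius of 3958.8 miles (the exact radius this calculator uses is an assumption), fed with the airport coordinates listed below; it should land within a couple of miles of the figure above.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles, assuming a spherical earth."""
    R = 3958.8  # mean earth radius in miles (assumed)
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * R * asin(sqrt(a))

# Luanda (LAD) and Jijiga (JIJ) in decimal degrees
print(round(haversine_miles(-8.8583, 13.2311, 9.3325, 42.9119), 1))
```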

How long does it take to fly from Luanda to Jijiga?

The estimated flight time from Luanda Quatro de Fevereiro Airport to Jijiga Airport is 5 hours and 2 minutes.
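The page does not state how the flight time is derived. A common rule of thumb (assumed here, not the calculator's actual formula) is to divide the distance by a typical airliner cruise speed of about 500 mph and add roughly 30 minutes for taxi, takeoff, and landing; with those assumed parameters the estimate comes out slightly longer than the 5 hours 2 minutes quoted above.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough airliner flight time: cruise leg plus a fixed takeoff/landing overhead."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    h, m = divmod(round(total_min), 60)
    return f"{h} h {m:02d} min"

print(estimate_flight_time(2396))
```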

Flight carbon footprint between Luanda Quatro de Fevereiro Airport (LAD) and Jijiga Airport (JIJ)

On average, flying from Luanda to Jijiga generates about 263 kg of CO2 per passenger; 263 kilograms is roughly 580 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
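The kilogram-to-pound conversion above can be checked directly from the definition of the avoirdupois pound:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound in kilograms

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

print(round(kg_to_lb(263)))  # 580
```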

Map of flight path and driving directions from Luanda to Jijiga

See the map of the shortest flight path between Luanda Quatro de Fevereiro Airport (LAD) and Jijiga Airport (JIJ).

Airport information

Origin Luanda Quatro de Fevereiro Airport
City: Luanda
Country: Angola
IATA Code: LAD
ICAO Code: FNLU
Coordinates: 8°51′30″S, 13°13′52″E
Destination Jijiga Airport
City: Jijiga
Country: Ethiopia
IATA Code: JIJ
ICAO Code: HAJJ
Coordinates: 9°19′57″N, 42°54′43″E
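The coordinates above are given in degrees, minutes, and seconds, while the distance formulas want decimal degrees. The conversion is a short helper; the function name is just for illustration.

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus an N/S/E/W hemisphere to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Luanda (8°51′30″S, 13°13′52″E) and Jijiga (9°19′57″N, 42°54′43″E)
print(round(dms_to_decimal(8, 51, 30, "S"), 4))   # -8.8583
print(round(dms_to_decimal(42, 54, 43, "E"), 4))  # 42.9119
```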