
How far is Niigata from Larnaca?

The distance between Larnaca (Larnaca International Airport) and Niigata (Niigata Airport) is 5519 miles / 8882 kilometers / 4796 nautical miles.

The driving distance from Larnaca (LCA) to Niigata (KIJ) is 7502 miles / 12074 kilometers, and travel time by car is about 169 hours 27 minutes.
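As a quick sanity check, the driving figures above imply an average speed; the sketch below just divides the quoted distance by the quoted time (both numbers are the site's, not recomputed):

```python
# Implied average speed for the quoted driving route.
driving_miles = 7502
driving_hours = 169 + 27 / 60  # 169 hours 27 minutes

avg_mph = driving_miles / driving_hours
print(f"about {avg_mph:.1f} mph")  # roughly 44 mph
```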

Larnaca International Airport – Niigata Airport

5519 miles / 8882 kilometers / 4796 nautical miles


Distance from Larnaca to Niigata

There are several ways to calculate the distance from Larnaca to Niigata. Here are two standard methods:

Vincenty's formula (applied above)
  • 5518.855 miles
  • 8881.737 kilometers
  • 4795.754 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
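A self-contained sketch of Vincenty's inverse formula in its standard iterative form is shown below. It assumes the WGS-84 ellipsoid parameters; whether the site uses exactly these constants is an assumption.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Geodesic distance on the WGS-84 ellipsoid (Vincenty inverse), in km."""
    a = 6378137.0                # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563        # WGS-84 flattening
    b = (1 - f) * a              # semi-minor axis (m)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:   # converged
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# LCA and KIJ coordinates, converted from the DMS values listed below.
print(vincenty_km(34.875, 33.624722, 37.955833, 139.120833))
```

Note this iteration can fail to converge for nearly antipodal points; production geodesy libraries handle that case separately.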

Haversine formula
  • 5506.534 miles
  • 8861.907 kilometers
  • 4785.047 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
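The haversine method can be sketched in a few lines. The 6371 km Earth radius below is the common mean-radius convention; the exact radius the site uses is an assumption.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# LCA and KIJ coordinates, converted from the DMS values listed below.
print(haversine_km(34.875, 33.624722, 37.955833, 139.120833))
```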

How long does it take to fly from Larnaca to Niigata?

The estimated flight time from Larnaca International Airport to Niigata Airport is 10 hours and 56 minutes.
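The site's timing model is not published. A common back-of-the-envelope estimate divides the great-circle distance by an assumed average block speed and adds a fixed allowance for taxi, climb, and descent; both figures below are illustrative assumptions, not the site's parameters.

```python
distance_miles = 5519   # great-circle distance from the text above
avg_speed_mph = 530     # assumed average block speed (hypothetical)
overhead_hours = 0.5    # assumed taxi/climb/descent allowance (hypothetical)

hours = distance_miles / avg_speed_mph + overhead_hours
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"about {h} h {m} min")  # close to the quoted 10 h 56 min
```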

Flight carbon footprint between Larnaca International Airport (LCA) and Niigata Airport (KIJ)

On average, flying from Larnaca to Niigata generates about 652 kg (1,438 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
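The kilogram-to-pound conversion is plain arithmetic (1 kg = 2.20462 lb):

```python
co2_kg = 652
co2_lbs = co2_kg * 2.20462
# About 1437.4 lb; the page quotes 1,438, presumably from a slightly
# different rounding or conversion factor.
print(round(co2_lbs, 1))
```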

Map of flight path and driving directions from Larnaca to Niigata

See the map of the shortest flight path between Larnaca International Airport (LCA) and Niigata Airport (KIJ).

Airport information

Origin Larnaca International Airport
City: Larnaca
Country: Cyprus
IATA Code: LCA
ICAO Code: LCLK
Coordinates: 34°52′30″N, 33°37′29″E
Destination Niigata Airport
City: Niigata
Country: Japan
IATA Code: KIJ
ICAO Code: RJSN
Coordinates: 37°57′21″N, 139°7′15″E
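The DMS coordinates above convert to the decimal degrees used by the distance formulas; a small helper (names are illustrative):

```python
def dms_to_decimal(deg, minutes, seconds, positive=True):
    """Convert degrees/minutes/seconds to signed decimal degrees.

    positive=True means N latitude or E longitude; False means S or W.
    """
    value = deg + minutes / 60 + seconds / 3600
    return value if positive else -value

# Larnaca International Airport: 34°52′30″N, 33°37′29″E
lca = (dms_to_decimal(34, 52, 30), dms_to_decimal(33, 37, 29))
# Niigata Airport: 37°57′21″N, 139°7′15″E
kij = (dms_to_decimal(37, 57, 21), dms_to_decimal(139, 7, 15))
print(lca, kij)
```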