
How far is Omitama from Larnaca?

The distance between Larnaca (Larnaca International Airport) and Omitama (Ibaraki Airport) is 5650 miles / 9092 kilometers / 4909 nautical miles.

The driving distance from Larnaca (LCA) to Omitama (IBR) is 7509 miles / 12084 kilometers, and travel time by car is about 169 hours 48 minutes.

Larnaca International Airport – Ibaraki Airport

5650 miles / 9092 kilometers / 4909 nautical miles


Distance from Larnaca to Omitama

There are several ways to calculate the distance from Larnaca to Omitama. Here are two standard methods:

Vincenty's formula (applied above)
  • 5649.635 miles
  • 9092.206 kilometers
  • 4909.399 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
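As a sketch of how such a figure can be computed, here is a minimal implementation of Vincenty's inverse method on the WGS-84 ellipsoid. The coordinates are taken from the airport table below; the function name and tolerance are illustrative choices, not part of the original calculator.

```python
import math

# WGS-84 ellipsoid parameters
A_AXIS = 6378137.0               # semi-major axis (m)
F = 1 / 298.257223563            # flattening
B_AXIS = (1 - F) * A_AXIS        # semi-minor axis (m)

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in meters between two points (Vincenty inverse)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - F) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos2_alpha == 0 only for equatorial paths; the term then vanishes
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return B_AXIS * A * (sigma - delta_sigma)

# LCA: 34°52′30″N, 33°37′29″E   IBR: 36°10′51″N, 140°24′53″E
lca = (34 + 52/60 + 30/3600, 33 + 37/60 + 29/3600)
ibr = (36 + 10/60 + 51/3600, 140 + 24/60 + 53/3600)
print(round(vincenty_distance(*lca, *ibr) / 1000, 3))  # geodesic distance in km
```

With these coordinates the result lands close to the 9092-kilometer figure quoted above.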

Haversine formula
  • 5637.293 miles
  • 9072.344 kilometers
  • 4898.674 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
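The haversine calculation is short enough to sketch in full. This assumes a mean earth radius of 6371 km (the spherical-model simplification); the function name and coordinates from the airport table below are illustrative.

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean earth radius; the spherical-earth assumption

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# LCA → IBR, coordinates from the airport table below
print(round(haversine_km(34.8750, 33.6247, 36.1808, 140.4147), 1))
```

The spherical model undershoots the ellipsoidal Vincenty result here by roughly 20 km, which is typical for routes of this length.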

How long does it take to fly from Larnaca to Omitama?

The estimated flight time from Larnaca International Airport to Ibaraki Airport is 11 hours and 11 minutes.
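A back-of-envelope version of this estimate divides the great-circle distance by an average block speed. The 505 mph figure below is an assumed speed chosen so the arithmetic reproduces the estimate above, not a value from the source.

```python
DISTANCE_MILES = 5650      # great-circle distance from above
AVG_SPEED_MPH = 505        # assumed average block speed (hypothetical)

total_hours = DISTANCE_MILES / AVG_SPEED_MPH
hours = int(total_hours)
minutes = round((total_hours - hours) * 60)
print(f"{hours} hours {minutes} minutes")  # prints "11 hours 11 minutes"
```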

Flight carbon footprint between Larnaca International Airport (LCA) and Ibaraki Airport (IBR)

On average, flying from Larnaca to Omitama generates about 670 kg of CO2 per passenger, or roughly 1,476 pounds (lb). These figures are estimates and include only the CO2 generated by burning jet fuel.
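The kilogram-to-pound conversion is straightforward; converting the rounded 670 kg directly gives a value a pound or so above the figure quoted, since that figure was presumably derived from an unrounded kilogram estimate.

```python
CO2_KG = 670               # per-passenger estimate from above
KG_PER_LB = 0.45359237     # exact definition of the avoirdupois pound

pounds = CO2_KG / KG_PER_LB
print(round(pounds))       # within a pound or two of 1,476
```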

Map of flight path and driving directions from Larnaca to Omitama

See the map of the shortest flight path between Larnaca International Airport (LCA) and Ibaraki Airport (IBR).

Airport information

Origin: Larnaca International Airport
City: Larnaca
Country: Cyprus
IATA Code: LCA
ICAO Code: LCLK
Coordinates: 34°52′30″N, 33°37′29″E

Destination: Ibaraki Airport
City: Omitama
Country: Japan
IATA Code: IBR
ICAO Code: RJAH
Coordinates: 36°10′51″N, 140°24′53″E