
How far is Kikai from Shanghai?

The distance between Shanghai (Shanghai Pudong International Airport) and Kikai (Kikai Airport) is 525 miles / 846 kilometers / 457 nautical miles.

The driving distance from Shanghai (PVG) to Kikai (KKX) is 2243 miles / 3609 kilometers, and travel time by car is about 117 hours 44 minutes.

Shanghai Pudong International Airport – Kikai Airport

525 miles
846 kilometers
457 nautical miles


Distance from Shanghai to Kikai

There are several ways to calculate the distance from Shanghai to Kikai. Here are two standard methods:

Vincenty's formula (applied above)
  • 525.388 miles
  • 845.530 kilometers
  • 456.550 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
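
The site's own implementation isn't published, but the inverse Vincenty solution is well documented. Below is a minimal Python sketch of the standard iterative recurrence on the WGS-84 ellipsoid; the coordinates are taken from the airport information at the end of this page.

    from math import radians, sin, cos, tan, atan, atan2, sqrt

    def vincenty_miles(lat1, lon1, lat2, lon2):
        """Inverse Vincenty distance on the WGS-84 ellipsoid, in statute miles."""
        a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m) and flattening
        b = (1 - f) * a
        U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
        U2 = atan((1 - f) * tan(radians(lat2)))
        L = radians(lon2 - lon1)
        sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
        lam = L
        for _ in range(200):                  # iterate until lambda converges
            sinlam, coslam = sin(lam), cos(lam)
            sin_sigma = sqrt((cosU2 * sinlam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * coslam) ** 2)
            if sin_sigma == 0:
                return 0.0                    # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * coslam
            sigma = atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sinlam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sigma_m +
                    C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break
        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
            B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos_2sigma_m ** 2)))
        return b * A * (sigma - delta_sigma) / 1609.344  # meters -> miles

    # PVG (31°8'36"N, 121°48'18"E) to KKX (28°19'16"N, 129°55'40"E)
    print(vincenty_miles(31.1433, 121.8050, 28.3211, 129.9278))  # ≈ 525.4 miles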

Haversine formula
  • 524.734 miles
  • 844.477 kilometers
  • 455.981 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
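
The haversine formula is compact enough to show in full. A minimal Python sketch, assuming a mean earth radius of 3,958.8 miles:

    from math import radians, sin, cos, asin, sqrt

    def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
        """Great-circle distance between two points on a spherical earth."""
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlmb = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
        return 2 * radius_miles * asin(sqrt(a))

    # Same PVG and KKX coordinates as above
    print(haversine_miles(31.1433, 121.8050, 28.3211, 129.9278))  # ≈ 524.7 miles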

How long does it take to fly from Shanghai to Kikai?

The estimated flight time from Shanghai Pudong International Airport to Kikai Airport is 1 hour and 29 minutes.
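
The page doesn't state its timing model. One common rule of thumb is cruise time at a typical jet speed plus a fixed allowance for taxi, climb, and descent; the sketch below uses assumed values (500 mph cruise, 30 minutes overhead), not the site's actual method.

    def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
        """Rule-of-thumb flight time: cruise leg plus fixed taxi/climb/descent time."""
        return overhead_min + distance_miles / cruise_mph * 60

    print(estimated_flight_minutes(525.388))  # ≈ 93 min, near the 1 h 29 min above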

Flight carbon footprint between Shanghai Pudong International Airport (PVG) and Kikai Airport (KKX)

On average, flying from Shanghai to Kikai generates about 102 kg of CO2 per passenger, which is roughly 225 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
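
A per-passenger figure like this can be approximated from distance alone. The sketch below assumes a short-haul emission factor of about 0.12 kg of CO2 per passenger-kilometer; both the factor and the approach are illustrative assumptions, not the site's published methodology.

    KG_PER_LB = 0.45359237    # exact kilograms-per-pound conversion factor
    CO2_PER_PAX_KM = 0.12     # assumed short-haul factor, kg CO2 per passenger-km

    distance_km = 845.530     # Vincenty distance from above
    co2_kg = distance_km * CO2_PER_PAX_KM
    print(f"{co2_kg:.0f} kg CO2 is about {co2_kg / KG_PER_LB:.0f} lb")  # ≈ 101 kg ≈ 224 lb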

Map of flight path and driving directions from Shanghai to Kikai

See the map of the shortest flight path between Shanghai Pudong International Airport (PVG) and Kikai Airport (KKX).

Airport information

Origin Shanghai Pudong International Airport
City: Shanghai
Country: China
IATA Code: PVG
ICAO Code: ZSPD
Coordinates: 31°8′36″N, 121°48′18″E
Destination Kikai Airport
City: Kikai
Country: Japan
IATA Code: KKX
ICAO Code: RJKI
Coordinates: 28°19′16″N, 129°55′40″E