
How far is Qinhuangdao from Tottori?

The distance between Tottori (Tottori Airport) and Qinhuangdao (Qinhuangdao Beidaihe Airport) is 875 miles / 1409 kilometers / 761 nautical miles.

The driving distance from Tottori (TTJ) to Qinhuangdao (BPE) is 1379 miles / 2220 kilometers, and travel time by car is about 29 hours 43 minutes.

Tottori Airport – Qinhuangdao Beidaihe Airport

875 Miles / 1409 Kilometers / 761 Nautical miles


Distance from Tottori to Qinhuangdao

There are several ways to calculate the distance from Tottori to Qinhuangdao. Here are two standard methods:

Vincenty's formula (applied above)
  • 875.409 miles
  • 1408.834 kilometers
  • 760.710 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
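As a sketch of how such a result is obtained, here is a standard Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, applied to the coordinates listed for the two airports below (the function name and convergence tolerance are illustrative choices, not this site's actual code):

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563        # flattening
    b = (1 - f) * a              # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# TTJ (35°31'48"N, 134°10'1"E) to BPE (39°39'59"N, 119°3'32"E)
metres = vincenty_distance(35.53, 134.166944, 39.666389, 119.058889)
print(round(metres / 1000, 1))   # approximately 1409 km
```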

Haversine formula
  • 873.734 miles
  • 1406.138 kilometers
  • 759.254 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
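The haversine calculation is compact enough to show in full. The sketch below uses a mean Earth radius of 6371 km (a common convention, and an assumption here) with the airport coordinates listed further down:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return radius_km * 2 * math.asin(math.sqrt(a))

# TTJ (35°31'48"N, 134°10'1"E) to BPE (39°39'59"N, 119°3'32"E)
km = haversine_km(35.53, 134.166944, 39.666389, 119.058889)
print(round(km, 1))   # approximately 1406 km
```

The small gap between this result and the Vincenty figure above reflects the spherical versus ellipsoidal Earth models.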

How long does it take to fly from Tottori to Qinhuangdao?

The estimated flight time from Tottori Airport to Qinhuangdao Beidaihe Airport is 2 hours and 9 minutes.
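A common rule of thumb for such estimates (an assumption here, not necessarily this site's exact model) is cruise speed of roughly 500 mph plus about 30 minutes for climb and descent:

```python
# Rough flight-time estimate: distance / cruise speed + fixed overhead.
# The 500 mph cruise and 30 min overhead are illustrative assumptions.
distance_mi = 875
cruise_mph = 500
overhead_min = 30

total_min = round(distance_mi / cruise_mph * 60) + overhead_min
print(f"{total_min // 60} h {total_min % 60} min")  # prints "2 h 15 min"
```

That crude estimate (2 h 15 min) lands close to the 2 hours 9 minutes quoted above.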

Flight carbon footprint between Tottori Airport (TTJ) and Qinhuangdao Beidaihe Airport (BPE)

On average, flying from Tottori to Qinhuangdao generates about 142 kg of CO2 per passenger, which is roughly 313 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
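The unit conversion and the implied per-mile intensity are simple arithmetic (the per-mile figure is derived here for illustration, not stated by the source):

```python
KG_PER_LB = 0.45359237        # exact by definition

co2_kg = 142                  # per-passenger estimate from the text
co2_lb = co2_kg / KG_PER_LB   # approximately 313 lb

distance_mi = 875.409         # Vincenty distance from above
per_mile = co2_kg / distance_mi
print(round(co2_lb, 1), round(per_mile, 3))  # ~313.1 lb, ~0.162 kg CO2/mi
```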

Map of flight path and driving directions from Tottori to Qinhuangdao

See the map of the shortest flight path between Tottori Airport (TTJ) and Qinhuangdao Beidaihe Airport (BPE).

Airport information

Origin Tottori Airport
City: Tottori
Country: Japan
IATA Code: TTJ
ICAO Code: RJOR
Coordinates: 35°31′48″N, 134°10′1″E
Destination Qinhuangdao Beidaihe Airport
City: Qinhuangdao
Country: China
IATA Code: BPE
ICAO Code: ZBDH
Coordinates: 39°39′59″N, 119°3′32″E
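The coordinates above are given in degrees/minutes/seconds; the distance formulas earlier need decimal degrees. A minimal conversion helper (the function name is an illustrative choice):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds + hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Tottori Airport: 35°31'48"N, 134°10'1"E
print(dms_to_decimal(35, 31, 48, "N"))   # 35.53
# Qinhuangdao Beidaihe Airport: 39°39'59"N, 119°3'32"E
print(round(dms_to_decimal(39, 39, 59, "N"), 6))  # 39.666389
```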