
How far is Bangda from Tokyo?

The distance between Tokyo (Narita International Airport) and Bangda (Qamdo Bamda Airport) is 2514 miles / 4046 kilometers / 2184 nautical miles.

The driving distance from Tokyo (NRT) to Bangda (BPX) is 3670 miles / 5907 kilometers, and travel time by car is about 71 hours 49 minutes.

Narita International Airport – Qamdo Bamda Airport

2514 miles / 4046 kilometers / 2184 nautical miles

Distance from Tokyo to Bangda

There are several ways to calculate the distance from Tokyo to Bangda. Here are two standard methods:

Vincenty's formula (applied above)
  • 2513.817 miles
  • 4045.596 kilometers
  • 2184.447 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
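For reference, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The airport coordinates are those listed in the airport information section, converted to decimal degrees; the semi-major axis and flattening are the standard WGS-84 constants. This is an illustration of the method, not the exact implementation used for the figures above.

    import math

    def vincenty_inverse(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        """Vincenty's inverse formula: geodesic distance in metres on the WGS-84 ellipsoid."""
        a = 6378137.0              # WGS-84 semi-major axis (m)
        f = 1 / 298.257223563      # WGS-84 flattening
        b = (1 - f) * a            # semi-minor axis (m)

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        lam = L

        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                                  (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
            if sin_sigma == 0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sigma_m = ((cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha)
                            if cos2_alpha != 0 else 0.0)
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (
            cos_2sigma_m + B / 4 * (
                cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
                B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
        return b * A * (sigma - delta_sigma)

    # NRT and BPX coordinates from the airport information section (decimal degrees)
    metres = vincenty_inverse(35.764444, 140.385833, 30.553333, 97.108056)
    print(f"{metres / 1609.344:.3f} miles")
    print(f"{metres / 1000:.3f} kilometers")
    print(f"{metres / 1852:.3f} nautical miles")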

Haversine formula
  • 2508.738 miles
  • 4037.423 kilometers
  • 2180.034 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
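Here is a short, self-contained sketch of the haversine calculation, assuming a mean Earth radius of 6371 km and the same decimal-degree coordinates used above.

    import math

    def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance in kilometers, assuming a spherical earth."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2 +
             math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
        return 2 * radius_km * math.atan2(math.sqrt(a), math.sqrt(1 - a))

    km = haversine(35.764444, 140.385833, 30.553333, 97.108056)  # NRT -> BPX
    print(f"{km / 1.609344:.3f} miles, {km:.3f} km, {km / 1.852:.3f} nautical miles")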

How long does it take to fly from Tokyo to Bangda?

The estimated flight time from Narita International Airport to Qamdo Bamda Airport is 5 hours and 15 minutes.
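The page does not state how this estimate is derived. A common rule of thumb, sketched below purely as an assumption rather than the site's actual method, adds a fixed allowance for taxi, climb, and descent to a cruise leg at a typical jet speed; it lands in the same general range as the 5 hours 15 minutes quoted above but does not reproduce it exactly.

    # Rough flight-time estimate (assumed rule of thumb, not the site's formula):
    # ~30 minutes for taxi/climb/descent plus cruising at roughly 500 mph.
    distance_miles = 2514
    cruise_mph = 500          # assumed average cruise speed
    overhead_hours = 0.5      # assumed fixed allowance

    hours = overhead_hours + distance_miles / cruise_mph
    h, m = int(hours), round((hours - int(hours)) * 60)
    print(f"Estimated flight time: {h} h {m} min")   # ~5 h 32 min with these assumptions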

Flight carbon footprint between Narita International Airport (NRT) and Qamdo Bamda Airport (BPX)

On average, flying from Tokyo to Bangda generates about 277 kg of CO2 per passenger, which is equivalent to roughly 611 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
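The per-passenger figure can be reproduced with a simple per-kilometer emission factor. The factor below is back-calculated from the numbers quoted on this page (277 kg over 4046 km, about 0.068 kg CO2 per passenger-km) and is an illustration only, not an official methodology.

    distance_km = 4046
    co2_per_pax_km = 277 / 4046        # ~0.068 kg CO2 per passenger-km, implied by this page
    co2_kg = distance_km * co2_per_pax_km
    co2_lbs = co2_kg * 2.20462         # kilograms -> pounds
    print(f"{co2_kg:.0f} kg CO2 per passenger (~{co2_lbs:.0f} lbs)")   # 277 kg (~611 lbs)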

Map of flight path and driving directions from Tokyo to Bangda

See the map of the shortest flight path between Narita International Airport (NRT) and Qamdo Bamda Airport (BPX).

Airport information

Origin: Narita International Airport
City: Tokyo
Country: Japan
IATA Code: NRT
ICAO Code: RJAA
Coordinates: 35°45′52″N, 140°23′9″E

Destination: Qamdo Bamda Airport
City: Bangda
Country: China
IATA Code: BPX
ICAO Code: ZUBD
Coordinates: 30°33′12″N, 97°6′29″E
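To use the coordinates above in either distance formula, the degree/minute/second values must first be converted to decimal degrees. A minimal conversion sketch:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
        value = degrees + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    # Coordinates from the airport information above
    nrt = (dms_to_decimal(35, 45, 52, "N"), dms_to_decimal(140, 23, 9, "E"))   # (35.7644, 140.3858)
    bpx = (dms_to_decimal(30, 33, 12, "N"), dms_to_decimal(97, 6, 29, "E"))    # (30.5533, 97.1081)
    print(nrt, bpx)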