
How far is Legazpi from Harbin?

The distance between Harbin (Harbin Taiping International Airport) and Legazpi (Legazpi Airport) is 2241 miles / 3607 kilometers / 1948 nautical miles.

Harbin Taiping International Airport – Legazpi Airport

2241 miles / 3607 kilometers / 1948 nautical miles


Distance from Harbin to Legazpi

There are several ways to calculate the distance from Harbin to Legazpi. Here are two standard methods:

Vincenty's formula (applied above)
  • 2241.239 miles
  • 3606.924 kilometers
  • 1947.583 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
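As a sketch of how such an ellipsoidal calculation works, here is a minimal Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name and the convergence tolerance are illustrative choices, not taken from this site:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Distance in km between two lat/lon points (degrees),
    using Vincenty's inverse formula on the WGS-84 ellipsoid."""
    a = 6378137.0                # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563        # WGS-84 flattening
    b = (1 - f) * a              # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial line case
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0  # meters -> km
```

Called with the airport coordinates listed below (HRB at 45°37′24″N, 126°15′0″E and LGP at 13°9′27″N, 123°44′6″E), this yields a distance in close agreement with the 3606.924 km figure above.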

Haversine formula
  • 2248.013 miles
  • 3617.827 kilometers
  • 1953.470 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
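The haversine calculation is much simpler, since it needs no iteration. A minimal sketch, assuming a mean Earth radius of 6371 km (the exact radius used by this site is an assumption):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km between two lat/lon points (degrees),
    treating the Earth as a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return radius_km * 2 * math.asin(math.sqrt(a))

# HRB (45°37′24″N, 126°15′0″E) to LGP (13°9′27″N, 123°44′6″E)
distance = haversine_km(45.623333, 126.25, 13.1575, 123.735)
```

The result lands within a few kilometers of the 3617.827 km figure above; the small gap versus the Vincenty value illustrates the error introduced by the spherical assumption.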

How long does it take to fly from Harbin to Legazpi?

The estimated flight time from Harbin Taiping International Airport to Legazpi Airport is 4 hours and 44 minutes.
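Such estimates typically divide the distance by an assumed cruise speed and add a fixed allowance for taxi, climb, and descent. The cruise speed and overhead below are illustrative assumptions, not this site's actual parameters, so the result differs slightly from the 4 h 44 min shown above:

```python
def flight_time(distance_miles, cruise_mph=500.0, overhead_min=30):
    """Rule-of-thumb flight time as (hours, minutes).

    cruise_mph and overhead_min are illustrative assumptions; real
    estimators tune them per aircraft type and route length.
    """
    total_min = round(distance_miles / cruise_mph * 60) + overhead_min
    return divmod(total_min, 60)

hours, minutes = flight_time(2241)  # -> (4, 59) under these assumptions
```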

What is the time difference between Harbin and Legazpi?

There is no time difference between Harbin and Legazpi.

Flight carbon footprint between Harbin Taiping International Airport (HRB) and Legazpi Airport (LGP)

On average, flying from Harbin to Legazpi generates about 245 kg of CO2 per passenger, equivalent to about 540 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
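The kilograms-to-pounds conversion can be checked directly using the exact definition of the international pound (0.45359237 kg):

```python
KG_PER_LB = 0.45359237  # exact by definition of the international pound

def kg_to_lbs(kg):
    """Convert a mass in kilograms to pounds."""
    return kg / KG_PER_LB

emissions_lbs = round(kg_to_lbs(245))  # 245 kg -> about 540 lbs
```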


Airport information

Origin: Harbin Taiping International Airport
City: Harbin
Country: China
IATA Code: HRB
ICAO Code: ZYHB
Coordinates: 45°37′24″N, 126°15′0″E
Destination: Legazpi Airport
City: Legazpi
Country: Philippines
IATA Code: LGP
ICAO Code: RPLP
Coordinates: 13°9′27″N, 123°44′6″E
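The coordinates above are given in degrees/minutes/seconds, while the distance formulas expect decimal degrees. A small conversion helper (the function name is an illustrative choice) bridges the two:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere="N"):
    """Convert degrees/minutes/seconds to signed decimal degrees.

    Southern and western hemispheres are negative by convention.
    """
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# HRB: 45°37′24″N, 126°15′0″E
hrb_lat = dms_to_decimal(45, 37, 24, "N")   # about 45.6233
hrb_lon = dms_to_decimal(126, 15, 0, "E")   # 126.25
# LGP: 13°9′27″N, 123°44′6″E
lgp_lat = dms_to_decimal(13, 9, 27, "N")    # 13.1575
lgp_lon = dms_to_decimal(123, 44, 6, "E")   # 123.735
```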