How far is Petersburg, AK, from Harbin?

The distance between Harbin (Harbin Taiping International Airport) and Petersburg (Petersburg James A. Johnson Airport) is 4036 miles / 6496 kilometers / 3508 nautical miles.

Distance from Harbin to Petersburg

There are several ways to calculate the distance from Harbin to Petersburg. Here are two standard methods:

Vincenty's formula (applied above)
  • 4036.429 miles
  • 6496.003 kilometers
  • 3507.561 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
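
The Vincenty figure above can be reproduced with any solver for the inverse geodesic problem on the WGS-84 ellipsoid. A minimal sketch using Python's geographiclib package, which implements Karney's algorithm rather than Vincenty's iteration but solves the same problem and agrees with Vincenty to well under a meter on routes like this one:

```python
from geographiclib.geodesic import Geodesic

# Airport coordinates from the airport information section, in decimal
# degrees (west longitude is negative).
HRB = (45.623333, 126.25)     # Harbin Taiping International Airport
PSG = (56.801667, -132.945)   # Petersburg James A. Johnson Airport

# Solve the inverse geodesic problem on the WGS-84 ellipsoid;
# 's12' is the geodesic distance in meters.
result = Geodesic.WGS84.Inverse(HRB[0], HRB[1], PSG[0], PSG[1])
km = result["s12"] / 1000
print(f"{km:.0f} km / {km / 1.609344:.0f} mi")   # ≈ 6496 km / 4036 mi
```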

Haversine formula
  • 4024.250 miles
  • 6476.403 kilometers
  • 3496.978 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
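
The haversine figure is straightforward to compute directly. A minimal sketch, using the same coordinates as above and a mean Earth radius of 6371 km (the choice of radius accounts for small differences from the figures quoted here):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(45.623333, 126.25, 56.801667, -132.945)
print(f"{km:.0f} km / {km / 1.609344:.0f} mi / {km / 1.852:.0f} nmi")
# ≈ 6476 km / 4024 mi / 3497 nmi
```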

How long does it take to fly from Harbin to Petersburg?

The estimated flight time from Harbin Taiping International Airport to Petersburg James A. Johnson Airport is 8 hours and 8 minutes.
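
The page does not state how this estimate is derived. A minimal sketch, assuming a flat average block speed of 500 mph (a common rule of thumb; the quoted distance and time imply roughly 496 mph):

```python
distance_mi = 4036.429       # Vincenty distance from above
avg_speed_mph = 500          # assumed average block speed (hypothetical)

total_minutes = round(distance_mi / avg_speed_mph * 60)
h, m = divmod(total_minutes, 60)
print(f"{h} h {m} min")      # 8 h 4 min, close to the quoted 8 h 8 min
```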

Flight carbon footprint between Harbin Taiping International Airport (HRB) and Petersburg James A. Johnson Airport (PSG)

On average, flying from Harbin to Petersburg generates about 461 kg of CO2 per passenger; 461 kilograms equals 1,016 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
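
The pound figure is just a unit conversion at roughly 2.20462 lb per kilogram:

```python
co2_kg = 461                                  # estimated CO2 per passenger
print(f"{co2_kg * 2.20462262:.0f} lbs")       # 1016 lbs
```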

Map of flight path from Harbin to Petersburg

See the map of the shortest flight path between Harbin Taiping International Airport (HRB) and Petersburg James A. Johnson Airport (PSG).

Airport information

Origin: Harbin Taiping International Airport
City: Harbin
Country: China
IATA Code: HRB
ICAO Code: ZYHB
Coordinates: 45°37′24″N, 126°15′0″E
Destination: Petersburg James A. Johnson Airport
City: Petersburg, AK
Country: United States
IATA Code: PSG
ICAO Code: PAPG
Coordinates: 56°48′6″N, 132°56′42″W