
How far is Richards Bay from Portland, OR?

The distance between Portland (Portland International Airport) and Richards Bay (Richards Bay Airport) is 10634 miles / 17114 kilometers / 9241 nautical miles.

Portland International Airport – Richards Bay Airport

Distance: 10634 miles / 17114 kilometers / 9241 nautical miles
Flight time: 20 h 38 min
CO2 emission: 1 402 kg


Distance from Portland to Richards Bay

There are several ways to calculate the distance from Portland to Richards Bay. Here are two standard methods:

Vincenty's formula (applied above)
  • 10634.303 miles
  • 17114.252 kilometers
  • 9240.957 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
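For illustration, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed further down this page converted to decimal degrees. It is a simplified illustration, not necessarily the exact implementation behind this calculator, and small differences from the figures above come from rounding the coordinates.

import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Iterative Vincenty inverse formula on the WGS-84 ellipsoid.
    Returns the distance in meters (may not converge for nearly
    antipodal points)."""
    a = 6378137.0                 # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)   # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# PDX (45°35′19″N, 122°35′52″W) to RCB (28°44′27″S, 32°5′31″E)
meters = vincenty_distance(45.5886, -122.5978, -28.7408, 32.0919)
print(meters / 1609.344)          # roughly 10634 miles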

Haversine formula
  • 10634.842 miles
  • 17115.120 kilometers
  • 9241.425 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
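A haversine version is much shorter. The sketch below assumes a mean earth radius of 6 371 km; that radius is an assumption, and together with coordinate rounding it accounts for the small gap from the figure above.

import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a spherical earth, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# PDX to RCB, coordinates in decimal degrees
print(haversine_distance(45.5886, -122.5978, -28.7408, 32.0919))  # ≈ 17110 km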

How long does it take to fly from Portland to Richards Bay?

The estimated flight time from Portland International Airport to Richards Bay Airport is 20 hours and 38 minutes.
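The quoted duration corresponds to an average block speed of roughly 515 mph over the 10 634-mile route. A minimal sketch of that kind of estimate is shown below; the speed is an assumption used for illustration, not a figure published on this page.

# Rough flight-time estimate: distance divided by an assumed average speed.
distance_miles = 10634.303
avg_speed_mph = 515            # assumed block speed, for illustration only

hours = distance_miles / avg_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} h {m} min")        # ≈ 20 h 39 min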

Flight carbon footprint between Portland International Airport (PDX) and Richards Bay Airport (RCB)

On average, flying from Portland to Richards Bay generates about 1 402 kg (3 091 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
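The pound figure is simply a unit conversion of the kilogram estimate, for example:

co2_kg = 1402
co2_lbs = co2_kg * 2.20462     # 1 kg ≈ 2.20462 lbs
print(round(co2_lbs))          # ≈ 3091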

Map of flight path from Portland to Richards Bay

See the map of the shortest flight path between Portland International Airport (PDX) and Richards Bay Airport (RCB).

Airport information

Origin: Portland International Airport
City: Portland, OR
Country: United States
IATA Code: PDX
ICAO Code: KPDX
Coordinates: 45°35′19″N, 122°35′52″W
Destination: Richards Bay Airport
City: Richards Bay
Country: South Africa
IATA Code: RCB
ICAO Code: FARB
Coordinates: 28°44′27″S, 32°5′31″E
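The coordinates above are given in degrees, minutes and seconds, while the distance formulas earlier on this page expect decimal degrees. A small conversion sketch (the helper name is illustrative) is:

def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    value = deg + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# PDX: 45°35′19″N, 122°35′52″W
print(dms_to_decimal(45, 35, 19, "N"), dms_to_decimal(122, 35, 52, "W"))
# RCB: 28°44′27″S, 32°5′31″E
print(dms_to_decimal(28, 44, 27, "S"), dms_to_decimal(32, 5, 31, "E"))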