
How far is Belfast from Split?

The distance between Split (Split Airport) and Belfast (Belfast International Airport) is 1269 miles / 2042 kilometers / 1103 nautical miles.

The driving distance from Split (SPU) to Belfast (BFS) is 1696 miles / 2730 kilometers, and travel time by car is about 30 hours 55 minutes.

Split Airport – Belfast International Airport: 1269 miles / 2042 kilometers / 1103 nautical miles

Distance from Split to Belfast

There are several ways to calculate the distance from Split to Belfast. Here are two standard methods:

Vincenty's formula (applied above)
  • 1268.837 miles
  • 2041.996 kilometers
  • 1102.590 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
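As a rough sketch of how such an ellipsoidal distance can be computed, the following is a minimal implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. This is an illustrative version, not necessarily the exact code behind the figures above.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Distance in metres between two points on the WGS-84 ellipsoid
    (Vincenty's inverse formula)."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    # Reduced latitudes and longitude difference
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # Equatorial lines: cos2_alpha == 0
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)
```

Called with the SPU and BFS coordinates listed below (43.5389°N, 16.2978°E and 54.6575°N, 6.2156°W), this returns approximately 2042 km, matching the figure above.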

Haversine formula
  • 1266.348 miles
  • 2037.989 kilometers
  • 1100.426 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
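The haversine formula is simple enough to state in a few lines. The sketch below uses a mean Earth radius of 6371 km (an assumption; the exact radius used by the figures above is not stated):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    h = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))
```

With the SPU and BFS coordinates this gives roughly 2038 km, slightly less than the ellipsoidal (Vincenty) result, as expected for a spherical approximation.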

How long does it take to fly from Split to Belfast?

The estimated flight time from Split Airport to Belfast International Airport is 2 hours and 54 minutes.
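A common way to estimate flight time is to add a fixed overhead for taxi, climb, and descent to the cruise time at an assumed average speed. The overhead (30 minutes) and cruise speed (530 mph) below are assumptions chosen to reproduce the quoted figure, not a documented method:

```python
def estimate_flight_time(distance_miles, cruise_mph=530, overhead_hours=0.5):
    """Rough flight-time estimate: fixed overhead plus cruise time.

    Returns (hours, minutes). cruise_mph and overhead_hours are
    illustrative assumptions, not published parameters.
    """
    hours = overhead_hours + distance_miles / cruise_mph
    whole_hours = int(hours)
    minutes = round((hours - whole_hours) * 60)
    return whole_hours, minutes
```

For the 1269-mile SPU–BFS distance this yields 2 hours 54 minutes.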

Flight carbon footprint between Split Airport (SPU) and Belfast International Airport (BFS)

On average, flying from Split to Belfast generates about 165 kg of CO2 per passenger, which is roughly 364 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
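The unit conversion and the implied per-mile intensity are straightforward to check. The conversion factor below is the exact legal definition of the pound; the per-mile figure is simply the quoted total divided by the quoted distance:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

def co2_kg_per_mile(total_kg, distance_miles):
    """Implied CO2 intensity per passenger-mile for a given flight."""
    return total_kg / distance_miles
```

For this route: 165 kg is about 364 lb, and 165 kg over 1269 miles implies roughly 0.13 kg of CO2 per passenger-mile.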

Map of flight path and driving directions from Split to Belfast

See the map of the shortest flight path between Split Airport (SPU) and Belfast International Airport (BFS).

Airport information

Origin: Split Airport
City: Split
Country: Croatia
IATA Code: SPU
ICAO Code: LDSP
Coordinates: 43°32′20″N, 16°17′52″E
Destination: Belfast International Airport
City: Belfast
Country: United Kingdom
IATA Code: BFS
ICAO Code: EGAA
Coordinates: 54°39′27″N, 6°12′56″W