Within the next 10–15 years, Internet traffic is expected to grow to up to 50 times its present value, and all of it will be carried over IP in the upper layer. Current IP-based solutions are not suitable for providing reliable connections because of their slow recovery mechanisms. Protection in the lower, optical layer is therefore increasingly important, since even a short-term outage leads to the loss of a huge amount of data (on the order of terabytes). Furthermore, customers using various real-time applications demand ever higher QoS from the service providers.
Nowadays the most widespread technology in the optical backbone is 1+1 dedicated protection, which sends the data simultaneously on two edge- and node-disjoint paths, ensuring immediate restoration of the connection if one of the paths fails. On the other hand, because of the complexity of the network and the protection against multiple link failures required by high QoS, this technology can be applied only in a limited way, not to mention its high capacity demand. Generalized Dedicated Protection (GDP) has been proposed in order to eliminate these disadvantages. GDP adapts flexibly to the demands of customers and provides reliable connectivity.
Depending on the optical equipment available at the network nodes, different GDP problems can be formulated. The solution can take the form of non-bifurcated or bifurcated flows. The non-bifurcated method is simple, but its resource utilization is poor. The bifurcated solution, on the other hand, reserves less bandwidth, but in exchange we have to use network coding of high complexity to ensure resilient and robust protection.
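As a toy illustration of the bifurcated idea (my own sketch of a classic diversity-coding scheme, not necessarily the exact method used in this work): split the demand into two halves a and b, and send a, b, and their XOR parity on three disjoint paths. Each path then carries only half of the original bandwidth, yet any single path failure still leaves enough information to reconstruct the data immediately.

```python
# Two-way split protection with XOR network coding (diversity coding):
# the demand is halved into parts a and b, and a, b, and a XOR b travel
# on three disjoint paths. Any single path failure is recoverable.

def encode(data: bytes) -> tuple[bytes, bytes, bytes]:
    """Split data into two halves and add an XOR parity part."""
    if len(data) % 2:
        data += b"\x00"                       # pad to even length
    half = len(data) // 2
    a, b = data[:half], data[half:]
    parity = bytes(x ^ y for x, y in zip(a, b))
    return a, b, parity

def decode(a, b, parity) -> bytes:
    """Recover the data when at most one of the three parts is lost (None)."""
    if a is None:
        a = bytes(x ^ y for x, y in zip(b, parity))
    elif b is None:
        b = bytes(x ^ y for x, y in zip(a, parity))
    return a + b

a, b, p = encode(b"terabyte")
assert decode(a, None, p) == b"terabyte"      # path carrying b failed
assert decode(None, b, p) == b"terabyte"      # path carrying a failed
```

Compared with 1+1 duplication, which reserves the full demand on both disjoint paths, this scheme trades lower reservation for the extra encoding and decoding work at the endpoints.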
In my work I present a new GDP solution that combines the advantages of the aforementioned ones. The method I suggest allows data splitting (bifurcation), but also takes practical feasibility into account. Considering the most recent network coding results, I place special emphasis on the case when the flow can be divided into two parts. I analyzed the proposed method in terms of node cost and formulated a conjecture regarding this matter.
Using simulations, I investigate the case when the data is split into more than two parts, which increases the complexity. I investigate how close we can come to the optimal solution, which can be computed in polynomial time but is difficult to implement in practice, as it requires complex coding and decoding for immediate recovery.
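The capacity trade-off behind finer splitting can be seen with a back-of-the-envelope model (an assumption of mine, not a result of this work): if the flow is split into k equal parts and k+1 coded parts are sent on disjoint paths to survive any single path failure, the reserved capacity is (k+1)/k times the demand, so k = 1 recovers the 2x cost of 1+1 duplication, while larger k yields diminishing savings at the price of more complex coding.

```python
# Reserved capacity relative to the demand under a single-failure model,
# assuming the flow is split into k equal parts and k+1 coded parts are
# sent on disjoint paths. k = 1 corresponds to 1+1 duplication.
def reserved_ratio(k: int) -> float:
    return (k + 1) / k

for k in (1, 2, 3, 4):
    print(k, reserved_ratio(k))   # 2.0, 1.5, 1.33..., 1.25
```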