WDM

 

One of the things that has cropped up from time to time in my digging into data transmission over fibre is the increasing use of wavelength division multiplexing - this is where the optical outputs of two or more lasers, each producing light at a different wavelength, are multiplexed onto the same fibre. The result is two or more completely separate data channels over the same fibre.

WDM, CWDM, and DWDM have evolved along with the technology, so first off, here are the basic wavelengths that have historically been used for carrying data over fibre.

 

The first type of WDM

The first sort of wavelength division multiplexing used a 1310nm wavelength and a 1550nm wavelength to produce two channels on the same fibre. Because of the wide separation, the exact wavelengths used weren't important, as long as they were close to 1310nm and 1550nm.

 

Building on that - 10GBase-LX4

When the IEEE produced the 802.3ae standard for 10G ethernet, one of the first specifications was 10GBase-LX4, which carried four separate channels over one fibre, using a simple form of wavelength division multiplexing around the 1310nm wavelength.

They specified the four wavelengths as being centred around 1275nm, 1300nm, 1325nm, and 1350nm - roughly 25nm apart.

These four wavelengths are not the same as those specified in the next development of wavelength division multiplexing.

 

The ITU and CWDM

In 2002, the ITU produced a standard called ITU G.694.2 - this used 18 different wavelengths, from 1270nm up to 1610nm in 20nm steps. So the distinction between 1310nm and 1550nm has been removed - it is now regarded as one big range of wavelengths.

This type of wavelength division multiplexing is known as coarse wavelength division multiplexing - or CWDM.
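
Just as a sanity check, here is a little Python sketch ( my own, not anything from the standard ) that generates those 18 CWDM wavelengths -

    # The 18 CWDM centre wavelengths defined in ITU G.694.2 :
    # 1270nm up to 1610nm in 20nm steps.
    cwdm_wavelengths_nm = list(range(1270, 1611, 20))

    print(len(cwdm_wavelengths_nm))   # 18 channels
    print(cwdm_wavelengths_nm)        # [1270, 1290, 1310, ... , 1610]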

I'm not sure if it is part of G.694.2, but this range of wavelengths is also described in bands -

O band - 1260nm to 1360nm
E band - 1360nm to 1460nm
S band - 1460nm to 1530nm
C band - 1530nm to 1565nm
L band - 1565nm to 1625nm

I'm not sure about the purpose of the definition of the bands - many of the channels that the ITU specifies in G.694.2 actually bridge band boundaries, so they partly lie in two bands.

The old water peak sits in the middle of the G.694.2 range, at around 1383nm, between the 1370nm and 1390nm wavelengths - the water peak is a wavelength region that is particularly affected by OH- ( hydroxyl ) impurities in the fibre - and it used to be assumed that these wavelengths could not be used for optical fibre transmission. However, as a result of big advances in fibre manufacturing technologies, hydroxyl impurities have been all but eliminated, so these wavelengths can now be used.

However it does suggest that not all of the already installed single mode fibre is capable of supporting CWDM.

Fibre that has eliminated the hydroxyl impurities is known as ZWPF - zero water peak fibre.

There are another two water peak wavelengths - these are at 950nm and 2730nm, but maybe they are outwith the wavelengths of interest for data transfer over fibre.

CWDM was aimed at the MAN type of market, and is specified for distances up to about 50km. The relatively wide spacing between the wavelengths means that the lasers and end devices can be operated without needing tight wavelength stability, which reduces costs.

 

The ITU and DWDM

Also in 2002, the ITU produced another standard, this time for DWDM - dense wavelength division multiplexing. It is ITU G.694.1.

This specifies a set of channels which are much closer together than those in CWDM. However, it gets more confusing than that - whereas G.694.2 is specified with 20nm gaps between the wavelengths, in G.694.1 the gaps aren't specified in terms of wavelength at all - they are specified in terms of the frequency of the light.

In addition, G.694.1 doesn't have just one set of gaps, it has three - at least, there are three that I know about. The sets of gaps are called grids.

At this point I am struggling a bit, as I don't have access to the actual ITU Standard, so from various websites I have extracted the following information. It may or may not be correct, and it may be out of date. But it will be sort of right - I hope !

First of all, the grids :-

a grid with 100GHz spacing between channels
a grid with 50GHz spacing between channels
a grid with 25GHz spacing between channels

To give some idea of how dense these channels are compared to CWDM, here are the approximate wavelength separations for the various grids, in the region around 1550nm -

100GHz grid - about 0.8nm between channels
50GHz grid - about 0.4nm between channels
25GHz grid - about 0.2nm between channels

So you can see that on the 25GHz grid, the wavelength separation is just one hundredth part of the wavelength separation in CWDM. So very much higher specifications are required for the lasers, and they must be kept in a strictly controlled environment, with tight control on temperature. Costs are therefore much higher than for CWDM.

However it is important to remember that the channels in DWDM are specified in terms of the optical frequency, not the optical wavelength, so the wavelength separations shown in this list are approximate, and are just there to show the difference between CWDM and DWDM.
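
To show where those approximate separations come from, here is a small Python sketch ( my own illustration, using the usual 193.1THz anchor frequency ) that converts each frequency spacing into a wavelength gap near 1550nm -

    # Convert DWDM grid spacings (defined in frequency) into approximate
    # wavelength separations in the 1550nm region.
    C = 299_792_458.0              # speed of light in m/s

    def wavelength_nm(freq_thz):
        return C / (freq_thz * 1e12) * 1e9

    centre_thz = 193.1             # commonly used DWDM anchor frequency
    for spacing_ghz in (100, 50, 25):
        gap_nm = wavelength_nm(centre_thz) - wavelength_nm(centre_thz + spacing_ghz / 1000)
        print(f"{spacing_ghz}GHz grid : about {gap_nm:.2f}nm between channels")

Running this gives roughly 0.80nm, 0.40nm, and 0.20nm - which is where the figures in the list above come from.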

Some of the websites about DWDM show the same bands that I've listed above, in the CWDM section - ie, the O, E, S, C, and L bands.

A few of the websites also refer to another kind of banding - which subdivides the C band -

There does seem to be a bit of inconsistency in the way different websites show the ITU grid. Most of them agree that the ITU DWDM grids lie within the C band. One website shows 43 different frequencies in the 100GHz grid, another website shows 45 different frequencies. A third website shows 50 different frequencies in the 100GHz grid, along with 100 different frequencies in the 50GHz grid.

In approximate terms, the 100GHz grid ranges in wavelength from about 1528nm up to 1560nm - so 1550nm is sitting somewhere in the middle. These correspond to an optical frequency range from about 196THz down to about 192THz.

The third of the websites mentioned above also shows a 50 channel 100GHz grid and a 100 channel 50GHz grid in the L band, and also in the S band. Whether these additional grids are part of the ITU G.694.1 standard I don't know. It is quite possible that the ITU has updated G.694.1 since its original publication in 2002 - certainly the technology has advanced considerably since then.

However it means that there would be 150 different channels in a continuous block, with 100GHz separation, and 300 different channels in a continuous block, with 50GHz separation.

There are certainly several manufacturers offering equipment for the C band and for the L band, though I haven't found many offering equipment which uses the S band.

There also seem to be differences in how the various websites number the frequency channels - one website numbers them from 17 up to 61, another from 1 up to 50.
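
The numbering from 17 up to 61 looks like the common convention ( an assumption on my part, since I don't have the standard either ) where channel N sits at 190.0THz + N x 100GHz. A small Python sketch of that convention -

    # An assumed DWDM channel numbering convention :
    # channel N lies at 190.0THz + N x 100GHz.
    C = 299_792_458.0                  # speed of light in m/s

    def channel_freq_thz(n, spacing_ghz=100):
        return 190.0 + n * spacing_ghz / 1000.0

    def wavelength_nm(freq_thz):
        return C / (freq_thz * 1e12) * 1e9

    for n in (17, 61):
        f = channel_freq_thz(n)
        print(f"channel {n} : {f:.1f}THz, about {wavelength_nm(f):.2f}nm")

Channel 17 comes out at 191.7THz ( about 1564nm ) and channel 61 at 196.1THz ( about 1529nm ), which matches the frequency range mentioned above.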

 

Problems with CWDM and DWDM

CWDM and DWDM aren't without their problems - one of these is the creation of spurious frequencies which are built out of the wanted frequencies. This is a problem that exists outside of fibre too - it exists anywhere that there are several channels which are close in frequency. The cellular mobile phone industry suffers from it as well, where it can result in weird dead spots in coverage, even though there are several base stations within range.

Particularly with DWDM and its very close frequencies, the basic problem is that if you have three frequencies that are quite close, say - F1, F2, and F3 - then these three frequencies can combine to produce a fourth frequency Fx = F1 + F2 - F3, and Fx can actually be the same frequency as another wanted channel.

Now of course it isn't just Fx that is produced, you will also get -

Fy = F1 + F3 - F2
Fz = F2 + F3 - F1

If the channels are evenly spaced, then all three of Fx, Fy, and Fz will fall on wanted channels. In addition, if there are more than three signals, then you can get many more spurious frequencies.
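
To see how the products land on a regular grid, here is a small Python sketch ( my own illustration ) using three evenly spaced channels, 100GHz apart -

    # Three evenly spaced channels, 100GHz apart (frequencies in GHz).
    f1, f2, f3 = 193_000, 193_100, 193_200

    # The three mixing products of the form Fa + Fb - Fc described above.
    fx = f1 + f2 - f3    # 192,900GHz - lands on the next channel down
    fy = f1 + f3 - f2    # 193,100GHz - lands right on top of F2
    fz = f2 + f3 - f1    # 193,300GHz - lands on the next channel up

    print(fx, fy, fz)

So with evenly spaced channels, every one of the products lands on a channel frequency.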

Some sources about fibre suggest that you can also get in-band spurious frequencies created from just two real frequencies, however I'm not sure about this one.

Certainly you can get beat frequencies from two close frequencies, but these would normally be expected to be way outside the frequency band of interest.

But anyway, it can be a significant problem, causing interference to the wanted channels as well as absorbing power from them. If you are not using all of the available channels, then it can be a useful technique to choose the channels so that the gaps between them are all different - the spurious frequencies then fall out-of-band, or in between the channels being used.
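
Here is a small Python sketch of that idea ( my own, just an illustration ) - it picks channels from a 100GHz grid so that every gap is different, and then checks that none of the Fa + Fb - Fc products lands on a channel that is actually in use -

    # Pick channel slots from a 100GHz grid so that all pairwise gaps are
    # different, then check the Fa + Fb - Fc mixing products.
    from itertools import permutations

    def pick_unequal_slots(wanted, available=40):
        slots, used_gaps = [0], set()
        for candidate in range(1, available):
            gaps = {candidate - s for s in slots}
            if gaps & used_gaps:
                continue                  # a gap would repeat - skip this slot
            slots.append(candidate)
            used_gaps |= gaps
            if len(slots) == wanted:
                break
        return slots

    slots = pick_unequal_slots(4)                 # gives [0, 1, 3, 7] - all gaps differ
    freqs = [193_000 + 100 * s for s in slots]    # frequencies in GHz

    products = {a + b - c for a, b, c in permutations(freqs, 3)}
    print(slots)
    print(sorted(products & set(freqs)))          # empty - nothing lands on a used channel

With the gaps all different, every mixing product falls either outside the used channels or in between them.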

 

Limitations of CWDM and DWDM

CWDM and DWDM are being "sold" as the best way to get high bandwidth across a fibre, and the figures look quite impressive - 160 channels at 10Gb/s per channel should produce 1600Gb/s of bandwidth.

For systems like SONET/SDH, it may be possible to combine time division multiplexing and wavelength division multiplexing to utilise that kind of bandwidth.

However the biggest growth in fibre traffic is in ethernet carriage, and it is not easy or very effective to divide up ethernet traffic onto separate channels. So CWDM and DWDM aren't going to slow down the development of ethernet speeds to 40G, 100G, and beyond.

Perhaps CWDM and DWDM are going to be more useful in carrying a mixture of technologies on the same fibre - SONET/SDH on some channels, different ethernets for different customers on other channels - or maybe for long distance transmission of VLANs or different types of signal ( VoIP, video, data, etc ) - all on the one fibre.

 

Another limitation of DWDM

There is another limitation with DWDM in particular - as DWDM heads down towards 25GHz channel separation, and even 12.5GHz, in a drive to carry more and more channels on the one fibre, the bit rates that need to be carried are going up at the same time.

We are getting close to the situation where the bit rates that need to be carried are too fast for the channel separation, using the commonly used modulation methods. Typically, a bit rate of 10Gb/s needs a channel width of about 25GHz, so the smallest channel separation that will carry 10G ethernet is 25GHz.

As 40G ethernet becomes more widespread, the smallest channel separation that can carry it is 100GHz.

When 100G ethernet arrives, it is going to need channel separations of 250GHz.
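
Based on the rough figure used above - about 2.5GHz of channel width per Gb/s with this sort of modulation - here is a small Python sketch ( my own, just restating that rule of thumb ) -

    # Rule of thumb from the text : NRZ type modulation needs roughly
    # 2.5GHz of optical channel width per Gb/s (10Gb/s -> about 25GHz).
    GHZ_PER_GBPS = 2.5

    def min_channel_separation_ghz(bit_rate_gbps):
        return bit_rate_gbps * GHZ_PER_GBPS

    for rate in (10, 40, 100):
        print(f"{rate}Gb/s -> needs a channel separation of about "
              f"{min_channel_separation_ghz(rate):.0f}GHz")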

So having hundreds of very closely spaced channels isn't necessarily going to be an advantage as bit rates go up, unless it is possible to develop systems that can carry bit rates that are as high as, or higher than, the channel separations.

In fact this is already happening - for data rates up to 10Gb/s, fibre typically uses NRZ coding - ie, for a "1", the laser is switched on, and a burst of light is sent down the fibre. For a "0", the laser is switched off, so no light is sent.

Quite apart from its bandwidth requirements, there is a snag with this type of modulation - if the data contains a long string of 1s or 0s, then the laser is either always on or always off, and the receiver has no edges from which to recover the clock. So the data signal has to be scrambled in some way so that long strings of 1s or 0s don't happen.
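
As an illustration of the sort of scrambling involved, here is a small Python sketch of a self-synchronising scrambler - this one uses the 1 + x^39 + x^58 polynomial from 64b/66b encoded ethernet, chosen here purely as an example -

    # Self-synchronising scrambler (1 + x^39 + x^58, as used by 64b/66b
    # encoded ethernet) : each output bit is the data bit XORed with two
    # earlier output bits, which breaks up long runs of 1s or 0s.
    def scramble(bits, state=0):
        out = []
        for b in bits:
            s = b ^ ((state >> 38) & 1) ^ ((state >> 57) & 1)
            out.append(s)
            state = ((state << 1) | s) & ((1 << 58) - 1)   # shift s into the state
        return out

    # A long run of zeros picks up transitions as the earlier output bits
    # feed back through the taps.
    print(scramble([1] * 8 + [0] * 120))

A matching self-synchronising descrambler at the receiver recovers the original bits.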

For data rates above 10G, developers are looking at various forms of phase shift keying - in these, the laser is always on and always producing light, but the phase of the light is changed for 1s and 0s.

PSK - phase shift keying - is a moving staircase - bit rates keep on going up, and developers keep on developing ways of carrying more and more data within the same channel separation.

So the potential now exists for the carriage of 40G within existing 50GHz spaced channel fibre installations, 100G is being worked on, and no doubt development will go on and on.

 

A problem with small channel separation

However this isn't the end of the story - because now another problem arises - optical beat frequencies causing interference with the data signal.

As described above, if you have two close frequencies, then you can get two beat frequencies - F1 + F2, and F1 - F2.

If you have an optical frequency of F1, and you have a channel separation of 50GHz, then F2, the next channel, will be F2 = F1 + 50GHz.

The subtractive beat frequency will be -

F2 - F1 = ( F1 + 50GHz ) - F1 = 50GHz

So now we have got an optical beat frequency which is getting quite close to our 40GHz data modulating signal. And the closer that developers get to carrying data rates which are close to, or equal to, the optical channel separation, the more pronounced this interference will be. So this is another factor that will have to be taken on board.

 

Inefficient use of light

Despite all the advances described above, current data transmission over fibre is still making very inefficient use of the potential data transfer ability of light waves.

If you think about the data rate - say 40Gb/s, and then think about the frequency of the light - say about 193 THz, or 193,000 GHz - you realise that each bit of data is using something like 5000 cycles of light. You actually only need a few cycles, so most of that 5000 cycles is wasted.
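
As a quick Python sketch of that arithmetic ( my own ) -

    # Cycles of light per bit : optical frequency divided by bit rate.
    optical_freq_ghz = 193_000      # about 193THz
    bit_rate_gbps = 40

    print(optical_freq_ghz / bit_rate_gbps)    # roughly 4800 cycles per bit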

A significant part of the reason for this is the finite time it takes to switch a laser on and off - this is one of the biggest limitations.

There are now various people looking at ways to use the light more efficiently - two approaches I have come across so far are solitons, and optical time division multiplexing ( OTDM ).

Solitons are certainly not new - they were first observed in water, in 1834. The first observed one was caused by a boat on a canal, and a solitary wave carried on down the canal for several miles. They didn't fit in with conventional wave theory of the time, so they got a bit lost.

It was in 1973 that they were first looked at in connection with light transmission in fibre, and by 1988, researchers were able to transmit solitons over several thousand kilometres of fibre.

So solitons, along with OTDM, provide plenty of scope for further development of data transfers over fibre.

 

 

 

 

 
