
minimum sequence metric and subtract it from all sequence metrics before forming symbol metrics.

Revisiting the Alice and Bob example, we would take the sequence metrics in Table 6.1, identify 40 as the minimum value, and then subtract it from all sequence metrics, giving the normalized sequence metrics in Table 7.4.

Table 7.4
Example of normalized sequence metrics

Index  Hypothesis    Metric
1      +1 +1 +1         0
2      +1 +1 -1       640
3      +1 -1 +1       428
4      +1 -1 -1       348
5      -1 +1 +1       396
6      -1 +1 -1      1036
7      -1 -1 +1       104
8      -1 -1 -1        24
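The normalization step can be sketched in a few lines of Python. The unnormalized metrics below are reconstructed from Table 7.4 by adding the minimum value of 40 back to each entry; they stand in for the values of Table 6.1.

```python
# Sequence metrics for each hypothesis (reconstructed for illustration:
# each value is the normalized metric from Table 7.4 plus the minimum, 40).
metrics = {
    (+1, +1, +1):   40,
    (+1, +1, -1):  680,
    (+1, -1, +1):  468,
    (+1, -1, -1):  388,
    (-1, +1, +1):  436,
    (-1, +1, -1): 1076,
    (-1, -1, +1):  144,
    (-1, -1, -1):   64,
}

# Identify the minimum sequence metric and subtract it from all
# sequence metrics before forming symbol metrics.
m_min = min(metrics.values())
normalized = {hyp: m - m_min for hyp, m in metrics.items()}
```

After normalization, the best hypothesis has metric 0 and all others are nonnegative, which keeps the exponentials used in forming symbol metrics numerically well behaved.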

It turns out that in many operating scenarios, MLSD and MAPSD provide similar performance in terms of both sequence and symbol error rate. Only at very low SNR does each provide a noticeable advantage in the error rate it minimizes. However, as we will see in the next section, MAPSD is helpful in understanding soft information generation.

7.2.2 Soft information

To explore soft information further, we start with the log-likelihood ratio, a common way of representing soft bit information. Then, an example of an encoder and decoder is introduced. Finally, the notion of joint demodulation and decoding is considered.

7.2.2.1 The log-likelihood ratio Notice that we had two soft values when using MAPSD symbol metrics but only one soft value when using other forms of equalization. This is because the other equalization approaches were producing a soft value proportional to the log of the ratio of the two bit likelihoods. We call this ratio the log-likelihood ratio (LLR).

LLR = log( Pr{ s_m = +1 | r } / Pr{ s_m = -1 | r } )                (7.8)
    = log(Pr{ s_m = +1 | r }) - log(Pr{ s_m = -1 | r }).            (7.9)

Here is a summary of the advantages of using LLRs.

1. We only need to find something proportional to the likelihood, as any scaling factors divide out when taking the ratio.

2. We only need to store one soft value per bit, rather than two.

3. The sign of the LLR gives the detected symbol value.
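These three properties can be checked directly with a small Python sketch. The posterior values 0.8 and 0.2 below are made up for illustration; the computation itself follows (7.8) and (7.9).

```python
import math

def llr(p_plus, p_minus):
    """LLR of Eq. (7.9): log(Pr{s_m = +1 | r}) - log(Pr{s_m = -1 | r})."""
    return math.log(p_plus) - math.log(p_minus)

# Hypothetical bit posteriors (for illustration only).
p_plus, p_minus = 0.8, 0.2
L = llr(p_plus, p_minus)

# Advantage 1: a common scaling factor divides out when taking the ratio,
# so quantities merely proportional to the likelihoods suffice.
c = 3.7
assert abs(llr(c * p_plus, c * p_minus) - L) < 1e-12

# Advantage 3: the sign of the LLR gives the detected symbol value.
s_hat = +1 if L > 0 else -1
```

Advantage 2 is implicit in the code: the single float `L` replaces the pair of soft values `p_plus` and `p_minus`.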
