Jim Killingsworth

Understanding the Discrete Fourier Transform

For the longest time, the Fourier transform remained a bit of a mystery to me. I knew it involved transforming a function in the time domain into a representation in the frequency domain. And I knew it had something to do with sinusoidal waves. But I didn’t understand what it meant to have a frequency domain representation of a function. As it turns out, it’s quite a simple thing once you realize what the frequency values represent. In this post, I explain the discrete Fourier transform by working through a set of examples.

The Forward Transform and the Inverse Transform

Suppose we have a function in the time domain. The value varies as a function of time. Let’s assume we can represent this function as a continuous sum of a potentially infinite number of sinusoidal waves of various frequencies, amplitudes, and phase shifts. The Fourier transform lets us decompose this function into its sinusoidal components on the frequency spectrum:

Figure 1

The formula above is what we can think of as the forward transform. It takes a function in the time domain and gives us back a function in the frequency domain. We can also take a function in the frequency domain and map it back to the time domain using the inverse transform:

Figure 2

The forward and inverse Fourier transforms above operate on continuous functions. The inputs are assumed to be continuous functions across the time domain and frequency spectrum, respectively. In practical scenarios, however, we are more likely to be dealing with a set of discrete samples taken at regular intervals. And for this, we need to focus our attention on the discrete Fourier transform. Here is the forward transform:

Figure 3

Instead of an integral, we have a sum. And the sum is based on discrete samples of the function in the time domain. Notice that the complex exponential kernel is still the same as before. Naturally, we have a corresponding inverse transform that goes with it:

Figure 4
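
To make the discrete formulas concrete, here is a minimal sketch of a naive implementation in Python. It assumes the conventional definitions, with the scaling factor of one over the number of samples applied on the inverse transform; the formulas above may place the scaling factor differently. The function names are my own and are not taken from the accompanying source code.

    import cmath

    def dft_forward(samples):
        # Naive forward transform: one complex exponential sum per frequency.
        N = len(samples)
        return [sum(samples[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                    for n in range(N))
                for k in range(N)]

    def dft_inverse(spectrum):
        # Naive inverse transform: the same sum with the sign of the exponent
        # flipped and a 1/N scaling factor applied to the result.
        N = len(spectrum)
        return [sum(spectrum[k] * cmath.exp(2j * cmath.pi * k * n / N)
                    for k in range(N)) / N
                for n in range(N)]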

I think the discrete transform is more intuitive and easier to understand. We’ll work through some examples in the following sections. I want to point out here that the notation I’m using might differ somewhat from the notation conventions used in textbooks and other resources on this topic. To each their own. For each of our examples, we’re going to use a sampling frequency of eight samples per second in the time domain:

Figure 5

We’re only going to take samples over a finite time window of one second. That means we’re going to have a total of eight data points. The following table shows the discrete points in time at which we’ll sample the input function:

Figure 6

We want to round trip our input function from the time domain to the frequency domain and then back again to the time domain. We need to choose some discrete frequencies. Here are the eight different frequencies we’ll use:

Figure 7
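
Assuming the table and list above hold the usual values for this setup (sample times of n/8 seconds for n ranging from 0 through 7, and whole-number frequencies of 0 through 7 cycles per second), they could be generated like so:

    SAMPLE_RATE = 8   # samples per second
    NUM_SAMPLES = 8   # one-second window

    # Discrete sample times: 0, 1/8, 2/8, ..., 7/8 seconds.
    sample_times = [n / SAMPLE_RATE for n in range(NUM_SAMPLES)]

    # Discrete frequencies: 0, 1, 2, ..., 7 cycles per second.
    frequencies = list(range(NUM_SAMPLES))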

We could have chosen a different set of discrete points on the time domain to sample the input function. Likewise, we don’t necessarily have to use this specific set of frequencies. But these values seem to make the most sense in this context as they will likely yield the most intuitive results as we work through the examples in the following sections.

Example 1

For our first example, let’s just use a simple sine wave. And let’s use a sine wave with a period of one second. That means it starts at zero and finishes a complete cycle precisely one second later. Here is the function definition:

Figure 8

To foreshadow what is to come, I think it’s worth pointing out here that a sine wave can also be understood as a cosine wave in which the phase is shifted by a quarter of a period. Here is how you can define a sine wave using the cosine function:

Figure 9

The two function definitions above are equivalent. Keep that in the back of your mind as you read further. And while this is a continuous function, we only want to perform our transform operations on discrete samples of the input function. Here are the sample values at each of the discrete points on the time domain:

Figure 10

Notice that at points where the function value is zero, the function might evaluate as either a positive zero or a negative zero. Now, in pure mathematics, there may be no such thing as a signed zero. But if you are working with a computer system that uses IEEE 754 floating-point numbers, there most certainly is such a thing as a signed zero. Just keep that in mind. Here is a plot of the sine wave function, along with the sample values:

Figure 11
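
As a quick check of the sampling step, here is a small sketch that evaluates the sine wave at the eight sample points. Note that the zero crossings come out as signed zeros or values that are merely very close to zero in floating point, which is exactly the caveat mentioned above.

    import math

    # Sample the sine wave at t = n/8 for n = 0 through 7.
    samples = [math.sin(2 * math.pi * n / 8) for n in range(8)]

    # At the zero crossings, floating-point evaluation yields signed zeros or
    # values that are only approximately zero, not exact zeros.
    print(samples)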

Once we have taken the sample values of the input function, we are ready to execute the discrete Fourier transform. Here is how to apply the forward transform:

Figure 12

This formula takes the form of a continuous function on the frequency domain. But what we really want to do here is take discrete samples of this function, one for each of our chosen frequencies. Here are the values:

Figure 13

The values on the frequency domain are complex numbers with both real and imaginary components. Again, keep in mind that zero values might be positive or negative when using a computer. The following plots show the real and imaginary components across the frequency domain:

Figure 14
Figure 15

The complex values above are represented using the Cartesian form. While this is a common way to express complex numbers, I think it makes more sense to use the polar form when discussing Fourier transforms. Consider the Cartesian form with distinct real and imaginary coefficients:

Figure 16

We want to express this using the polar form:

Figure 17

The polar form of a complex number has coefficients for the magnitude and the phase angle. We can calculate these coefficients based on the real and imaginary parts used in the Cartesian form. The following formula shows how to compute the magnitude component:

Figure 18

Computing the phase angle is a bit more involved. If the complex value lies in the first or fourth quadrant of the complex plane, it’s a simple arctangent calculation. But if it lies in the second or third quadrant or on the border between quadrants, we need to treat those values as special cases. Here is the formula for computing the phase angle:

Figure 19

For this definition, the principal value of the phase angle lies in the following range:

Figure 20
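
Here is a minimal sketch of the conversion to polar form, using the two-argument arctangent to handle the quadrant cases. Its principal value convention may differ at the boundary from the range shown above.

    import math

    def to_polar(z):
        # Magnitude: the length of the vector formed by the real and
        # imaginary parts.
        magnitude = math.hypot(z.real, z.imag)
        # Phase: the two-argument arctangent sorts out the quadrants and
        # returns a principal value in the range [-pi, pi].
        phase = math.atan2(z.imag, z.real)
        return magnitude, phase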

We could have chosen a different range for the principal value, but this one seems to be the most widely used convention. In cases where both real and imaginary values are zero, the phase angle is undefined. It doesn’t matter because the magnitude is zero. But if you’re using a computer to calculate the phase, you might want to be careful what concrete result you get in this case. Consider what happens if you use the .NET two-argument arctangent function to compute the phase when the real and imaginary coefficients are both zero:

Figure 21
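
Python’s math.atan2 follows the same IEEE 754 convention for signed zeros, so the same surprise shows up there. The four combinations of signed-zero arguments give four different answers, even though all four inputs compare equal to zero:

    import math

    # The four combinations of signed-zero arguments to atan2(y, x).
    for y in (0.0, -0.0):
        for x in (0.0, -0.0):
            print(y, x, math.atan2(y, x))

    # Prints 0.0, pi, -0.0, and -pi, depending on the signs of the zeros.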

Do you see why I mentioned the signed zeros? You can’t assume that a positive zero will always be treated the same as a negative zero. If you’re not careful, you might wind up with some very unexpected results. I suspect other software libraries that provide mathematical functions have similar nuances. There are other subtleties related to this that are beyond the scope of this post. For now, let’s just treat zero and near-zero complex values as having a phase angle of zero. Depicted below are the results of our Fourier transform expressed in polar form:

Figure 22

The magnitude component is always a positive or zero value, so we need not worry about the sign. Just like we did with the real and imaginary parts earlier, we can plot the values of the magnitude and phase components across the frequency domain:

Figure 23
Figure 24

Note the symmetry and reverse symmetry of the magnitude and phase components, respectively. We’ll explore that in the next section. For the time being, let’s just convert these values in the frequency domain back to the time domain using the inverse transform:

Figure 25

As shown above, we can easily consolidate the exponents when expressing our complex values in polar form. And using Euler’s formula, we can take it a step further:

Figure 26

Since the original values in the time domain are real numbers, not complex numbers, we can expect the results of the inverse transform to be real numbers as well. We can simply drop the imaginary part from the formula above:

Figure 27

This is the inverse transform expressed in a form that uses the magnitude and phase coefficients. In this particular example, all but two of the values in the frequency domain are zero. Thus, we can further simplify like so:

Figure 28

If we plot this function on a chart and take discrete samples at the eight points in time we decided upon earlier, here is what it looks like:

Figure 29

This function is definitely not the same as the sine wave function we started with. However, it does have the same values as our original input function at the sample points. And that’s the thing that matters when dealing with discrete Fourier transforms.
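
Here is a sketch of that reconstruction in code. It assumes the 1/N normalization belongs to the inverse transform, which may differ from the convention used in the formulas above. Evaluating it at the eight sample times should reproduce the original sample values, while evaluating it at points in between traces out a curve like the one plotted above.

    import math

    def reconstruct(magnitudes, phases, frequencies, t):
        # Real-valued inverse transform: a sum of cosines, one per frequency,
        # each scaled by its magnitude and shifted by its phase angle.
        # Assumes the 1/N scaling factor lives on the inverse transform.
        N = len(magnitudes)
        return sum(m * math.cos(2 * math.pi * f * t + p)
                   for m, p, f in zip(magnitudes, phases, frequencies)) / N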

Example 2

In the previous example, we noted the symmetry of the transformed values on the frequency domain. For the magnitude, the lower half of the frequency spectrum is a mirror image of the upper half. A similar pattern occurs with the phase angle, except the two halves are negated mirror images of one another. What happens if we consolidate the values onto the lower or upper half of the frequency spectrum? In this example, let’s use the same input function we used in the previous section. We’ll start with the transformed values on the frequency domain:

Figure 30

These are the same complex values on the frequency domain we saw in the previous section. Now let’s perform some modifications to these values. We want to double the values on the lower half of the frequency spectrum and zero out the values on the upper half. But we want to avoid modifying the value for the zero frequency. We also want to avoid altering the middle value if the number of samples is an even number. To do this, it helps to define some demarcation points:

Figure 31

Since we are dividing integers here, we can use floor division to get integer results. These points give us an upper bound for the lower half of the range and a lower bound for the upper half. Here is how we modify the values on the frequency domain:

Figure 32

This formula doubles the values in the lower part of the frequency range and clears the values in the upper part. Here is a visual representation of the modified values:

Figure 33
Figure 34
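
Here is a small sketch of that modification in code. The floor-division demarcation points are my own way of writing the bounds described above and may be expressed differently in the formula.

    def keep_lower_half(spectrum):
        # Double the lower half of the spectrum and zero out the upper half,
        # leaving the zero-frequency value untouched, along with the middle
        # value when the number of samples is even.
        N = len(spectrum)
        lower_end = (N - 1) // 2    # last index of the lower half
        upper_start = N // 2 + 1    # first index of the upper half
        result = list(spectrum)
        for k in range(1, lower_end + 1):
            result[k] = 2 * result[k]
        for k in range(upper_start, N):
            result[k] = 0
        return result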

After performing the modification, only one of the values in the frequency domain is nonzero. Here is what our inverse transform looks like in this instance:

Figure 35

Plotting this function on a chart with discrete samples at the predetermined sample points on the time domain, here is what it looks like:

Figure 36

This is identical to our original sine wave function. But the inverse transform gives us a formula based on the cosine function. In this case, the phase of the cosine wave is shifted by a quarter of a period, making it equivalent to a sine wave. We can reproduce the original input function using only values from the lower half of the frequency spectrum. But what happens if we only use values from the upper half of the frequency spectrum instead? Consider the following:

Figure 37

This formula clears the values in the lower part of the frequency range and doubles the values in the upper part. Here is a visual representation of the modified values:

Figure 38
Figure 39

Analogous to the previous instance, only one of the values in the frequency domain is nonzero. Here is what our inverse transform looks like in this instance:

Figure 40

Plotting this function on a chart with discrete samples at the predetermined sample points on the time domain, here is what it looks like:

Figure 41

As you can see, we have an entirely different function on the time domain when using only the upper frequencies. But the values at the sample points are still the same as when we used only the lower frequencies. Because the values on each half of the frequency range are mirror images of one another, we only need half of these values to complete a full circle back to the time domain.

Example 3

Suppose we have a function that is the superposition of several different sinusoidal waves. And each one of those waves can have an amplitude, phase offset, and frequency value that is entirely independent of all the others. Here is a plot of our function:

Figure 42

Now let’s say that maybe we don’t even know how this input function is defined; we just know the values at the sample points. Here are the sample values:

Figure 43

Using these sample values, we can apply the forward transform to find the equivalent representation on the frequency domain:

Figure 44

Let’s assume the input function consists of sinusoidal waves with frequencies only on the lower half of the frequency spectrum. We can modify the values in the frequency domain as shown in the previous section:

Figure 45

Here is a visual representation of the modified values in the frequency domain:

Figure 46
Figure 47

Here are the calculated results for each of the predefined frequencies:

Figure 48

Using these values in the frequency domain, we can now apply the inverse transform to produce a suitable function in the time domain:

Figure 49

And since there are only three nonzero values in the table above, we can represent this function as the sum of three distinct sinusoidal waves:

Figure 50

Thus, we have a function that is the superposition of three different sinusoidal waves, each with a different amplitude, phase offset, and frequency. The following plots give a visual representation of each of these three sinusoidal components:

Figure 51
Figure 52
Figure 53

Adding these three waveforms together gives us a function that reproduces our original input. The beauty of the discrete Fourier transform is that it allows us to decompose a function into discrete sinusoidal components. This can be useful if you want to amplify the values on some frequency ranges while attenuating the values on others. Adjustments can be made on the frequency domain that might otherwise be difficult or impossible to do in the time domain.
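
As one illustration of that idea, here is a hypothetical helper that scales a band of frequency bins by some gain factor. The function name and parameters are my own and are not taken from the accompanying source code.

    def scale_band(spectrum, low, high, gain):
        # Amplify or attenuate the bins whose frequency index falls in the
        # half-open range [low, high), leaving all other bins unchanged.
        return [gain * value if low <= k < high else value
                for k, value in enumerate(spectrum)]

    # Example: attenuate frequency bins 2 and 3 by half, assuming `spectrum`
    # holds the output of a forward transform like the naive sketch earlier.
    # filtered = scale_band(spectrum, 2, 4, 0.5)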

Example 4

What happens if you shift a function vertically by some amount? How does this affect the values in the frequency domain? Consider the following input function:

Figure 54

This is the same sine wave function we used in the first two examples, except it has been shifted upwards. Here is a plot of this function:

Figure 55

Now we can apply the forward transform. Like we did in previous examples, let’s assume we’re only dealing with the frequencies on the lower part of the spectrum and modify the values accordingly. Here are the values on the frequency domain:

Figure 56
Figure 57

Notice the magnitude value for the zero frequency. This value represents the vertical shift in the upward direction. Applying the inverse transform, we have the following:

Figure 58

Now let’s plug in the concrete values and simplify, taking into consideration the following:

Figure 59

Here is the simplified version of the inverse transform:

Figure 60

The cosine of zero is one. Effectively, this means that a cosine wave with a frequency of zero is a horizontal line. Perhaps you can think of it as a sinusoidal wave with an infinite wavelength. In this case, the horizontal line is scaled by the magnitude, giving us our vertical offset.
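
Here is a quick numerical check of that idea. It assumes a forward transform with no 1/N factor, in which case adding a constant offset to every sample changes only the zero-frequency value, which grows by the constant times the number of samples. The exact numbers you see depend on where the scaling factor lives in the definitions used in this post, and the offset of 0.5 is just an arbitrary illustration.

    import cmath
    import math

    def dft_bin(samples, k):
        # One bin of a naive forward transform (no 1/N scaling assumed here).
        N = len(samples)
        return sum(samples[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                   for n in range(N))

    # Shifting every sample upward by a constant changes only the
    # zero-frequency bin; it grows by the constant times the sample count.
    samples = [math.sin(2 * math.pi * n / 8) for n in range(8)]
    shifted = [s + 0.5 for s in samples]

    print(dft_bin(samples, 0), dft_bin(shifted, 0))   # differ by 0.5 * 8 = 4
    print(dft_bin(samples, 1), dft_bin(shifted, 1))   # effectively identical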

Example 5

We’ve seen what happens if you shift a function vertically in the upward direction. But what if you move it down instead of up? Consider the following input function:

Figure 61

Here we have the same sine wave function used in the previous example, except it has been shifted downwards instead of upwards. Here is a plot of this function:

Figure 62

Now we can apply the forward transform. Like we did in previous examples, let’s assume we’re only dealing with the frequencies on the lower part of the spectrum and modify the values accordingly. Here are the values on the frequency domain:

Figure 63
Figure 64

Notice the phase offset for the zero frequency in addition to the magnitude value. This phase value flips the sign of the zero-frequency term, turning the vertical shift into a downward one. Here is the inverse transform:

Figure 65

Now let’s plug in the concrete values and simplify, taking into consideration the following:

Figure 66

Here is the simplified version of the inverse transform:

Figure 67

As mentioned previously, a cosine wave with a zero frequency is essentially a horizontal line. But in this case, the phase is offset by half of a period, giving us a negative value. This value, scaled by the magnitude, gives us the appropriate offset in the downward direction.

Example 6

Suppose we have a function that cannot be represented as the sum of a finite number of sinusoidal waves. A sloping line might be an example of such a function. What would the values on the frequency domain look like? What would happen if we transformed them back to the time domain? Consider this function as our example:

Figure 68

Here is a visual representation of this function, along with the sample values at the sample points on the time domain:

Figure 69

Using the sample values, we can apply the forward transform the same way we did in the previous examples. And like we did before, let’s modify the values in the frequency domain so that we’re only using the frequencies in the lower part of the spectrum. Here is the result:

Figure 70
Figure 71

Once we have computed the values on the frequency domain, we can turn around and apply the inverse transform to complete a round trip back to the time domain. Here is the result of the inverse transform:

Figure 72

In this case, we cannot restore the original input function based on the values in the frequency domain. But as you can see from the chart above, we can surely approximate it. And we can do so in a way that reproduces the sample values at the sample points in the time domain. However, the sudden drop-off on the right-hand side of the chart is something worth taking note of. The approximation doesn’t seem to lend itself to forecasting if the data isn’t inherently sinusoidal.

Fast Fourier Transforms

In the examples above, we used a naive method for computing the discrete Fourier transform and its corresponding inverse transform. While simple and easy to understand, this is probably not the most efficient computation method. There are a variety of fast Fourier transform algorithms that are likely to be much more computationally efficient. And there are preexisting software libraries that implement them. Some of these libraries are even designed to run on GPU hardware. If you’re working with larger data sets, using one of these fast implementations can significantly improve runtime performance.
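
As one example, if NumPy is available, its FFT module provides fast forward and inverse transforms. Note that library conventions for scaling and ordering may differ from the formulas used in this post.

    import numpy as np

    # Fast forward and inverse transforms from NumPy's FFT module.
    samples = np.sin(2 * np.pi * np.arange(8) / 8)
    spectrum = np.fft.fft(samples)          # frequency-domain values
    restored = np.fft.ifft(spectrum).real   # back to the time domain

    print(np.allclose(restored, samples))   # True, up to rounding error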

Accompanying source code is available on GitHub.
