Reconsidering the impossible
I grabbed the latest issue of IEEE Signal Processing Magazine out of my mailbox the other day. As I was flipping through the pages, I noticed a familiar face: Doug Williams, who is the Special Sections Area Editor for the magazine. (I know Doug because he joined the faculty at Georgia Tech as I was finishing up my Ph.D. there.) He wrote the "From the Editor" column this month, entitled "Periodically Reconsidering the Impossible." His point is that people who recognize outdated assumptions can "sometimes initiate remarkable progress" in a field. One example he gives is the fast Fourier transform:
By the late 1990s, very few people were developing new fast Fourier transform (FFT) algorithms, and the field was considered to be very mature. Researchers had long since determined exactly how many arithmetic operations were necessary to perform a discrete Fourier transform (DFT), and optimized FFT software was widely available. However, these algorithms were based on assumptions made in the 1970s that minimizing additions and multiplications would yield the quickest implementations. Computation speed on modern processors is much more dependent on efficient pipelining and memory access. Thus, the field was ripe for a pair of computer scientists to reformulate the problem, and, in 1998, Matteo Frigo and Steven G. Johnson presented "FFTW: An Adaptive Architecture for the FFT" at the 1998 ICASSP. Their software, which can be found at http://www.fftw.org, is actually a collection of FFT algorithms that can self-optimize to choose the best combination for a particular computing platform.
—IEEE Signal Processing Magazine, vol. 25, no. 2, March 2008, p. 2
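The self-optimizing idea in the quoted passage can be illustrated with a toy "planner" sketch. To be clear, this is not FFTW's actual API or planning strategy (FFTW's real planner explores factorizations and code variants, not whole algorithms); the function names here are my own, and the only assumption is that you have NumPy available.

```python
import time
import numpy as np

def dft_direct(x):
    """O(n^2) evaluation straight from the DFT definition."""
    n = len(x)
    k = np.arange(n)
    twiddle = np.exp(-2j * np.pi * np.outer(k, k) / n)
    return twiddle @ x

def fft_library(x):
    """NumPy's built-in FFT, standing in for an optimized routine."""
    return np.fft.fft(x)

def plan(n, candidates, trials=3):
    """Toy planner: time each candidate on length-n data and
    return whichever ran fastest on this machine."""
    x = np.random.randn(n) + 1j * np.random.randn(n)
    best, best_time = None, float("inf")
    for f in candidates:
        t0 = time.perf_counter()
        for _ in range(trials):
            f(x)
        elapsed = time.perf_counter() - t0
        if elapsed < best_time:
            best, best_time = f, elapsed
    return best

# "Plan" once for a given size, then reuse the winner.
fastest = plan(1024, [dft_direct, fft_library])
```

The design point is the same one the quote makes: instead of deciding ahead of time which implementation is best, measure on the actual hardware and let the measurements decide.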
Doug's point about arithmetic operation counting rings true. Today, whenever we look at improving performance in MATLAB, memory access patterns and processor utilization tend to dominate the discussion; raw operation counts rarely come up.
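Here is a small illustration of why memory access patterns matter more than operation counts. The two loops below do exactly the same arithmetic, but one walks memory contiguously and the other strides across it. (This sketch uses NumPy, which is row-major by default; MATLAB arrays are column-major, so the fast and slow directions are swapped there.)

```python
import numpy as np

# Same arithmetic, different memory traffic.
a = np.arange(2000 * 2000, dtype=np.float64).reshape(2000, 2000)

# Contiguous: each row is laid out consecutively in memory,
# so this traversal is cache-friendly.
row_sum = sum(a[i, :].sum() for i in range(a.shape[0]))

# Strided: each column access jumps 2000 elements at a time,
# so the same number of additions touches memory far less efficiently.
col_sum = sum(a[:, j].sum() for j in range(a.shape[1]))

assert row_sum == col_sum  # identical answer either way
```

Wrapping each loop in a timer shows the gap on most machines, even though an operation count would score the two identically.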
It didn't take long for The MathWorks to incorporate FFTW into MATLAB (in version 6). The FFTW code would have horrified DSP researchers from a couple of decades ago. Among other things, it was (GASP!) explicitly recursive, which was a no-no back then.
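To show what "explicitly recursive" means here, this is a minimal sketch of a radix-2 decimation-in-time Cooley-Tukey FFT, the textbook recursion (not FFTW's actual code, which generates and composes many specialized variants). It assumes the input length is a power of two.

```python
import numpy as np

def fft_recursive(x):
    """Radix-2 decimation-in-time FFT via explicit recursion.
    Input length must be a power of two."""
    x = np.asarray(x, dtype=complex)
    n = len(x)
    if n == 1:
        return x
    # Recurse on the even- and odd-indexed halves...
    even = fft_recursive(x[0::2])
    odd = fft_recursive(x[1::2])
    # ...then combine them with the twiddle factors.
    twiddle = np.exp(-2j * np.pi * np.arange(n // 2) / n)
    return np.concatenate([even + twiddle * odd,
                           even - twiddle * odd])
```

The recursion that looked wasteful in the 1970s turns out to be a good fit for cache hierarchies: each recursive call works on a subproblem small enough to stay resident in fast memory.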
Doug's column introduces the special issue on compressive sampling, which is a dramatically different look at sampling theory. I'm looking forward to reading through it.