Written By:

Gene Frantz, Professor in the Practice, Rice University and Alan Gatherer, Editor in Chief, CTN

Published: 14 Apr 2017


CTN Issue: April 2017

A note from the editor:

Recently my team was looking for some contract labor to write some really tight DSP (that's Digital Signal Processor, which we shall call DSPors going forward) code. Trouble is, there are few people left who are able to do this kind of work, and it got me thinking about how DSP (that's Digital Signal Processing, which we shall call DSPing from now on) is evolving and whether the new generation of communication engineers will be up to the challenge. I turned to Will Strauss, the famous tracker of all things DSP, and he suggested I talk to a mutual friend, Gene Frantz, who, if not quite the father of DSP, certainly did a lot of babysitting and diaper changing in his role at Texas Instruments (remember Speak and Spell? Yup, that was him). Gene became the only Principal Fellow at TI for a while and is now a Professor at Rice University. So, well qualified for the challenge. Anyway, this is what we came up with. Comments welcome as always.

Alan Gatherer, Editor-in-Chief

The Future of Digital Signal Processors

Gene Frantz, Professor in the Practice, Rice University
Alan Gatherer, Editor in Chief, CTN

About 15 years ago Will Strauss, the president of Forward Concepts, wrote an article entitled “DSP is dead; long live DSP” [1]. Since then, much has changed in industry and academia. Raw performance of all processors has accelerated, portability has become the norm, and cloud computing, IoT and big data have opened the door to a whole new world of opportunities. At the same time, C programming has faded in favor of more abstract scripting languages like Python, and devices that were traditionally called microprocessors now include powerful signal processing extensions in their architectures. So we thought it might be time to check the pulse of DSP one more time.

To start our investigation, Gene asked Professors Burrus, Johnson, Orchard and Baraniuk, who have been in the world of DSP throughout their careers, for their input. It opened somewhat of a Pandora's box. So we will try to organize their responses, along with our own thoughts and those of Will Strauss, into some main points:

Is the Teaching of DSPing in Decline? If So, Why and How?

This covers courses, books and now MOOCs (Massive Open Online Courses) on signals and systems: designing filters and understanding the role of the FFT in signal analysis, for instance. There was a general feeling that why we teach DSPing has changed, to accommodate those who want to go out and be entrepreneurs and innovators of things, and who only need to know enough to use DSPing, and maybe DSPors, to achieve their goals. In the past we taught students to be detailed implementers and skilled practitioners. DSPing was the center of their careers, not just one of many skills to be picked up. Today we know of one DSPing class that uses MATLAB compiled on Android devices as the practical section of the class! This may become an issue when new technologies, such as compressed sensing, come along and engineers in the field realize that their basic theory is not enough to understand, never mind implement, the new technology. Don Johnson saw these issues in graduate students, and we quote him here: “I find that graduate students don’t understand Fourier transform symmetries (why does the output of the FFT always show a very high frequency component even though the signal is supposedly low pass?), circular convolution (I’ll perform ideal low pass filtering by taking the signal’s FFT, setting all the high frequencies to zero [wherever they are], and taking the inverse transform), and frequency-domain filtering in two dimensions. In statistical signal processing, students don’t know that applying the magical EM algorithm amounts to suboptimal (finding the global maximum is not guaranteed) maximum likelihood estimation. How do I know the noise is indeed Gaussian? What if it isn’t; can I still just use Gaussian-based methods and not get hurt badly?”
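Johnson's two FFT complaints are easy to demonstrate. The short NumPy sketch below (our illustration, not part of the original discussion) shows why the FFT of a real low-pass signal still has a large high-index bin (conjugate symmetry), and why zeroing FFT bins is circular convolution with a sinc rather than ideal filtering:

```python
import numpy as np

# A real low-pass signal still shows a "high frequency" FFT bin, because a
# real signal's spectrum is conjugate-symmetric: X[N-k] = conj(X[k]).
N = 64
n = np.arange(N)
x = np.cos(2 * np.pi * 3 * n / N)          # single low-frequency tone at bin 3
X = np.fft.fft(x)
k = np.argsort(np.abs(X))[-2:].tolist()    # the two largest-magnitude bins
print(sorted(k))                           # -> [3, 61]: bin 61 = N - 3 is the mirror

# "Ideal" low-pass filtering by zeroing FFT bins is circular convolution
# with a sinc: it rings near edges and wraps around the ends of the record.
step = np.zeros(N)
step[N // 4: 3 * N // 4] = 1.0             # a rectangular pulse
S = np.fft.fft(step)
S[9:-8] = 0                                # keep bins |k| <= 8, zero the rest
y = np.fft.ifft(S).real
print(np.max(np.abs(y - step)) > 0.05)     # -> True: Gibbs ripple, not a clean pulse
```

A properly designed FIR filter applied by linear convolution avoids the wrap-around and lets the designer control the ripple, which is exactly the theory the quoted students are missing.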

If DSPing and/or DSPors Are No Longer Taught (Or with Less Emphasis), Why?

It seems there are many more things to learn in 4 years than there were 25 years ago. What was called Electrical Engineering is becoming part of all engineering disciplines, in that those disciplines now use microprocessors, wireless communications, signal processing, power management and sensor fusion as tools. Colleges teach these topics as an introduction for all engineers and don’t have time to go into a great deal of detail on why they work. We don’t particularly want to have two versions of the DSP curriculum, one for Electrical and Computer Engineers and one for everyone else. Basically, we don’t need the students to know the theory, but only the practice. As a result we are no longer graduating theologians but practitioners. (Gene wishes to point out that this is not true at Rice.) Also, DSPing is seen as a hard topic, and students conclude that, “If there are so many things to learn, let’s take the easier ones so we can graduate with a higher GPA.” Basically, DSPing is an elective and not a requirement, and DSPors are hardly taught at all.

Secondly, there is far less support for teaching DSPors than there was even a decade ago. It used to be that DSPors were a hot trend, and companies that specialized in them would provide lots of equipment and cash to set up shiny labs filled with computers connected to the latest and greatest DSPor boards. This is no longer the case. There are some universities that specialize in teaching CUDA and OpenCL programming for Graphics Processing Units (GPUs), but DSPors have fallen by the wayside in academia. We should point out that this may only be true in the USA, and some of our foreign colleagues think that the US has given up on its leadership of efficient programming of embedded systems, preferring to outsource this kind of work to places like India and China. If true, this would be a worrying situation for innovation in the US, but maybe a good thing for the rest of the world.

Is a MATLAB Level of Understanding Enough?

Yes, in many cases, if the goal is as a precursor to communications theory or control theory classes. The detail of how to design a filter or implement an FFT is left to specialized classes. Even more worryingly, the details of the tremendously important numerical rounding effects on the different ways of implementing the critical baseline functions (the FFT, matrix factorization, IIR filters and so on) may be left to very specialized and advanced classes, possibly in numerical analysis rather than signal processing. These details are hidden behind a function call in MATLAB and the double precision accuracy of that function. Not that many engineers worry about rounding effects on a daily basis, but a lack of the basic theory is surely a gap in their education, and we have seen it lead to some bad decisions in DSPing development and DSPor programming. Some universities no longer teach assembly code or low level languages like C, and there is little emphasis on designing programs to run efficiently. There are a couple of classes we know of at Stanford and MIT that teach efficient programming, and they see performance improvements at the end of the class of 1000X over the kind of code that students write coming into the class. However, the era of efficient and architecture-aware programming is not yet dead. In a recent keynote at Stanford, John Hennessy predicted that the end of Moore's Law and Dennard scaling would lead to a new era of architectures he called Domain Specific Architectures (DSAs), in alignment with the trend towards Domain Specific Languages. His point was that the future of processing, even for deep learning, was in specialization, and that programmers would have to have a deeper understanding of the architectures they are programming for. The recent paper by Google [2] on their Tensor Processing Unit seems to confirm this. Much of what is being said now was gospel for DSPors in the 1990s. So maybe we will start to teach this stuff again.
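As a concrete illustration of those rounding effects (our sketch, with arbitrarily chosen pole positions, not an example from any particular course): rounding the denominator of a narrowband IIR filter to single precision destroys the design in direct form, while factoring it into second-order sections keeps the poles in place. This coefficient sensitivity is exactly why DSPor programmers were taught to implement IIR filters as cascaded biquads:

```python
import numpy as np

# Six poles clustered near z = 1 at radius 0.995: a narrowband low-pass IIR.
r = 0.995
angles = np.array([0.10, 0.11, 0.12])
poles = np.concatenate([r * np.exp(1j * angles), r * np.exp(-1j * angles)])

# Direct form: one degree-6 denominator polynomial. Rounding its
# coefficients to float32 scatters the roots -- clustered poles make the
# polynomial's roots exquisitely sensitive to coefficient error.
a = np.poly(poles).real
mags_direct = np.abs(np.roots(np.float32(a).astype(np.float64)))
print(np.max(np.abs(mags_direct - r)))     # large: poles far from radius 0.995

# Second-order sections: quantize each conjugate pair's quadratic
# separately. Each biquad is well conditioned, so its poles barely move.
shifts = []
for th in angles:
    sos = np.float32([1.0, -2 * r * np.cos(th), r * r]).astype(np.float64)
    shifts.append(np.max(np.abs(np.abs(np.roots(sos)) - r)))
print(max(shifts))                         # tiny: each pole stays near radius 0.995
```

In MATLAB the same double precision function call hides this entirely; on a fixed-point or single precision DSPor, the difference between the two structures is the difference between a working filter and an unstable one.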

Maybe DSP Is Not Dead But Has Just Left the Building, Moving into New and More Profitable Areas?

Will would be the first to point out, as he does expertly every month in his newsletter, that DSPs are all over the place (mainly in cellular handsets), but are hidden from the view of all but a few programmers. However, old DSPing techniques are constantly being rediscovered in new areas (see the article on faster-than-Nyquist signaling in this news feed, for example), in many cases as a result of the overall advancement of technology. Big Data, Cloud Computing and the Internet of Things (IoT) in particular have expanded the application reach of DSPing concepts, introducing them to a broader group of innovative people. IoT may even lead to a resurgence in DSPors. Neural networking, for instance, just fancy adaptive filtering, used to be solidly in the DSPing camp but is now a sexy big data technology and is often introduced as a new technology. Classical DSP was based on models from physics, but machine learning is based on models from data, and hence Digital Signal Processing engineers have morphed into Data Scientists. DSPing remains on the front end of all of this, for the pre-processing of signals into data. As we said above, the end of Moore's Law and Dennard scaling may bring back DSPor-type systems with a vengeance in the next few years, maybe specifically for deep learning.
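To make the "fancy adaptive filtering" point concrete, here is a minimal sketch (ours, with made-up coefficients) of an LMS adaptive filter: a single linear neuron trained by stochastic gradient descent on squared error, which is the same update rule that drives a one-layer neural network:

```python
import numpy as np

rng = np.random.default_rng(0)

# Unknown 3-tap FIR system that the adaptive filter will identify.
h = np.array([0.5, -0.3, 0.2])
x = rng.standard_normal(4000)              # white input signal
d = np.convolve(x, h)[: len(x)]            # desired (reference) signal

# LMS: a linear "neuron" trained by stochastic gradient descent on the
# squared error e**2 -- the same rule as backprop on a one-layer net.
w = np.zeros(3)                            # adaptive weights
mu = 0.02                                  # step size (learning rate)
for n in range(3, len(x)):
    u = x[n:n - 3:-1]                      # last 3 input samples, newest first
    e = d[n] - w @ u                       # prediction error
    w += mu * e * u                        # gradient step

print(np.round(w, 2))                      # converges to h = [0.5, -0.3, 0.2]
```

Rebrand the weights as a layer, stack a few with nonlinearities in between, and the 1990s DSPing textbook chapter on adaptive filters becomes a deep learning tutorial.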

A Conclusion?

In the future, the term DSP on a resume may distinguish the creative person from the executor. But there will be new terms that also make this distinction, such as Data Science, Neural Networks, image recognition, etc. The future engineer will need to be more of a generalist, and universities have recognized this in their programs. There will still be signals and data that need to be cost-effectively processed, and hardware that needs to do that. To make that happen, new theories and practices will continue to be created, and we need qualified people to do this. Funding for DSPing, however, has moved towards Data Science, and we should probably just accept this fact and not complain. The architectural bases of DSP expanded to all processors once it was understood that the real breakthrough was the ability to do math efficiently. Now most microprocessors and microcontrollers do high performance math. The philosophy behind DSPors may be returning as the internet matures and asks for more efficient processing engines. The opportunities to use the concepts and architectures of DSP to solve new problems are mind boggling. So DSPing didn't die, but it may be living under an assumed name and even in a different department. DSPors are looking mighty sick, but we believe they are set for a miraculous rediscovery. So all you old DSP programmers, don't retire just yet. You have a new generation to teach. And long live DSP!


  1. W. Strauss, “DSP is dead; long live DSP”, EE Times online, 24 June 2002.
  2. “In-Datacenter Performance Analysis of a Tensor Processing Unit”, to appear at the 44th International Symposium on Computer Architecture (ISCA), Toronto, Canada, 26 June 2017.

Statements and opinions given in a work published by the IEEE or the IEEE Communications Society are the expressions of the author(s). Responsibility for the content of published articles rests upon the author(s), not the IEEE nor the IEEE Communications Society.


Thanks Gene and Alan, a very good, general summary of the current status of DSP. It seems that now DSP will be taught in university engineering programs somewhat like linear systems and electromagnetics: necessary background, but no longer a "career path". In addition, I'd like to emphasize that academic and commercial DSP were interdependent and inseparable. Much of the recent "de-emphasis" on DSP is due to lack of transition and modernization on the part of former industry DSP leaders. In particular I will focus on Texas Instruments, having worked closely with them for 30+ years, from the early 1980s through 2016. In 2015 Texas Instruments had a multi-$B revenue opportunity to move their DSPs into AI, both real-time inference and training. By that time, their most advanced devices were essentially multicore CPUs, each core about equivalent to an Intel Xeon E5-2260 for calculation-intensive processing. They had third-party hardware and software available, including PCIe cards with 64 c66x cores, dual GbE ports, Linux drivers with concurrent-user support (even VM support), and c66x speech and image recognition software -- all fully operational. These combined hw+sw solutions had significant competitive advantages vs. GPUs, including SIMD, mixed precision, lower PCIe latency, concurrent threads (e.g. multiuser), direct I/O (i.e. actually on the card), extremely high per-thread throughput for matrix and convolution operations, very robust Linux command-line tools, and far lower SWaP. At that time, TI was neck-and-neck with Nvidia and the FPGA guys in the eyes of customers, and it was anyone's future market for mixed-core AI processing. But ... the decision required TI executives to fully engage and promote their solutions in the server market, as without server cards and an AI roadmap they -- and their third parties -- had no business case.
The world's embedded system developers had moved to developing everything in servers and only deciding later which target embedded SoCs and cores to port to Edge and IoT products. They required "Edge Native" and open source workflows that made their development and data flow seamless with cloud applications. Basically the world had become server-first and TI was still server-optional, promoting "EVM boards", JTAG emulators, and Windows IDEs. It was an outdated strategy, woefully behind the times. Through 2016-2017 customers lost faith that TI would ever make the jump to server-centric solutions, and I watched firsthand as customers abandoned their TI chip projects, one by one. Looking back now, that three-year time frame seems to have been the end of the line for thriving commercial DSPs and a third-party ecosystem. TI continues to offer effective DSP solutions in automotive and other niche areas, but unless they rediscover executive boldness and a vision to embrace servers -- and everything that entails, including high performance PCIe cards, open source, careful, detailed software engineering (i.e. not offshoring it), AI and 5G R&D, and more -- they do not have a comeback path, and neither does DSP, at least not as we knew it.

Submitted by jbrower@signal… on 3 March 2020
