Okay, I'll chime in.
But let me preface this by saying that there has been some incredible progress in the field of error control coding in the last 15 to 20 years or so, so much so that modern forward error correction codes are pretty darn close to Shannon's limit. So in my opinion, there isn't much room left for the kind of major breakthroughs that brought fame and glory in times past. That said, there's still opportunity for incremental advancements with the new and improved communication systems of the future. That's just my opinion, though, and it really doesn't answer your question.
Any time a new communication system is developed, there may be effort put into error control coding designed specifically for that system, given system parameters such as available bandwidth, power constraints, acceptable latency, channel characteristics (AWGN vs. multipath fading, etc.), and maybe a few other things that I can't think of off the top of my head. Applications of such systems can be as simple as one gadget communicating with some other gadget (picture a flash drive communicating with a computer) or something more impressive, like a new deep space satellite communicating with Earth. If you do choose this path, realize that there are other engineers, mathematicians, and information theorists who have been around a while and want a piece of the action, so you may find some competition. Maybe instead of "competition" I should say "collaboration." Still, I feel that this doesn't really answer your question either. So now for the answer.
So what uses coding theory? Answer: pretty much any modern piece of technology that communicates with any other piece of technology uses coding theory to some extent -- sometimes a lot. I guarantee you that whatever gadget you are using to read this very post uses coding theory up the yin-yang.
All of the following utilize coding theory:

- hard disk drives (HDD), solid state drives (SSD), and flash drives;
- PCIe, or any computer bus that allows communication between different parts of the computer, such as the graphics processor and the CPU;
- USB and Bluetooth;
- CDs, DVDs, and Blu-ray;
- wireless LAN (all permutations of IEEE 802.11);
- cable modems and DSL;
- satellite dish TV;
- all modern cell phone technologies, such as CDMA, WCDMA, TD-SCDMA, WiMAX, and LTE;
- low Earth orbit satellites and geosynchronous satellites;
- and yes, deep space satellite communication systems (a tiny example of what such a code does follows below).
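None of those systems use a code this tiny, but if you've never seen what an error-correcting code actually does, here's a minimal sketch in Python of the classic Hamming(7,4) code (the function names are just mine for illustration): 4 data bits go in, 7 bits come out, and the decoder can locate and fix any single flipped bit.

```python
# Minimal sketch of the classic Hamming(7,4) block code:
# 4 data bits are encoded into 7 bits, and any single bit flip
# in the 7-bit codeword can be located and corrected.

def hamming74_encode(d):
    """d is a list of 4 data bits; returns the 7-bit codeword
    [p1, p2, d1, p3, d2, d3, d4] with even parity bits."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """c is a 7-bit received word (possibly with one bit flipped);
    returns the corrected 4 data bits."""
    c = list(c)
    # Recompute the three parity checks; together they form the "syndrome",
    # which is the 1-based position of the error (0 means no error).
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 * 1 + s2 * 2 + s3 * 4
    if syndrome:                       # nonzero => flip the offending bit back
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]    # extract the data bits

data = [1, 0, 1, 1]
codeword = hamming74_encode(data)
codeword[5] ^= 1                       # the channel flips one bit
assert hamming74_decode(codeword) == data
```

Real systems use much longer and far more powerful codes, but the basic idea of adding structured redundancy so that errors can be detected and corrected is the same.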
Oh, and ISBN, the system used to uniquely distinguish a given book [yes, an old-fashioned paper book] from other books.
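The coding-theory content of an ISBN is its check digit. As a concrete illustration, here's the ISBN-10 rule sketched in Python (the function names and the sample digits are just mine for the example): the ten digits must satisfy a weighted sum condition modulo 11, which catches every single-digit typo and every swap of two different digits.

```python
# Minimal sketch of the ISBN-10 check digit, a tiny error-detecting code.
# The ten digits d1..d10 must satisfy
#   10*d1 + 9*d2 + ... + 2*d9 + 1*d10 == 0  (mod 11),
# which catches every single-digit error and every swap of two different digits.
# ('X' stands in for a check digit of 10.)

def isbn10_check_digit(first9):
    """Compute the check character for a string of the first 9 digits."""
    total = sum((10 - i) * int(d) for i, d in enumerate(first9))
    check = (-total) % 11
    return 'X' if check == 10 else str(check)

def isbn10_is_valid(isbn):
    """Validate a 10-character ISBN (digits, last character may be 'X')."""
    values = [10 if ch == 'X' else int(ch) for ch in isbn]
    return sum((10 - i) * v for i, v in enumerate(values)) % 11 == 0

first9 = "030640615"                                      # arbitrary 9-digit prefix
isbn = first9 + isbn10_check_digit(first9)
assert isbn10_is_valid(isbn)
assert not isbn10_is_valid("1" + isbn[1:])                # single-digit typo caught
assert not isbn10_is_valid(isbn[1] + isbn[0] + isbn[2:])  # transposition caught
```

Working modulo the prime 11 is exactly why an "X" sometimes shows up as the last character of an older ISBN.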
So, there you go.
Given that you are able to read this post at all, I can think of at least three, maybe four or five, ways that your gadget is using coding theory right now to let you do so.
[Edit: I suppose you were actually asking "who," not "what." If memory serves, organizations such as NASA's Jet Propulsion Laboratory (JPL) have made major contributions to the field of coding theory over the last several decades. Private companies such as Qualcomm, among others, have made advancements as well. (Qualcomm was co-founded by Andrew Viterbi, creator of the Viterbi algorithm, after all.) And of course a very large portion of the advancement of the field is done by academia at universities.]