I have two issues with this suggestion of consciousness or subjective experience:
First, the search criteria:
This article, and at least one other that it cites (#21, "Neural correlate of subjective sensory experience gradually builds up across cortical areas"), implicitly ties "subjective" or "conscious" experience to the information processing that ultimately supports such an experience.
Let's take a situation where you are browsing the internet and run across a very alarming report - and as a result you feel subjectively alarmed. Would it be useful to point to the internet, or your browser software, or the electronics in your computer as critical components of this subjective experience? I would say not.
What about the basic neural image processing and language skills required to read and decode the alarming report? Again, I would say not.
What about when you tie that report to other knowledge you remember and deduce that it could have a bearing on your personal well-being? Here I would say perhaps, but probably not precisely. At that level of information processing, the content of consciousness still does not quite match the information environment. I would not argue that absolutely nothing conscious propagates at this level, but even if something does, it would not be the kind of consciousness we are familiar with.
Once the alarming situation has been identified as personally important, what about the process of discovering possible activities (sharing the news, buying up toilet paper, whatever) that could predictably improve the situation? Here I would say "yes". That is the stage of information processing where the content being processed closely matches the content of our subjective experience.
Second, the character of what is shared with mammals:
This one is tougher to describe. But let's say I found exactly where in the human brain consciousness occurs - the precise "gate". (As an aside: I am not suggesting that we have only one such gate, but I do suggest that only one is needed at a time.)
If I then took that gate, replicated it, and incorporated it into my own circuit design, I could claim that such a circuit had a "subjective experience" - even if its sole purpose in the circuitry was relatively mundane - say, generating improved Google search results.
So my complaint about the article is that it does not address this, leaving it up to the reader to imagine what it might be like to be that bird - when in fact, even if there is a subjective element in the bird's information processing, that would not make the bird's subjective experience (presuming there is one) similar to a mammal's.