I think it is not simply interesting; it is also very encouraging, and it seems to be supported by facts on the ground.
The general image of Christianity is that it is judgmental, hypocritical, too political, and out of touch with reality, and that image seems to be a direct reflection of the Truth™ of Christianity as so often practiced and marketed in the US.
The encouraging thing is that as that perception takes hold, particularly among Christians themselves, perhaps a new reformation will take place.
There are many indications that this is exactly what is happening.
Aslan is not a
Tame Lion