At the start of Obama's presidency, it was said he wanted to shift America's focus from looking across the Atlantic to looking across the Pacific. Trump seems intent on painting Europe as the enemy. The differences in approach to religion and gun control, to cite two high-profile examples, are growing. Is the West, as a unified concept, dead?