Would you agree that JFK's assassination was the end of America? At least, of the America that is often romanticized as beautiful, free, and mighty; the America invoked by naive but well-meaning Republicans? His death and the ensuing radical social change of the '60s seem to have dealt a fatal blow to the America we once knew.
I've been doing a lot of contemplation and research on this particular vision of the USA, which now seems like a dimly remembered dream; a nation fabled like a kingdom long swallowed by twilight. It's so easy to slip into "land of the free, home of the brave" when really we are neither of those anymore. Maybe I'm looking at this from too philosophical a perspective, though, and Veeky Forums can fill me in with a more objective basis.
If it matters at all, I'm a far-right libertarian who believes the Unabomber did nothing wrong: I strongly believe that humanity must dismantle the industrial system if we want to survive to 2100. I'll also disclose that I like Trump for his belligerent persona, but I don't see his administration improving the nation by any significant amount. Hopefully my question and position make sense.