Has society always been bad?
Has society always been this fucked up?
Where trips to the grocery store cause paranoia that every person behind you is trying to kidnap your children?
Where going for a run by yourself, outside, means having to take self-defense classes or carry a weapon to fend off rapists?
Where trick-or-treating in neighborhoods isn’t safe anymore?
Where people with children abuse, neglect, and kill them?
Where you can’t go to the mall without wondering whether someone in there is angry enough to start shooting at any minute?
Where the news is nonstop reporting the negativity going on in the world?
Has society always been this scary to live in? Or is it just that we’re hearing more about it now?
Does the media feed into mental illnesses such as depression, social anxiety, and paranoia?
With all the negative shit we hear and see, how can we live in a way where we don’t have to be afraid of every stranger we meet?
Has it always been this bad? Will it ever get better? Can it get better?