Wednesday, October 22, 2008

When did America change?

Alright, we are going to show our ignorance, and we would like those in the know to shed some light on a few things. We appreciate all input and all opinions, as we are very concerned about the overall perceptions of people in general.
When did the government of the United States of America decide that we do not have the free will to make choices that affect only us? We mean, why is there a LAW that says we have to wear a seat belt when we are in a motor vehicle? The only entities that should care whether or not we wear a seat belt are our health insurance company and our auto insurance company, if we carry personal injury coverage. That’s it. And they shouldn’t be able to tell us we have to wear one. They may charge us differently for wearing it or not, but it will be our decision whether or not to wear it. Did the government “kowtow” to the insurance companies and pass this law to appease them? If so, when did the government start taking orders from industry? We hear you say, “They did it to reduce health costs and drains on Medicaid / Medicare / Social Security.” When did the government become a wet nurse? When did the people of the United States quit taking responsibility for their actions? Why is it that every time there is a problem, the government is supposed to be the fixer?
And when did the government start TELLING us what is in our best interest? If we are not mistaken, the politicians are OUR PROXIES! They are there to forward OUR OPINIONS and DESIRES. When did they get the idea that they ruled the PEOPLE and the PEOPLE bowed to THEM? Or are they trying to run the country because WE just don’t want the RESPONSIBILITY of THINKING FOR OURSELVES? Have WE become a nation content to let others lead? To have others tell us what is right and what is wrong? To have the freedom to choose and the freedom to act taken away from us because we just can’t be bothered to think for ourselves?
WE THE PEOPLE OF THE UNITED STATES OF AMERICA, not THOSE WE ELECT, are the government. Those we elect are our proxies. When we forget that, WE are responsible. When WE, through free will or negligence, give away OUR authority to others, WE will have no rights other than those handed out at the discretion of others.
And speaking of people, when did they change? When did they go from “I can work and achieve a better life” to “give me some of yours so we can be equal”? When did the socialist attitude of “I deserve and need” replace “I can strive and achieve”? We remember a time when a person would take any job available to avoid going on “assistance.” Now people have no qualms about taking “assistance.” People would rather take a handout than take a job “below them.” Where did kids today get the idea that the world owes them? That without education or experience, they deserve six-figure jobs? That someone will always be there to pick them up, dust them off, and tell them everything will be OK? Why is reality a foreign concept to them? Is it parents wanting to be friends and not parents? Is it parents who work all day, come home, and don’t want to be bothered with raising kids? Is it parents who spend so little TIME with their kids that they basically let the kids run wild, bailing them out of any trouble and coming to their defense whenever they screw up, because they feel guilty about not being there the rest of the time? Is it because the kids have never learned that there are consequences for their actions? And when did we start rewarding kids just for participating? What’s the incentive to get better? We just don’t understand. That is probably why we decided not to have kids.
There is more to come, but we have run out of time; the rant will continue as soon as we get back.
Until then, tell us what you think. Have you noticed these things? Or is it just our imagination?
Posted by Running Behind at 4:25 PM