Does anyone really believe that the country will be any better off when Republicans control Congress? As long as lawyers make and alter the law, politicians are controlled by contributors, and Americans allow it, there will never be long-term improvement in the condition of this country.
The President is slowly taking away Congress' authority and bypassing it by appointing czars. He is choosing Supreme Court Justices who favor government instead of the Constitution. Government is taking over more and more of the private sector, even though it can't run the post office efficiently. It is the only entity in this country legally able to extort money out of business.
Unions have historically been linked to organized crime, but now they are partners with our government. Along with the National Guard, ACORN and the Boy Scouts, everything should be in place to monitor the workplace, businesses and the personal lives of American families.
People who are poor, sick and hungry are much more easily controlled. Dependence on our government is the quickest and easiest way to live in poverty, become unemployed and homeless, and starve. If you don't believe me, just look around. Spend a day at the grocery store and watch people shop; you would be surprised at what you see.
Republican politicians will not fix America's problems. Only the American people can fix them, and it won't be pretty. The problem seems to be that by the time they get around to addressing it, repair may not be an option.
But as always, I could be wrong.