For decades, one of the promises of the American way has been that each generation could assume its children would be better off. In the era of Joe Biden, that bright, shining hope is no more.
The sad state of America was revealed in a new Wall Street Journal-NORC Poll that was published this week.
The survey found that Americans are negative on nearly every aspect of American life and believe their children will be worse off, not better off.
According to the paper’s report, one woman’s response was typical:
“No matter how much they increase your pay, everything else is going up,” said Kristy Morrow, a coordinator for a hospital who lives in Big Spring, Texas. “I do fear that for the kids.”
Ms. Morrow, 37, said she’s concerned her children will be worse off because deep divisions in America have left people unable to fix the country’s problems. The single mother of two young boys and an adult daughter, who earns about $45,000 a year, said she traded her Chevrolet Tahoe for a GMC Terrain to lower her gas costs and is teaching her boys the importance of spending money on needs, not wants.
The paper adds this very worrisome paragraph:
For more than three decades, NORC has asked Americans whether life for their children’s generation will be better than it has been for their own using its General Social Survey. This year 78% said they don’t feel confident that is the case, the highest share since the survey began asking the question every few years in 1990. White respondents were more likely to say they are not confident than Black and Hispanic respondents.
One reason Americans worry the next generation will fall behind is that they are losing faith in the power of a college education to move them up the economic ladder. Some 56% of respondents said that a four-year college degree wasn’t worth the cost because people often graduate without specific job skills and with heavy debt. Meanwhile, 42% of respondents said it was worth it because people have a better chance to get a good job and earn more. That marked a reversal from the last time the question was asked in 2017, when a narrow plurality viewed college as worth the investment.
Since the day this country was founded, Americans have expected that their children would be better off. And until now, they were right. America steadily improved in every way since the founding… until the last 30 years or so, when the Democrat Party went full-out socialist.
Democrats are responsible for ALL of this, folks.
The Democrat Party is thoroughly destroying this nation in every single sector.