Are we really surprised that there's a correlation between winning percentage and times ranked? Rankings' #1 factor is wins and losses. Preseason polls generally factor in previous season results, so successful teams also carry high expectations. It's no surprise that Notre Dame is more often "overrated," because they've done well. The most "overrated" teams are also the most successful teams. So yes, winning teams are also the ones that get ranked high, and sometimes too high.
The reason weekly ratings are worthless for viewing historical success is that end-of-season ratings are a much better way to see how successful a season was.
In 1984, Texas was ranked in the top 5 10 times. They finished unranked.
In 1972, Texas was ranked in the top 5 once. They finished at #3.
Which season was more successful? Obviously 1972. How do we know? Because of the final rank. If we go by "number of times in the top 5," we'd say 1984, and be wrong.
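To make the contrast concrete, here's a minimal sketch of the two metrics applied to the Texas numbers above. The data structure and variable names are purely illustrative, not from any real poll database:

```python
# Texas seasons from the example above.
# weeks_top5 = times ranked in the top 5 during the season;
# final_rank = final poll position (None means finished unranked).
seasons = {
    1984: {"weeks_top5": 10, "final_rank": None},
    1972: {"weeks_top5": 1,  "final_rank": 3},
}

# Metric 1: "times ranked top 5" -- higher count wins.
by_weeks = max(seasons, key=lambda y: seasons[y]["weeks_top5"])

# Metric 2: final poll position -- lower is better; unranked sorts last.
by_final = min(seasons, key=lambda y: seasons[y]["final_rank"] or 999)

print(by_weeks)  # picks 1984, the season that finished unranked
print(by_final)  # picks 1972, the actual better season
```

The point of the sketch: the two metrics disagree on the same data, and only the final-poll metric matches the obvious answer.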
Now, maybe Texas looked and played like a top 5 team more often in 1984. Teams get better or worse during the season. Maryland almost climbed into the rankings this year, probably deservedly so, but injuries killed them. All in all, their season was deserving of a non-ranking, despite the promise they showed. So, while a given week's ranking can be worthwhile, measuring the season as a whole by counting up times ranked is pure guesswork, which is unnecessary when you have the final poll.
So, "times ranked" is a worthless measure in terms of yearly success.