I want your opinions on the lack of increase in Pinterest's PR
-
Many months ago, a fellow marketer at my company introduced me to Pinterest, claiming that it would be good for our business. Pinterest was virtually unknown just a few short months ago. Since then, I have seen it take off like wildfire, with extensive media coverage, registrations, and people putting the button on their sites. It must have thousands more backlinks now than it did six months ago--high quality ones too, as it's had coverage in virtually every major news media outlet.
I want your opinion as to why it has remained a PR6 site this entire time. It was PR6 then and it's still PR6 now. I know the increase in PR is algorithmic, but come on! Can people share the experiences they've had building links for those higher-PR sites? How much harder does it get?
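One caveat that may partly answer the question: toolbar PR is widely believed to be roughly logarithmic, so each step up the 0-10 scale requires something like an order of magnitude more link equity than the previous one. Here's a toy sketch of that idea (the base-10 scale and the "link value" numbers are illustrative assumptions, not anything Google has published):

```python
import math

def toolbar_pr(link_value):
    """Map a hypothetical raw 'link value' score onto the 0-10 toolbar
    scale, assuming a logarithmic (base-10) relationship. Both the
    scale and the base are guesses for illustration only."""
    if link_value < 1:
        return 0
    return min(10, int(math.log10(link_value)))

# Under these assumptions, a site with 2,000,000 units of link value
# and one with 9,000,000 units both display as PR6:
print(toolbar_pr(2_000_000))   # 6
print(toolbar_pr(9_000_000))   # 6
print(toolbar_pr(10_000_000))  # 7 -- needs 5x more than the first site
```

If anything like that mapping holds, a site could multiply its link equity several times over and still show the same green bar, which would be consistent with Pinterest gaining thousands of quality links without the toolbar number budging.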
-
So here's something interesting. If those PR toolbars are so far behind, why is it already showing http://www.buildmyrank.com/ as a PR0 when it was only de-indexed by Google a couple of weeks ago?
-
Due to the fact that Google's PageRank (PR) is updated so irregularly, many people use other metrics, such as SEOmoz's Page Authority (PA) and Domain Authority (DA), to really track a site's progress.
PR is just a number we show to clients when they ask; focus on PA and DA instead, and explain to clients why those numbers are much more important. They're updated monthly and are pretty much in line with Google's algorithms.
-
Sometimes it's been as long as six months between toolbar PageRank updates.
Nope, no way to check the real PR. The answer you'll get from many people is to ignore those green pixels and carry on with other things.
-
Oh! A few times per year! I didn't know that. I figured it was once a month or maybe once every other month. So would the same be true of a site like this? http://www.prchecker.info/
Is there no way to check the real PR?
-
Keep in mind you're only seeing the toolbar PageRank, which updates only a few times a year and isn't an exact reflection of the constantly updated real PR that Google calculates internally.
-
This is a really interesting question. I'd never really thought about it.
Perhaps it's just Google being spiteful because people use it more than G+. (That's not really what I think, but I don't understand why.)
Maybe Google throttles back PR increases for new sites. Since Pinterest is relatively new, Google may not let it achieve a PR greater than 6 for its first couple of years, even though it has many good-quality links!? This is just wild speculation, by the way.