Should I be deleting spam trackbacks/pingbacks?
Related Questions
Rel="prev" / "next"
Hi guys, the tech department implemented rel="prev" and rel="next" on this website a long time ago. We also added a self-referencing canonical tag to each page. We're talking about the following situation: https://bit.ly/2H3HpRD. However, we still see a lot of the paginated pages in the SERP. Is this just a case of rel="prev" and rel="next" being hints rather than directives to Google, with Google deciding in this specific case not to show only the first page in the SERP, but most of the paginated pages as well? Please let me know what you think. Regards, Tom
Intermediate & Advanced SEO | AdenaSEO
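For reference, a typical implementation of the setup described above, on page 2 of a series (illustrative URLs), would look like:

```html
<!-- <head> of page 2 in a paginated series (illustrative URLs) -->
<link rel="canonical" href="https://example.com/category/page/2/" />
<link rel="prev" href="https://example.com/category/" />
<link rel="next" href="https://example.com/category/page/3/" />
```

Note that rel="prev"/"next" were always hints rather than directives, and Google announced in 2019 that it no longer uses them as an indexing signal at all, which would explain paginated pages continuing to appear in the SERP.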
Staging/Development Site Indexed?
So, my company's site has been pretty tough to get moving in the right direction on Google's SERPs. I had believed that this was mainly due to a shortage of backlinks and a horrible home page load time; everything else seems to be set up pretty well. I was messing around with the site: Google search operator for our staging site and found stage.site.com and a lot of our other staging pages in the search results. I have to think this is the problem, causing a duplicate-content penalty across the entire site. I guess I now need to 301 redirect the entire staging site? Has anyone had this issue before and fixed it? Thanks for any help.
Intermediate & Advanced SEO | aua
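A common way to keep a staging site out of the index without 301-redirecting it (a sketch for an Apache vhost, assuming stage.site.com runs on Apache) is to send a noindex header and, ideally, require a login:

```apache
# Staging vhost only - never deploy these lines to production
Header set X-Robots-Tag "noindex, nofollow"

# Optional but recommended: crawlers can't index what they can't fetch
AuthType Basic
AuthName "Staging"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user
```

Once the staging URLs drop out of the index, the production site is no longer competing with its own duplicates.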
How do we decide which pages to index/de-index? Help for a 250k page site
At Siftery (siftery.com) we have about 250k pages, most of them reflected in our sitemap. Though after submitting the sitemap we started seeing an increase in the number of pages Google indexed, in the past few weeks progress has slowed to a crawl at about 80k pages, and in fact the count has been coming down very marginally. Due to the nature of the site, a lot of the pages likely look very similar to search engines. We've also broken our sitemap into a sitemap index, so we know that most of the indexation problems come from a particular type of page (company profiles). Given these facts, what do you recommend we do? Should we de-index all of the pages that are not being picked up by the Google index (and are therefore likely seen as low quality)? There seems to be a school of thought that de-indexing "thin" pages improves the ranking potential of the indexed pages. We have plans for enriching and differentiating the pages that are being picked up as thin (Moz itself flags them as 'duplicate' pages even though they're not). Thanks for sharing your thoughts and experiences!
Intermediate & Advanced SEO | ggiaco-siftery
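If de-indexing the thin company-profile pages is the chosen route, the usual mechanism is a robots meta tag on those page templates (a sketch; the equivalent X-Robots-Tag response header works for non-HTML resources):

```html
<!-- On each company-profile page to be dropped from the index -->
<meta name="robots" content="noindex, follow" />
```

The "follow" keeps link equity flowing through the de-indexed pages to the ones you want ranked.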
Deleting Outdated News Pages?
Hi everyone, I'm currently doing a full content audit for my company in preparation for a website redesign. I've discovered thousands of pages (dating all the way back to 2009) with thin, outdated, and irrelevant content, i.e. real estate news and predictions that are now very old news. According to analytics, these older pages aren't receiving any traffic, so I think the best course of action is to delete them and let the URLs return a 404 (or 410). In my opinion, this should be a big priority, because these pages are likely already hurting our domain authority to some extent, and it's just a matter of time before we're really penalized by Google. Some members of my team have a different opinion: they worry that deleting 1,000 pages could hurt our rankings, and they want to wait and discuss the issue further in Q3 or Q4 (once the site redesign is completed and we have time to focus on it). Am I wrong to think that waiting is a very bad idea? Google will notice that we've done a major site redesign (all new copy, UX and content organization optimized to make info easier to find, new lead magnets, optimized images, etc.), but we didn't bother to update 1,000 pages of outdated content that no one is looking at... won't that look bad? Do you agree that we should delete/merge all outdated content now rather than waiting until after the site redesign? Or am I overreacting? Thanks so much for your help!
Intermediate & Advanced SEO | JCon711
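If the pages are simply removed, a 410 ("Gone") makes the removal explicit to crawlers, whereas a 404 leaves it ambiguous. A sketch in .htaccess, with illustrative paths:

```apache
# Mark deleted news posts as permanently gone (410) rather than missing (404)
Redirect gone /news/2009/market-predictions
RedirectMatch gone ^/news/2009/.*$
```

(A 404 is a response status, not a redirect, so deleted pages need nothing "added" beyond removal; the 410 is an optional refinement that can speed up de-indexing.)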
Does anyone know how dynamic/personalized website content affects SEO?
A client using Marketo has asked about personalizing their website content based on a persona. To be clear, I'm talking about key website pages, maybe even the home page, not PPC/campaign-specific landing pages. For example, areas of the site would display content differently to a CEO vs. a salesperson. I'm new to marketing automation and don't know exactly how this piece works. Hoping someone here has experience or can provide pros/cons guidance. How would search engines handle this type of page? Here's Marketo's page explaining what it does: https://docs.marketo.com/display/public/DOCS/Web+Personalization+-+RTP
Intermediate & Advanced SEO | Flock.Media
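Broadly, search engines index whatever default HTML the server sends them, so the safest pattern is to serve complete, indexable default content and let the personalization layer swap it in the browser after load. An illustrative sketch (hypothetical markup and variable names, not Marketo's actual API):

```html
<!-- Default content: this is what crawlers index -->
<section id="hero">
  <h1>Grow your pipeline</h1>
</section>
<script>
  // Hypothetical client-side swap for a known persona
  if (window.visitorPersona === "ceo") {
    document.querySelector("#hero h1").textContent =
      "Executive insights at a glance";
  }
</script>
```

As long as every visitor (including a crawler) can receive the default variant, this is personalization rather than cloaking.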
Duplicate / "<title> element too long" issues
I have a "duplicate <title>" / "<title> element too long" issue with thousands of pages. In the future I would like to automate these in a way that keeps them from being duplicated and too long. The solution I came up with was to standardize these monthly posts with a similar, shorter <title>, and then differentiate by adding the month and the year of the post at the end of each one. Hundreds of these come out every week, so it is hard to sit there and come up with a unique <title> every time. With this solution the <title> tags would undoubtedly be short enough; however, my primary concern is: would simply adding the month and year at the end of each <title> be enough for Google/Moz to decide it is not a duplicate? How much variation is enough for it not to be deemed a duplicate <title>?
Intermediate & Advanced SEO | Brian_Dowd
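One way to automate this is a small helper (a sketch with hypothetical names, assuming each post has a date) that truncates a shared base title to fit a length budget and then appends the month and year:

```python
from datetime import date

def build_title(base: str, post_date: date, max_len: int = 60) -> str:
    """Build a <title> at most max_len characters, suffixed with the
    post's month and year to differentiate recurring monthly posts."""
    suffix = post_date.strftime(" - %B %Y")    # e.g. " - March 2024"
    room = max_len - len(suffix)
    if len(base) > room:
        base = base[:room - 1].rstrip() + "…"  # truncate with an ellipsis
    return base + suffix

title = build_title(
    "Monthly Market Report for Residential Real Estate Listings",
    date(2024, 3, 1),
)
```

Whether month/year alone is "enough variation" is ultimately Google's call, but it does guarantee that every monthly title is unique within its series.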
SEO within the URL /
If I were optimizing for 'marketing success' and my URL structure was domain.com/marketing/success would that count? I'm not sure if the '/' affects the keyword term. My assumption is that it does, but I wasn't 100% sure. Thanks!
Intermediate & Advanced SEO | KristinaWitmer
Restructuring/Removing 301 Redirects Due To Newly Optimized Keywords
Just to be clear, this is for one unique page on a website (please see my diagram attached). Let's say a page's URL was originally /original. You optimize the page for a new keyword (keyword 1) and therefore change the URL to /keyword-1. A 301 redirect would then be placed: /original > /keyword-1. However, let's say six months down the road you realize that the keyword you optimized the page for (keyword 1) just isn't working. You research a new keyword and come up with keyword 2, so you'd like to rename the page's URL to /keyword-2. After placing a redirect from the current page (/keyword-1) to the new page (/keyword-2), it would look like this: /original > /keyword-1 > /keyword-2. We know that chaining redirects adds latency, and more 'link juice' is lost along the way. Because of this, would it make sense to remove the original redirect and instead place redirects like this? /original > /keyword-2 and /keyword-1 > /keyword-2. To me, this would make the most sense for preserving SEO. However, I've read that removing 301 redirects can cause user issues due to browsers caching the now-removed redirect. Even if this is ideal for SEO, could it be more work than it's worth? Does anyone have experience/input on this? If so, I greatly appreciate your time!
Intermediate & Advanced SEO | LogicalMediaGroup
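The flattened setup described above would be two independent one-hop redirects, e.g. in .htaccess (illustrative slugs):

```apache
# Both legacy URLs point straight at the current slug - no chain
Redirect 301 /original  /keyword-2
Redirect 301 /keyword-1 /keyword-2
```

Each old URL then resolves in a single hop; visitors whose browsers cached the old /original > /keyword-1 hop still land on /keyword-2 via the second rule.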