Impact of wiping content on a subdomain
-
Hi,
I've been asked to assess the impact of bulk-deleting content on a blog subdomain, and how doing so could affect the SEO of the linked www subdomain.
Can deleting content on one subdomain have a negative impact on other linked subdomains?
Thanks
-
I am not 100% in agreement with Alick. Google struggles a bit with subdomains: it sometimes treats them as separate websites and sometimes as the same website. John Mueller confirmed as much to Distilled around 12 months ago.
I would be cautious about going down the suggested path. Could you not migrate the content and 301-redirect the old URLs instead, using a one-to-one redirect map like the sketch below?
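A minimal sketch of generating Apache redirect rules from a mapping file, assuming a hypothetical redirect_map.csv with old_path and new_url columns (the file name and columns are illustrative, not from this thread):

```python
# Sketch only: build one-to-one 301 redirect rules for the blog subdomain.
# Assumes a hypothetical redirect_map.csv with columns old_path,new_url,
# e.g. "/2014/06/some-post,https://www.example.com/blog/some-post".
import csv

def build_apache_rules(csv_path: str) -> list[str]:
    """Turn each old blog path into an Apache RedirectPermanent rule."""
    with open(csv_path, newline="") as fh:
        return [
            f'RedirectPermanent {row["old_path"]} {row["new_url"]}'
            for row in csv.DictReader(fh)
        ]

if __name__ == "__main__":
    # Paste the output into the blog subdomain's Apache config or .htaccess.
    for rule in build_apache_rules("redirect_map.csv"):
        print(rule)
```

A one-to-one map like this preserves link equity far better than blanket-redirecting every old post to the www homepage.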
Hope that helps.
-
Hi,
AFAIK Google treats a subdomain as an independent site, so deleting content shouldn't have any impact on the linked www subdomain, but I would suggest waiting for more responses from other community members.
Second, before deleting any content, check Google Analytics to see whether those posts are still getting traffic; if a post draws significant traffic, you shouldn't delete it.
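A minimal triage sketch, assuming a hypothetical Analytics CSV export with page and sessions columns (real export headers vary, so adjust the column names and threshold to your data):

```python
# Sketch only: flag low-traffic posts from a Google Analytics CSV export.
# File name, column names, and threshold are illustrative assumptions.
import csv

TRAFFIC_THRESHOLD = 10  # sessions over the export window; pick your own cutoff

def low_traffic_pages(export_path: str) -> list[str]:
    """Return pages whose session count falls below the threshold."""
    with open(export_path, newline="") as fh:
        return [
            row["page"]
            for row in csv.DictReader(fh)
            # Strip thousands separators like "1,234" before converting.
            if int(row["sessions"].replace(",", "")) < TRAFFIC_THRESHOLD
        ]

if __name__ == "__main__":
    for page in low_traffic_pages("analytics_export.csv"):
        print(page)
```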
Thanks
Related Questions
-
Duplicate content site not penalized
Was reviewing a site, www.adspecialtyproductscatalog.com, and noted that even though automated crawls found over 50,000 total issues, including 3,000 pages with duplicate titles and 6,000 with duplicate content, this site still ranks high for primary keywords. The same essay's worth of content is pasted at the bottom of every single page. What gives, Google?
White Hat / Black Hat SEO | KenSchaefer
-
Without prerender.io, is Google able to render & index geographical dynamic content?
One section of our website is built as a single-page application and serves dynamic content based on geographical location. Before I got here, we had used prerender.io so Google could see the page, but now that prerender.io is gone, is Google able to render & index geographical dynamic content? I'm assuming no. If no is the answer, what are some solutions other than converting everything to HTML (which would be a huge overhaul)?
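Googlebot can render JavaScript, but rendering is deferred and Googlebot crawls mostly from US IP addresses, so geo-dependent content is risky to rely on. One common workaround is dynamic rendering: serve a prerendered HTML snapshot to known crawlers while regular visitors get the SPA. A minimal Flask sketch, where the route, snapshot paths, and bot list are all illustrative assumptions:

```python
# Sketch only: dynamic rendering. Crawlers get a prerendered snapshot;
# everyone else gets the SPA shell. Route and file paths are hypothetical.
from flask import Flask, request, send_file

app = Flask(__name__)
BOT_TOKENS = ("googlebot", "bingbot")  # illustrative, not exhaustive

@app.route("/locations/<region>")
def locations(region: str):
    agent = request.headers.get("User-Agent", "").lower()
    if any(token in agent for token in BOT_TOKENS):
        # Crawler: serve static HTML rendered ahead of time for this region.
        return send_file(f"snapshots/{region}.html")
    # Regular visitor: serve the SPA shell; JS loads geo content client-side.
    return send_file("index.html")
```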
White Hat / Black Hat SEO | imjonny123
-
How do I optimize pages for content that changes every day?
Hi guys, I run daily and weekend horoscopes on my site. The daily horoscopes change every day for obvious reasons, and the weekend horoscopes change every weekend. However, I'm stuck on how the pages need to be structured, and I also don't know how I should go about creating title tags and meta tags for content that changes daily. Each daily and weekend entry creates a new page. Here you can see today's horoscope: http://bit.ly/1FV6x0y. Since our weekend horoscopes cover Friday, Saturday, and Sunday, there is no daily for Friday, so it shows duplicate pages across Friday, Saturday, and Sunday. If you click on today, tomorrow, and weekend, all pages showing are duplicates, and this happens for each star sign from Friday to Sunday. My question is: will I be penalized for doing this, even if the content changes? How can I optimize the title tags and meta tags for pages that are constantly changing? I'm really stuck on this one and would appreciate some feedback on this tricky beast. Thanks in advance
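One way to keep date-driven pages from colliding is to stamp the date into each title tag, and have the shared Friday-to-Sunday entries all carry one weekend title (with a canonical pointing at a single weekend URL). A small sketch, where the title wording and format are just an illustration:

```python
# Sketch only: date-stamped title tags so each day's horoscope page gets a
# unique, descriptive title. Wording and format are illustrative.
from datetime import date

def daily_title(sign: str, day: date) -> str:
    """e.g. 'Aries Daily Horoscope - 14 June 2015'."""
    return f"{sign} Daily Horoscope - {day:%d %B %Y}"

def weekend_title(sign: str, friday: date) -> str:
    """One shared title for the Friday-to-Sunday page, anchored on the Friday."""
    return f"{sign} Weekend Horoscope - Weekend of {friday:%d %B %Y}"

if __name__ == "__main__":
    print(daily_title("Aries", date.today()))
    print(weekend_title("Aries", date.today()))
```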
White Hat / Black Hat SEO | edward-may
-
How do I make a content calendar to increase my rank for a key word?
I've watched more than a few seminars on having a content calendar. Now I'm curious what I would need to do to increase a client's ranking for a specific keyword in local SEO. Let's say I wanted to help them increase their rank for "used trucks" in Buffalo, NY. Would I regularly publish blog posts about used trucks? Thanks!
White Hat / Black Hat SEO | oomdomarketing
-
Duplicated content
I have someone writing new pages for my site. How do I know the pages she is writing are not duplicated from another website? Is there a website or piece of software to do this? What is the best way to check? Thank you
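Paid services such as Copyscape do this check at web scale. For a rough local comparison of a draft against a handful of suspect pages you've saved as text, something like this sketch works (file names are illustrative):

```python
# Sketch only: rough originality check comparing a draft against saved
# copies of suspect pages. A high ratio warrants a manual look.
from difflib import SequenceMatcher
from pathlib import Path

def similarity(a: str, b: str) -> float:
    """Return a 0..1 ratio of how much two texts overlap."""
    return SequenceMatcher(None, a, b).ratio()

draft = Path("draft.txt").read_text()
for name in ["suspect_page1.txt", "suspect_page2.txt"]:
    score = similarity(draft, Path(name).read_text())
    print(f"{name}: {score:.0%} similar")
```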
White Hat / Black Hat SEO | SinaKashani
-
Content within a toggle, Juice or No Juice?
Greetings Mozzers, I recently added a significant amount of information within a single page, utilizing toggles that hide the content until a user clicks to reveal it. Since the code technically reads "display:none" to start, would that be considered "black hat" or "not there" to crawlers? It isn't displayed in any sort of spammy way; the toggles were utilized more for the UX of the visitor. Thoughts and advice are greatly appreciated!
White Hat / Black Hat SEO | MonsterWeb28
-
Syndicated content outperforming our hard work!
Our company (FindMyAccident) is an accident news site. Our goal is to roll our reporting out to all 50 states; currently, we operate full-time in 7 states. To date, the largest expenditure is our writing staff. We hire professional journalists who work with police departments and other sources to develop written content and video for our site. Our visitors also contribute stories and/or tips that add to the content on our domain. In short, our content/media is 100% original.
A site that often appears alongside us in the SERPs in the markets where we work full-time is accidentin.com. They are a site that syndicates accident news and offers little original content. (They also allow users to submit their own accident stories, and the entries index quickly and are sometimes viewed by hundreds of people in the same day. What's perplexing is that these entries are isolated incidents that have little to no media value, yet they do extremely well.) (I don't rest my bets with Quantcast figures, but accidentin does use their pixel sourcing, and the figures indicate that they are receiving up to 80k visitors a day in some instances.)
I understand that it's common to see news sites syndicate from the AP, etc., and traffic accident news is not going to have a lot of competition (in most instances), but the real shocker is that accidentin will sometimes appear as the first or second result above the original sources??? The question: does anyone have a guess as to what is making it perform so well? Are they bound to fade away? While looking at their model, I'm wondering if we're not silly to syndicate news in the states where we don't have actual staff? It would seem we could attract more traffic by setting up syndication in our vacant states. OR Is our competitor's site bound to fade away?
Thanks, gang, hope all of you have a great 2013!
Wayne
White Hat / Black Hat SEO | Wayne76
-
My attempt to reduce duplicate content got me slapped with a doorway page penalty. Halp!
On Friday, 4/29, we noticed that we suddenly lost all rankings for all of our keywords, including searches like "bbq guys". This indicated to us that we are being penalized for something. We immediately went through the list of things that changed, and the most obvious is that we were migrating domains.
On Thursday, we turned off one of our older sites, http://www.thegrillstoreandmore.com/, and 301 redirected each page on it to the same page on bbqguys.com. Our intent was to eliminate duplicate content issues. When we realized that something bad was happening, we immediately turned off the redirects and put thegrillstoreandmore.com back online. This did not unpenalize bbqguys.
We've been looking for things for two days, and have not been able to find what we did wrong, at least not until tonight. I just logged back in to Webmaster Tools to do some more digging, and I saw that I had a new message: "Google Webmaster Tools notice of detected doorway pages on http://www.bbqguys.com/"
It is my understanding that doorway pages are pages jammed with keywords and links and devoid of any real content. We don't do those pages. The message does link me to Google's definition of doorway pages, but it does not give me a list of pages on my site that it does not like. If I could even see one or two pages, I could probably figure out what I am doing wrong.
I find this most shocking since we go out of our way to try not to do anything spammy or sneaky. Since we try hard not to do anything that is even grey hat, I have no idea what could possibly have triggered this message and the penalty.
Does anyone know how to go about figuring out which pages specifically are causing the problem, so I can change them or take them down? We are slowly canonicalizing URLs and changing the way different parts of the sites build links to make them all the same, and I am aware that these things need work. We were in the process of discontinuing some sites and 301 redirecting pages to a more centralized location to try to stop duplicate content. The day after we instituted the 301 redirects, the site we were redirecting all of the traffic to (the main site) got blacklisted. Because of this, we immediately took down the 301 redirects.
Since the Webmaster Tools notifications are different (i.e., too many URLs is a notice-level message and doorway pages is a separate alert-level message), and the too-many-URLs notice has been triggering for a while now, I am guessing that the doorway pages problem has nothing to do with URL structure. According to the help files, doorway pages is a content problem with a specific page. The architecture suggestions are helpful and they reassure us that we should be working on them, but they don't help me solve my immediate problem.
I would really be thankful for any help we could get identifying the pages that Google thinks are "doorway pages", since this is what I am getting immediately and severely penalized for. I want to stop doing whatever it is I am doing wrong; I just don't know what it is! It feels like we got penalized for trying to do what we think Google wants. If we could figure out what a "doorway page" is, and how our 301 redirects triggered Googlebot into saying we have them, we could more appropriately reduce duplicate content. As it stands now, we are not sure what we did wrong. We know we have duplicate content issues, but we also thought we were following webmaster guidelines on how to reduce the problem, and we got nailed almost immediately when we instituted the 301 redirects. Thanks for any help identifying the problem!
White Hat / Black Hat SEO | CoreyTisdale