Is there a way to prevent Google Alerts from picking up old press releases?
-
I have a client that wants a lot of old press releases (PDFs) added to their news page, but they don't want these to show up in Google Alerts. Is there a way for me to prevent this?
-
Thanks for the post, Keri.
Yep, that on-the-fly OCR would make the image approach for hiding them moot.
-
Harder, but certainly not impossible. I had Google Alerts come up on scanned PDF copies of newsletters from the 1980s and 1990s that were images.
The files recently moved and aren't showing up for the query anymore, but I did see something else interesting. When I went to view one of the newsletters (https://docs.google.com/file/d/0B2S0WP3ixBdTVWg3RmFadF91ek0/edit?pli=1), it said "extracting text" for a few moments, then showed a search box where I could search the document. Google was doing OCR work on the fly, and it seemed decently accurate in the couple of tests I ran. There's a whole bunch of these newsletters at http://www.modelwarshipcombat.com/howto.shtml#hullbusters if you want to mess around with it at all.
-
Well, that is how to exclude them from an alert that they set up themselves, but I think they are talking about anyone who might set up an alert that would find the PDFs.
One other idea I had that may help: if you publish the PDFs as images rather than text, it is harder for Google to "read" them and catalog them properly for alerts. But that has roughly the same net effect as not having the PDFs in the index at all.
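For instance, one rough way to flatten a PDF to images is ImageMagick, which rasterizes PDFs through its Ghostscript delegate (the file names here are just placeholders):

    convert -density 150 press-release.pdf -quality 85 press-release-flat.pdf

The output PDF contains only page images with no text layer. Though as noted above, Google can OCR images on the fly, so treat this as a speed bump rather than a wall, and it also costs you accessibility and copy-paste for human readers.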
Danielle, my other question would be: why do they give a crap about Google Alerts specifically? There have been all kinds of issues with the service, and if someone is really interested in finding out info on the company, there are other ways to monitor a website than Google Alerts. I used to use services that simply monitor a page (say, the news release page) and let me know when it is updated; this was often faster than Google Alerts, and I would find things on a page before people who relied only on Google Alerts. I think they are being kind of myopic about the whole approach, and blocking for Google Alerts may not help them as much as they think. Far more people simply search on Google than use Alerts.
-
The easiest thing to do in this situation would be to add negative keywords or advanced operators to your Google Alert so the new pages don't trigger it. You can do this by adding operators that exclude an exact-match phrase, a file type, the client's domain, or just a specific directory. If all the new PDF files will live in the same directory or share a common URL structure, you can exclude them with the -inurl: operator.
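For example, a hypothetical alert query might look like this (the company name and directory are placeholders, and keep in mind Google Alerts supports a narrower, less-documented set of operators than regular web search, so test it before relying on it):

    "Acme Corp" -filetype:pdf -inurl:press-releases

Here -filetype:pdf keeps PDFs out of the alert, and -inurl:press-releases drops any result whose URL contains that directory.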
-
That also presumes Google Alerts is anywhere near accurate. I've had it surface things that have been on the web for years because, for whatever reason, Google thought they were new.
-
That was what I was thinking would have to be done... The reasons they don't want the releases showing up in Alerts are a little complicated. They do want them visible on the web, just not as an alert. I'll let them know they can't have it both ways!
-
Use robots.txt to exclude those files. Note that this keeps them out of web search in general, so they will not show up in regular results either.
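As a minimal sketch, assuming the releases live under a /press-releases/ directory (the path is a placeholder):

    User-agent: *
    Disallow: /press-releases/

One caveat: robots.txt blocks crawling rather than indexing, so a blocked URL can still surface in results as a bare link if other pages link to it. For PDFs, which can't carry a meta robots tag, the more reliable route is an X-Robots-Tag: noindex HTTP header, for example on Apache with mod_headers enabled:

    <FilesMatch "\.pdf$">
    Header set X-Robots-Tag "noindex"
    </FilesMatch>

Either way, the files stay out of regular search results too, not just Alerts.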
You need to ask your client why they are putting things on the web if they do not want them to be found. If they do not want them found, don't put them on the web.