Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.
Removing Media from WordPress
-
I've run the SEOmoz on-page report and found an interesting issue.
I'm using WordPress, and it seems that every picture I add to my articles gets added as a separate page on the site.
I'm having to go to each and every picture and add a meta tag and description to it, and I still get duplicate content issues with them.
In my Disqus system, the same pictures get added and look just like a page or article would.
What can I do to avoid this?
-
I actually never learned to enjoy coffee, so it's no sacrifice at all. But you'll have to pry my Earl Grey from my cold, dead hands.
-
Ha! I'm trying to quit. Again.
-
Great that you got what you were looking for, Alex. It looks like you got a bunch of useful info from the responses. It would be great if you could mark the replies you found to be "good answers", both to reward the authors and to help future readers of the question.
And I had a mug of tea in your honour - not going near that coffee stuff.
Paul
-
You're welcome, Alex! I'm having a coffee right now.
Thanks so much to Paul and Brett, too!
-
I don't know what to say. Sometimes there are easy solutions to seemingly impossible problems.
I can't say how grateful I am to you guys and the SEOmoz community. You have saved me from having to deal with about 10,000 pages that had totally stumped me.
Have a coffee latte on me, and thank you.
Alex
-
Hi Alex
What both Brett and Paul said is correct. Let me break it down for you a little more.
When Adding Media
- When you add your image, DON'T select "Attachment Page" (see screenshot).
Here's what each option does:
- Custom URL - you can link the image to anything you want: an external link, another internal post, or the source of the image itself if you need to attribute credit.
- Media File - if you click on the image, it just opens the actual image file. This is totally harmless, and in many cases people actually like it, as it lets you see the image at a larger size. This is the one I usually choose by default.
- Attachment Page - that's the one giving you problems. It creates a new page for every image, so I never use this choice.
- None - no link is used at all. You can't click on the picture at all.
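As an aside, WordPress keeps the uploader's default "Link To" choice in the image_default_link_type option, so you can force new uploads to default to "None" (or "Media File") in code. A minimal sketch, assuming it lives in the active theme's functions.php - the function name is just a placeholder:

<?php
// Minimal sketch - assumes it is added to the active theme's functions.php.
// Forces the media uploader's "Link To" default to "None" so newly inserted
// images don't link to their attachment pages.
function myprefix_default_image_link_to_none() {
    if ( 'none' !== get_option( 'image_default_link_type' ) ) {
        update_option( 'image_default_link_type', 'none' );
    }
}
add_action( 'admin_init', 'myprefix_default_image_link_to_none' );

Editors can still pick a different "Link To" per image; this only changes the starting value in the dialog.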
Noindex Media
With the Yoast SEO plugin, you can also noindex media pages. See this screenshot for how to do so.
That means that even if you still end up with some media/photo pages indexed and missing titles etc., it won't be an issue.
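If you'd rather not depend on a plugin for the noindex part, the same effect can be approximated with a small hook - a minimal sketch, again assuming the theme's functions.php and that nothing else is already printing a robots meta tag; the function name is just a placeholder:

<?php
// Minimal sketch - assumes the theme's functions.php and that no SEO plugin is
// already emitting a robots meta tag. Marks attachment (media) pages as noindex.
function myprefix_noindex_attachment_pages() {
    if ( is_attachment() ) {
        echo '<meta name="robots" content="noindex, follow" />' . "\n";
    }
}
add_action( 'wp_head', 'myprefix_noindex_attachment_pages' );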
Recap
OK - here are your steps:
- Moving forward, don't select "Attachment Page" when uploading media.
- Use Yoast to noindex media pages.
- Use Yoast to add a title and description template to media pages, in case any still get indexed.
Hope that helps!
-Dan
-
This situation is quite easy to resolve if you are using the Yoast WordPress SEO plugin.
Under the plugin's settings, go to the Permalink Settings page. The second section down offers the option to "Redirect attachment URLs to parent post URL." Put a checkmark in the box, click Save, and from now on, those individual image-only pages will automatically redirect back to the post on which they appear.
No idea if the same option is available in other SEO plugins - I've standardised on Yoast's plugin for so long that I'm no longer up to date on the others.
Brett's point about changing where images link will also work, but the plugin approach above means you don't have to go back and retroactively edit all your images.
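If you're not running Yoast at all, the same redirect behaviour can be approximated with a small template_redirect hook - a minimal sketch, assuming it lives in the theme's functions.php and that nothing else is already handling attachment URLs; the function name is just a placeholder:

<?php
// Minimal sketch - assumes the theme's functions.php and that no plugin is
// already redirecting attachment URLs. Sends attachment-page requests back to
// the parent post, or to the home page if the attachment has no parent.
function myprefix_redirect_attachments_to_parent() {
    if ( ! is_attachment() ) {
        return;
    }
    $parent_id = wp_get_post_parent_id( get_queried_object_id() );
    $target    = $parent_id ? get_permalink( $parent_id ) : home_url( '/' );
    wp_safe_redirect( $target, 301 );
    exit;
}
add_action( 'template_redirect', 'myprefix_redirect_attachments_to_parent' );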
Hope that helps?
Paul
-
I think that when you are adding images to your site, you are setting them to "Link To: Attachment Page" rather than "Media File" or "None". If you go back and edit your old images and make sure they link to the media file or have no link at all, I think you should be fine. Hope this helps!