
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Howdy, Ideally every page's title, description, and content would be unique. But when a site is very, very large, that becomes impossible. I don't believe our site can avoid boilerplate content for title tags or meta descriptions. We will, however, mark up the pages with proper microdata so Google can use this information as they please. What I am curious about is boilerplate content repeated throughout the site for the purpose of helping the user, as well as telling Google what the page is about (rankings). For instance, this page and this page offer the same type of services, but in different areas. Both pages (and millions of others) carry the exact same paragraph. The information is helpful to the user, but it's definitely duplicate content; all they've changed is the city name. I'm curious: what makes this obvious duplicate content issue okay? The additional unique content throughout (in the form of different businesses), the small yet obvious differences in on-site content (title tags clearly represent different locations), or just the fact that the site is HUGELY authoritative and gets away with it? I'm very curious to hear your opinions on this practice, potential ways to avoid it, and whether or not it's a passable practice for large but new sites. Thanks!

    | kirmeliux
    0

  • Dear Mozzers, We run an e-commerce website (superstar.dk) selling all different kinds of wristwatches from different brands (Casio, Garmin, Suunto, etc.). We just bought another website selling watches (xxx.com), and we would like to move some of the content from superstar.dk to the new website xxx.com, making superstar.dk into a more niche website. So we are basically taking a brand with all its products, shutting it down on superstar.dk, and launching it on xxx.com instead. Superstar.dk will still be running, just with a more niche product and brand selection. So my question is: should we redirect the old product categories that we are shutting down to the new website on the other TLD where we are opening them again, and the same for the products (e.g. superstar.dk/garmin -> xxx.com/garmin)? Or would it be better to keep the redirects within the same website/TLD (e.g. superstar.dk/garmin -> superstar.dk)? A few examples:
    superstar.dk/garmin -> xxx.com/garmin
    superstar.dk/suunto -> xxx.com/suunto
    etc..
    superstar.dk/product1 -> xxx.com/product1
    superstar.dk/product2 -> xxx.com/product2
    etc.
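    For illustration, a minimal .htaccess sketch of the cross-TLD option, assuming superstar.dk runs Apache with mod_alias (paths and domains as given in the question; Redirect matches by prefix, so deeper URLs under each path carry over too):

    ```
    # On superstar.dk: send the discontinued brand sections to the new site
    Redirect 301 /garmin http://xxx.com/garmin
    Redirect 301 /suunto http://xxx.com/suunto
    Redirect 301 /product1 http://xxx.com/product1
    Redirect 301 /product2 http://xxx.com/product2
    ```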

    | superstardenmark
    0

  • I have been seeing conflicting opinions about how Google would treat links using 'onclick'. For the example provided below: would Google follow this link and pass the appropriate linking metrics (it is internal and points to a deeper level in our visnav)?

    ```
    <div id="navBoxContainer" class="textClass">
      <div id="boxTitle" onclick="location.href='blah.example.com'">
        <div class="boxTitleContent" title="Text Here"><a href="blah.example.com">Text Here</a></div>
      </div>
    </div>
    ```

    A simple yes/no would be alright, but any detail/explanation you could provide would be helpful and very much appreciated. Thank you all for your time and responses.
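    By contrast, a crawlable pattern keeps the real URL in a plain anchor and layers the click behaviour on top; a minimal sketch (blah.example.com stands in for the asker's internal URL):

    ```
    <!-- Googlebot follows the href; the onclick is only progressive enhancement -->
    <div id="boxTitle">
      <a href="http://blah.example.com/deeper-page"
         onclick="location.href=this.href; return false;">Text Here</a>
    </div>
    ```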

    | TLM
    0

  • A relatively new site I'm working on has been hit really hard by Panda, due to over-optimized 301-redirected external links from an old site, which include exact keyword phrases. Prior to the Panda update, all of these 301 redirects worked like a charm, but now all of these 301s from the old URL are killing the new site, because all the hypertext links include exact keyword matches. A couple weeks ago, I took the old site completely down and removed the htaccess file, removing the 301s and in effect breaking all of these bad links. Consequently, if you were to type the old URL, you'd be directed to the domain registrar, not redirected to the new site. My hope is to eliminate most of the bad links, which are mostly on spammy sites that aren't worth being linked from, and my thought is these links would eventually disappear from G. My concern is that this might not work, because G won't re-index these links; once they're indexed by G, they'll be there forever. My fear is leading me to conclude I should hedge my bets and just disavow these sites using the disavow tool in WMT. IMO, the disavow tool is an action of last resort, because I don't want to call attention to myself, since this site doesn't have a manual penalty inflicted on it. Any opinions or advice would be greatly appreciated.

    | alrockn
    0

  • The company I work for has a website www.example.com that ranks very well in English-speaking countries - US, UK, CA. For legal reasons, we now need to create www.example.co.uk to be accessible and rank in google.co.uk. Obviously we want this change to be as smooth as possible, with little effect on rankings in the UK. We have two options that we're talking through at the moment: 1. Use the hreflang tag on both the .com and the .co.uk to tell Google which site to rank in each country. My worry is that we might lose our rankings in the UK, as the .co.uk will be a brand new site with little to no links pointing to it. 2. 301 redirect to the .co.uk based on UK IP addresses. I'm skeptical about this: as a 301 passes most of the link juice, I'm not sure how Google would treat it - would the .com lose ranking? So my questions are: would we lose ranking in the UK if we use option 1? Would option 2 work? What would you do? Any help is appreciated.
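    A minimal sketch of option 1's annotations, assuming each site serves an equivalent English-language homepage (URLs as given in the question):

    ```
    <!-- Placed in the <head> of both http://www.example.com/ and http://www.example.co.uk/ -->
    <link rel="alternate" hreflang="en-us" href="http://www.example.com/" />
    <link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
    <link rel="alternate" hreflang="x-default" href="http://www.example.com/" />
    ```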

    | awestwood
    0

  • So the general view on satellite sites is that they're not worth it because of their low authority and the small amount of link juice they provide. However, I have an idea that is slightly different to the standard satellite site model. A client's website is in a particular niche, but a lot of websites that I have identified for potential links are not interested because the client is a private commercial company. Many are only interested in linking to charities or simple resource pages. I created a resource section on the website, but many are still unwilling to link to it as it is still part of a commercial website. The website is performing well and is banging on the door of page one for some really competitive keywords; a few more links would make a massive difference. One idea I have is to create a standalone resource website that links to our client's website. It would be easy to get links to this from sites that would flat out refuse to link to the main website. This would increase the authority of the resource and result in more link juice for the primary website. Now, I know that the link juice from this website will not be as good as links pointing directly to the primary website, but would it still be a good idea? Or would my time be better spent trying to get a handful of links directly to the client's website? Alternatively, I could set up the resource on a sub-domain, but I'm not sure that this would be as successful.

    | maxweb
    0

  • Looking for recommendations for a reliable & experienced contractor to help with a link cleanup project.  We've identified the problem links, we just need someone to assist with the actual outreach.  Would appreciate any suggestions.

    | MattBarker
    0

  • I have two sites: Site A and Site B. Both sites are hosted on the same IP address and server, using IIS 7.5. Site B has an SSL cert, and Site A does not. It has recently been brought to my attention that when requesting the HTTPS version of Site A (the site w/o an SSL cert), IIS will serve Site B... Our server has been configured this way for roughly a year. We don't do any promotion of Site A using HTTPS URLs, though I suppose somebody could accidentally link to or type in HTTPS and get the wrong website. Until we can upgrade to IIS 8 / Windows Server 2012 to support SNI, it seems I have two reasonable options: 1. Move Site B over to its own dedicated IP, and let HTTPS requests for Site A 404. 2. Get another certificate for Site A, and have its HTTPS version 301 redirect to HTTP/non-SSL. #1 seems preferable, as we don't really need an SSL cert for Site A, and HTTPS doesn't really have any SEO benefits over HTTP/non-SSL. However, I'm concerned whether we've done any SEO damage to Site A by letting our configuration sit this way for so long. I could see Googlebot trying HTTPS versions of websites to test if they exist, even if there aren't any SSL/HTTPS links for the given domain in the wild... in which case, option #2 would seem to mostly reverse any damage done (if any). Though Site A seems to be indexed fine; no concerns other than my gut. Does anybody have any recommendations? Thanks!
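    If option #2 is chosen, a minimal web.config sketch using the IIS URL Rewrite module (this fragment goes inside <system.webServer>; it assumes the module is installed, a certificate is bound for Site A, and the hostname shown is hypothetical):

    ```
    <rewrite>
      <rules>
        <!-- 301 any HTTPS request back to the canonical non-SSL host -->
        <rule name="https-to-http" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTPS}" pattern="on" />
          </conditions>
          <action type="Redirect" url="http://www.site-a.example/{R:1}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
    ```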

    | dsbud
    0

  • Hello Mozzers, What values are accepted in the URL parameters field in Webmaster Tools? I want to block URLs with + in them, but the parameters field does not seem to accept + as a valid value. Thanks!

    | MozAddict
    0

  • Howdy Moz, Just received a message in Google Webmaster Tools about a CMS update: "Joomla Update Available. As of the last crawl of your website, you appear to be running Joomla 1.5. One or more of the URLs found were: http://www.website/custom-url/article5034 Google recommends that you update to the latest release. Older or unpatched software may be vulnerable to hacking or malware that can hurt your users. To download the latest release, visit the Joomla download page. If you have already updated to the latest version of Joomla, please disregard this message. If you have any additional questions about why you are receiving this message, Google has provided more background information in a blog post about this subject." I read through the associated blog post; according to it, Joomla creates a generator meta tag that notes the CMS version. Here's the oddity: the site was on Joomla 1.5 over 2 years ago. A year ago it was updated to Joomla 2.5. About a week ago it was converted completely to Wordpress. According to GWT, the last date the Google bot accessed the site was the day before the email (5/1/14). I went through the code, CSS/HTML, and the database and found no reference to Joomla 1.5. Has anyone seen this message? If so, how did you rectify it? Were there any adverse effects on rankings?

    | AaronHenry
    0

  • The question is simple, but I don't understand the answer. I found a webpage that was linking to my personal site. The page was indexed in Google. However, there was no cache option, and I received a 404 from Google when I tried using cache:www.thewebpage.com/link/. What exactly does this mean? Also, does it have any negative implications for the SEO value of the link that points to my personal website?

    | mRELEVANCE
    0

  • OK, I created breadcrumbs on my site long ago. They pop up every time, which is what I'm concerned about. I wanted them to appear for searches that lead to my homepage, but the breadcrumbs show up for every product and every page on my site. My question is: does this hurt the link part of my search listings? As you know, search engines show a title, a link, and a description. Will my link not be counted, or is the breadcrumb purely cosmetic? I just want to know if I'm limiting my rank. For a product that should show its own breadcrumb, type "Bia Brazil BT3341 BKS-Sexy" into Google to see an example; I should be number one under the ads. The code that creates my breadcrumbs produced these links: Tops, Shorts, Best Fit By Brazil, Capris, Bia Brazil, Sexy Leggings, Sexy Workout Clothes, 15% OFF. I REMOVED THE CODE FROM THE WEBSITE FOR NOW UNTIL I FIGURE THIS OUT. HOPEFULLY SOMEONE ANSWERS BEFORE the search results change, thx.
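    For reference, a minimal sketch of breadcrumb markup scoped to a single product page, in schema.org JSON-LD form (the store URL and category names are hypothetical):

    ```
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "BreadcrumbList",
      "itemListElement": [
        { "@type": "ListItem", "position": 1, "name": "Leggings",
          "item": "http://www.example-store.com/leggings" },
        { "@type": "ListItem", "position": 2, "name": "Bia Brazil BT3341",
          "item": "http://www.example-store.com/leggings/bia-brazil-bt3341" }
      ]
    }
    </script>
    ```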

    | realmccoy101
    0

  • A company I work for has two numbers... one for the standard call centre and one for tracking SEO. If local citation/business directory listings have the same address but different numbers, will this affect local or other SEO results? Any help is greatly appreciated! 🙂

    | geniusenergyltd
    0

  • I have a wordpress.com blog and have recently set up Google authorship (at least I think I have). If I add a new post, what happens to the old posts in terms of authorship? Is the solution opening a new page for each article? If so, does the "contributor to" link in Google+ pick up all pages if you only list the home link? Many thanks

    | harddaysgrind
    1

  • Hi all, and thanks for your help in advance. I've been asked to take a look at a site, http://www.yourdairygold.ie, as it currently does not appear for its brand name, Your Dairygold, on Google Ireland, even though it's been live for a few months now. I've checked all the usual issues such as robots.txt (doesn't have one) and the robots meta tag (doesn't have them). The even stranger thing is that the site does rank on Yahoo! and Bing. Google Webmaster Tools shows that Googlebot is crawling around 150 pages a day, but the total number of pages indexed is zero. It does appear if you carry out a site: search on Google, however. The site is very poorly optimised in terms of title tags, unnecessary redirects etc., which I'm working on now, but I wondered if you guys had any further insights. Thanks again for your help.

    | iProspect-Ireland
    0

  • Does anyone have recommendations for any particular site search for large e-commerce sites based on Magento? Some (hopeful) requirements:
    Possibility to segment product pages and blog content on the results page
    Doesn't cause any major SEO or technical issues
    Understands semantic search
    Ability to filter results
    Ability to sort (e.g. by price, popularity, new in stock)
    It'd be really useful to see examples and know if there are any particular issues to be aware of. Thanks. 🙂

    | Alex-Harford
    0

  • Hi All, I seem to be losing a 'firefighting' battle with regards to various errors being reported on the Moz crawl report, relating to:
    Duplicate Page Content
    Missing Page Title
    Missing Meta
    Duplicate Page Title
    While I acknowledge that some of the errors are valid (and we are working through them), I find some of them difficult to understand... Here is an example of a 'duplicate page content' error being reported: http://www.bolsovercruiseclub.com (which is obviously our homepage) is reported to have duplicate page content compared with the following pages:
    http://www.bolsovercruiseclub.com/guides/gratuities
    http://www.bolsovercruiseclub.com/cruise-deals/cruise-line-deals/holland-america-2014-offers/?order_by=brochure_lead_difference
    http://www.bolsovercruiseclub.com/about-us/meet-the-team/craig
    All 3 of those pages are completely different, hence my confusion... This is just a solitary example; there are many more! I would be most interested to hear people's opinions... Many thanks, Andy

    | TomKing
    0

  • I have an odd page URL, generated by a link from an external website. After a .jpg image URL it has:
    %5Cu0026size=27.4KB%5Cu0026p=dell%20printers%20uk%5Cu0026oid=333302b6be58eaa914fbc7de45b23926%5Cu0026ni=21%5Cu0026no=24%5Cu0026tab=organic%5Cu0026sigi=11p3eqh65%5Cu0026tt=Dell%205210n%20A4%20Mono%20Laser%20Printer%20from%20Printer%20Experts%5Cu0026u=fb
    and I can't get it to redirect to the proper image URL using Redirect 301 in .htaccess, as I usually do with other not-found URLs. E.g. from:
    /15985.jpg%5Cu0026size=27.4KB%5Cu0026p=dell%20printers%20uk%5Cu0026oid=333302b6be58eaa914fbc7de45b23926%5Cu0026ni=21%5Cu0026no=24%5Cu0026tab=organic%5Cu0026sigi=11p3eqh65%5Cu0026tt=Dell%205210n%20A4%20Mono%20Laser%20Printer%20from%20Printer%20Experts%5Cu0026u=fb
    to just:
    /15985.jpg
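    A minimal .htaccess sketch, under the assumption that everything from the first escaped \u0026 sequence onward can be discarded. Apache decodes %5C to a backslash before RewriteRule matching, so mod_rewrite (rather than mod_alias's literal-prefix Redirect) can pattern-match the decoded path (untested against the asker's server):

    ```
    RewriteEngine On
    # Strip the "\u0026..." tail and 301 to the bare image path
    RewriteRule ^(.+\.jpg)\\u0026.* /$1 [R=301,L]
    ```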

    | Status
    0

  • Hi there, I apologize if I'm too vague, but this is a tough issue to describe without divulging too much of our project. I'm working on a new project which will provide information results in sets of 3. Let's say someone wants to find 3 titles that match their criteria, either through an organic search which leads them to us, or through their internal search on our site. For instance, if they're looking for classic movies involving monsters, we might display Frankenstein, Dracula, and The Mummy. We'd list unique descriptions of the movies and include lots of other useful information. However, there are obviously many more monster movies than those 3, so when a user refreshes the page or accesses it again, a different set of results shows up. For this example, assume we have 5 results to choose from, so it's likely Google will index different results shuffled around. I'm worried about this causing problems down the line with ranking. The meat and potatoes of the page content are the descriptions and information on the movies. If these are constantly changing, I'm afraid the page will look "unstable" to Google, since we have no real static content beyond a header and title tag. Can anyone offer any insight into this? Thanks!

    | kirmeliux
    0

  • I've found a lot of mixed info on this topic, so I thought I'd ask the experts (the Moz community). If I'm adding tracking parameters to URLs to monitor organic traffic, will this affect the rank/value of the original clean URL? If so, would best practice be to 301 redirect the tracked URL to the original,
    i.e. redirect www.example.com/category/?DZID=Organic_G_NP/SQ&utm_source=Organic&utm_medium=Google to www.example.com/category? Thanks for your help!
    -Reed
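    For reference: a server-side 301 would strip the parameters before on-page analytics could record them, so the usual pattern is a rel=canonical on the tracked URL instead; a minimal sketch using the asker's example URLs:

    ```
    <!-- Served on /category/?DZID=Organic_G_NP/SQ&utm_source=Organic&utm_medium=Google -->
    <link rel="canonical" href="http://www.example.com/category/" />
    ```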

    | IceIcebaby
    0

  • Just started some work for a new client. I created a Google+ page and a connected YouTube page, then proceeded to claim a listing for them on Google Places for Business, which automatically created another Google+ page for the business listing. What do I do in this situation? Do I delete the YouTube page and Google+ page that I originally made and then recreate them using the Google+ page that was automatically created, or do I just keep both pages going? If the latter, do I use the same information to populate both pages and post the same content to both? That doesn't seem efficient or like the right way to go about handling this, but I could be wrong.

    | goldbergweismancairo
    0

  • I've come across various places that give statistics like "Video search results have a higher click-through rate than plain text results" and
    "Video is 50 times more likely to get organic page rankings in Google than plain text results". How true are these, and does anyone have a definitive guide to video SEO?

    | Gordon_Hall
    0

  • I own a website (domain.com) and used the subdomain "dev.domain.com" while adding a new section to the site (as a development link). I forgot to block dev.domain.com in my robots file, and Google indexed all of the dev pages (around 100 of them). I blocked the site (dev.domain.com) in robots, and then proceeded to just delete the entire subdomain altogether. It's been about a week now and I still see the subdomain pages indexed in Google. How do I get these pages removed from Google? Are they causing duplicate content/title issues, or does Google know that it's a development subdomain and is it just taking time for them to recognize that I deleted it already?

    | WebServiceConsulting.com
    0

  • Hi Guys, I have a website that received an unnatural links message. We started the link removal process, disavowed the links whose webmasters we couldn't reach, and filed reconsideration requests four times, but every time Google sent some samples of unnatural links and rejected our reconsideration. Last week we again disavowed the unreachable links and planned to file another reconsideration request after a week, but today when I tried to file it, the manual action message had disappeared. We haven't received any message from Google. The same thing happened with one more of our sites earlier. Does the manual action disappearing mean it has been revoked, or is it something else?

    | RuchiPardal
    0

  • We've implemented BazaarVoice with the latest Cloud SEO. As an eComm site, BV helps us manage our own reviews along with curating reviews from vendors on product pages that don't have any. A maximum of 7 reviews are displayed at one time, and any additional ones are on a "next" page. BV has asked to include a query string (?bvrrp=...) in our canonical tags that would allow SEs to read the additional reviews. For example, the current canonical URL would go from http://www.sitename.com/item/product-name/123456789 to http://www.sitename.com/item/product-name/123456789?bvrrp=Main_Site/reviews/product/2/123456789.htm. Having more crawlable UGC is advantageous, but I'm skeptical about adding this. Just looking for any guidance. Thanks! WMCA

    | WMCA
    0

  • Hello 🙂 Please can someone tell me how I can contact other websites related to my site and ask them to write an article or similar about my content, link to me, or quote me in their articles? How do I take my content to these other blogs and sites and show it off to them? Thank you

    | Ivek99
    0

  • Hello Moz community! We are building out a site for a web hosting/web design company. I am wondering if we should just have home/categories/pages, or if we should have home/categories/sub-categories/pages. I am not sure if by adding the additional level we can create a bunch of mini-hubs within the categories. For example: Home/Web hosting/Business Web Hosting/Small Business Web Hosting. I don't know if these mini-hubs within the category are a good idea, or if I should keep it as flat as possible? Any thoughts on this?

    | YouAndWhatArmy
    0

  • We have been getting the same response from Google after several reconsideration requests. THE SITUATION:
    Our site displays 3 distinct product lines. We separated each product line by use of SUB-DOMAINS. All 3 product lines are integrated as part of the main NAVBAR. The product list pages run off the sub-domains; however, product detail pages run off the MAIN-DOMAIN. GWT:
    Google has taken manual action because MAIN-DOMAIN.COM links to PRODUCT-A.DOMAIN.COM on every single page. I attempted several times to explain, but without success. It's only one SUB-DOMAIN causing a problem; the other 2 SUB-DOMAINS are set up the exact same way without issue. This week, we simply added a NOFOLLOW on the link to the SUB-DOMAIN causing the issue; we will see if this helps. Anyone else ever experienced this?
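    For reference, the nofollow change described would look something like this (hostnames follow the asker's placeholders):

    ```
    <a href="http://product-a.domain.com/" rel="nofollow">Product A</a>
    ```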

    | SpadaMan
    0

  • I'm working on an ecommerce website that has a few snags and issues with its coding. It's using HTTPS, and when you access the website through domain.com, there's a 301 redirect to http://www.domain.com, which in turn is redirected to https://www.domain.com. Would this have a detrimental effect, or is that considered the best way to do it - have the website redirect to HTTP, and then redirect all HTTP access to the HTTPS URL? Thanks
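    A single-hop alternative in .htaccess, assuming Apache with mod_rewrite (domain.com is the asker's placeholder); every non-canonical variant goes straight to https://www in one redirect:

    ```
    RewriteEngine On
    # Anything that is not already https://www.domain.com gets a single 301 to it
    RewriteCond %{HTTPS} off [OR]
    RewriteCond %{HTTP_HOST} !^www\. [NC]
    RewriteRule ^(.*)$ https://www.domain.com/$1 [R=301,L]
    ```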

    | jasondexter
    0

  • Since April 16 (Passover, when Jews ate matzah), Google has hurt one of our clients badly. They are a well-known and beloved brand with hundreds of employees and locations across the USA.
    I can't see any signal of an organic update or penalty (nor in Google Places). No message in GWT. Nothing has been changed on or off site. All keyword rankings look like this. All tools show good analysis: Moz, Barracuda, MajesticSEO. Content is good and not duplicated, etc. Are any of you aware of a significant Google update?
    What do you think/suggest?

    | Elchanan
    0

  • Hello everyone. We are a furniture company selling sofas, mattresses, outdoor furniture, and many BBQs in the future - separate things, but all related in a way. I was thinking it would make us look like an 'authority' to have a separate website for each category, and be more specialised and also look more specialised. What would be better for SEO? Also (sneaking in a second question!), I have around 50 sofa designs - is it OK if the meta description is the same for each one, or should I change a word or 2 around? Many thanks!!

    | cowhidesdirect
    0

  • Hi Moz crew, We've got a bit of a riddle on our hands here at Flightpath. You see, we're an agency that specializes in digital services like web design, social media and SEO. Unfortunately, we seem to have been hit by an algorithmic penalty on February 28th, 2014. This is a first for us - we've never had to deal with a penalty (manual or algorithmic) for our site or any of our client sites. Here's the situation: We were averaging around 1,500 impressions per day before the drop. Since 2/28, we see closer to 250 impressions per day. No manual action notice in WMT. Branded keywords did not lose rank. It was primarily the service-oriented keywords that we lost rank on (ex: "digital agency", "digital agency nyc", "social media agency nyc", "web agency new york" - we were page 1 for all of these, though "digital agency" wasn't as secure as the others). Backlink profile looks OK. We did a clean-up (disavowed a few hundred domains) as soon as we noticed the drop, but there wasn't anything in there egregiously offensive. There definitely wasn't anything NEW that was problematic. Not a lot of non-branded anchor text at all. No major changes to the site in 2014. Any ideas? The site is http://www.flightpath.com And here's a horrifying WMT screen grab: i.imgur.com/EY4OBG1.jpg UPDATE: We recovered nearly all of our missing rank/traffic/impressions for a 3-day period between 4/15 and 4/17. WMT screenshot: http://imgur.com/V1fI1MQ During our brief recovery, we did lose a small amount of rank (just a few positions, only for a handful of keywords) compared to where we were pre-crisis. That makes sense though; we were pretty ruthless in disavowing domains and almost surely caught a few "positive" links along with the bad ones. Aside from that, it appeared to be a full recovery - every single one of our generic keywords was back for just over 48 hours. Any ideas? Was Google rolling out a new algorithm tweak, only to pull it back due to bugs? Or was it the opposite: Google rolling back the update that hurt our site to fix a few bugs before pushing it live again?

    | f1_path
    0

  • Hey Guys, I recently used the URL parameter tool in WMT to mark different URLs that offer the same content. I have the parameters "?source=site1", "?source=site2", etc. It looks like this: www.example.com/article/12?source=site1. The source parameters are feeds that we provide to partner sites, so we can track the referring site with our internal analytics platform. Although pages like www.example.com/article/12?source=site1 have a canonical to the original page www.example.com/article/12, Google indexed both URLs:
    www.example.com/article/12?source=site1 and www.example.com/article/12. Last week I used the URL parameter tool to mark the "source" parameter "No, this parameter doesn't affect page content (tracks usage)", and today I see a 40% decrease in my crawl stats. On the one hand, it makes sense that Google is no longer crawling the repeated URLs with different sources; on the other hand, I thought that efficient crawlability would increase my crawl stats. In addition, Google is still indexing the same pages with different source parameters. I would like to know if someone has experienced something similar: by increasing crawl efficiency, should I expect my crawl stats to go up or down? I really appreciate all the help! Thanks!

    | Mr.bfz
    0

  • Hello, We want to increase the downloads of and spread the word about this iPhone app: https://itunes.apple.com/us/app/dynamic-spin-release/id434567327?mt=8 What are your suggestions? Here are our two main websites in this case: nlpca.com and dynamicSpinRelease.com. There's also a promotion in the top right of this page: http://www.nlpca.com/DCweb/dynamicspinrelease6.html Thanks!

    | BobGW
    0

  • Moz - Open Site Explorer, using the following setup:
    Tab: Inbound Links
    Show: "all"
    From: "Only Internal"
    I have run a number of random tests and have noticed the following results in the link anchor text:
    [No Anchor Text]
    company name
    website url
    Home
    etc.
    What is the best practice and naming convention to be used? Regards, Mark

    | Mark_Ch
    0

  • Dear Friends, One of my customers was hit by Panda. We were working on improving the thin content on several pages, and the remaining pages were: 1. NOINDEX/FOLLOW, 2. removed from sitemap.xml, 3. unlinked from the site (no page on the site links to the poor content). In conclusion, we can't see any improvement. My question is: should I remove the poor content pages (404)? What is your recommendation? Thank you for your time, Claudio
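    For reference, a minimal sketch of the noindex/follow directive described, placed in the <head> of each thin page:

    ```
    <meta name="robots" content="noindex, follow">
    ```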

    | SharewarePros
    0

  • Hi, I see that many good websites have backlinks from very good blogs/sites which are related. What I noticed is that everyone uses their real name or a generic name in comments; they do not use a keyword for the name. So later they get backlinks with their names as the anchor text... So, my question is: is this a good technique? Would my website get any benefit from these backlinks? And with such a technique, is it enough just to leave your real name, or may I periodically put a keyword as the name? Thank you

    | Ivek99
    0

  • Hello, I manually deleted certain backlinks a few months ago; they no longer exist. But even today, Google Webmaster Tools and the Moz backlink tool show these backlinks. Why? What do I need to do to remove them completely? Thank you 🙂

    | Ivek99
    0

  • Hello, I am curious whether all of these new domain extensions are worth it. Say you are a home builder and you bought homebuilder.construction, where .construction is a new extension - does this help SEO? Or is it all just a big sales gimmick? Thank you for your thoughts

    | Berner
    1

  • This is a site with handcrafted home décor products. Is it better SEO to have 1 product for each spoon style and use attributes to show the display options? The decorative display details are important - how do you deal with the important details of the display options? One set of products is measuring spoon sets. For each spoon there are 4 different display options: wood or pewter, and hang on a tall post or hang on a wide wall hanger. And there are 8 different spoon styles. The pewter displays have design details matched to the spoon style (e.g. the fish spoon has fishes on its hanger, the fleur-de-lys has fleur-de-lys on it). So the attributes do not change the product itself, but instead change the look of how it is displayed. At present each display option/spoon style combination is shown as a separate item; that makes 32 different products on the category page. What do you feel is the best way to show the style details while consolidating pages for better SEO? Best wishes from Vermont, Handcrafter

    | stephenfishman
    0

  • I want to make a new website. Can you please advise me on everything I should keep in mind before and during the website's preparation - how to make pages, what to include on the website, the best way to create pages, etc.? Please provide links where I can study all of the above. I am planning to create a global printing website.

    | AlexanderWhite
    0

  • Howdy Moz, Still kind of lost on review markup and the best way to get it implemented on a Wordpress site. Any suggestions for a good tutorial that walks you through the process?
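    For orientation, a minimal sketch of review markup in schema.org JSON-LD form (the product name and rating values are hypothetical; in Wordpress this could go in the single-product template or be added via a plugin):

    ```
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Example Widget",
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.4",
        "reviewCount": "89"
      }
    }
    </script>
    ```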

    | AaronHenry
    0

  • Schemas look like an important thing when it comes to structuring your website and ensuring the crawl bots get all the details. I've been reading a lot of articles around the web, and most of them say that schemas are important, yet very few websites are using them. Why so? Are the schemas on schema.org there to stay, or am I wasting my time?
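    For reference, a minimal schema.org sketch in microdata form (the organization name and URL are hypothetical):

    ```
    <div itemscope itemtype="http://schema.org/Organization">
      <span itemprop="name">Example Co</span>
      <a itemprop="url" href="http://www.example.com/">example.com</a>
    </div>
    ```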

    | Shreyans92
    0

  • Hi, back in June 2013 our company received a notice of unnatural links, which resulted in 'a manual spam action' from Google. A reconsideration request was filed a week later, which received the following response from Google: 'We reviewed your site and found no manual actions by the webspam team that might affect your site's ranking in Google. There's no need to file a reconsideration request for your site, because any ranking issues you may be experiencing are not related to a manual action taken by the webspam team.' Naturally we are confused by what seems to be an error in Google's communication. We are also left questioning whether it was necessary to remove the links Google stated were unnatural. Since the notice was received, we have struggled to recover traffic, even after implementing Google best practices. Some clarity on the issue would be greatly appreciated. My URL is: www.homefurnitureland.co.uk

    | users_engaged
    0

  • If my URL structure is the same for the desktop and mobile experience, is there any benefit to creating a mobile sitemap, considering that the sitemap for our desktop site covers the same URLs?
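    For context, Google's legacy mobile sitemap format just adds a mobile namespace and an empty <mobile:mobile/> annotation per URL (a sketch with a hypothetical URL). It was designed for feature-phone content, so a site serving the same URLs to desktop and smartphones generally gains nothing from it:

    ```
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
      <url>
        <loc>http://www.example.com/page</loc>
        <mobile:mobile/>
      </url>
    </urlset>
    ```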

    | edmundsseo
    0

  • Hello, A business-owner and design decision was made to place a summary sentence/paragraph prominently on each published article page, with a unique font treatment, in the article header along with the article's main imagery. Historical content that does not have this summary was migrated with the first sentence of the article used as this introduction/summary. In both cases - where there is a unique summary and where the first sentence is used - the article text begins below a graphical element under the summary element. Thus, when the first sentence is used for the summary, the first sentence repeats, in relatively close proximity, on each page where this happens. The question is: how much risk would I be taking on in allowing the first sentence of these articles to be repeated in close proximity on the page? I wanted to get some other perspectives on this unique situation. Thanks,

    | JennyTTGT
    0

  • Hi All, I have attached a simple website model.
    Page A is the home page, attracting 1000 visitors per month.
    One click away is Page B with 400 visitors per month, and so on and so forth. You get an idea of the flow and the clicks required to get to various pages. I have purposely placed Pages E-G 3 clicks away, as they yield very little traffic. 1] Is this the best way to distribute link juice?
    2] Should I point Pages C + D back to Page A to influence its Page Authority (PA)? Any other useful advice would be appreciated. Thanks, Mark

    | Mark_Ch
    0

  • So, my client is thinking of purchasing several new-gTLD domains with second-level keywords important to us - stuff like the list below. We don't want the .popsicles registry itself, just domains with the second-level keyword; those cost anywhere from $20-30 right now:
    grape.popsicles
    cherry.popsicles
    rocket.popsicles
    companyname.popsicles
    The thinking is that it's best to be defensive: don't let a competitor get the domain with our name in it (agreed), and don't let them capitalize on a keyword-rich domain (hmm). The theory was that we or a competitor could buy such a domain and redirect it to our relevant page for, say, cherry popsicles. They wonder whether that would help the page rank well - and sort of work in lieu of AdWords for pages that are not ranking well. I don't think this will work. A redirected page shouldn't rank better than the page it points to... unless Google gave it points for an exact match in the URL. Do you think it will - does Google grade any part of a URL that redirects? Viewing this video from Matt Cutts, I surmise that such a domain would be ranked like any other page: if its content, inbound links, etc. support a high DA, well, OK then, you get graded like every domain. In the case of a redirect, the page would not be indexed as a standalone, so that is a moot point, right? So any competitor buying one of these domains with the hope of ranking well against us would have to build up PageRank in that new domain... and for our purposes I see that being hugely difficult for anyone - even us. Still, a defensive purchase of some of these might not be a bad idea, since it's a fairly low-cost investment. Other thoughts?

    | Jen_Floyd
    0
