
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi guys, I've been working on a site for quite a while and it has a really good link profile, excellent content, and no errors or penalties (as far as I can tell), but for some reason it consistently ranks below a lot of thin, poor-quality websites with spammy EMDs and a few obviously paid links from old-skool business directories etc. It has a significantly higher DA and more linking root domains than almost all of them. It also just bounces around from #40 to #28 to #35 to #40 to #28 on a weekly basis for many of our primary keywords. There just seems to be no logic to this, and it goes against everything I know and everything we're taught. (I should probably point out that I've been doing this quite a while and have a number of other sites ranking extremely well in quite a few different verticals.) Has anyone ever experienced anything like this, and what did you do? Before I throw in the towel it would be good to hear from others and try to understand why this happens and whether there is anything else I can try to help my client and fix it. Many thanks in advance.

    | Blaze-Communication
    0

  • I've been in the process of creating a tourism-based website for the state of Kansas. I'm a photographer for the state, and have built a nice little side income alongside my day job as a web designer by selling prints from Kansas (along with my travels elsewhere). I'm still in the process of developing it, but it's at least at a point where I need to really start thinking about the SEO impact of the number of backlinks going from it back to my main photography website. The Kansas site is at http://www.kansasisbeautiful.com and my photography website is http://www.mickeyshannon.com. This tourism website will serve a number of purposes: To promote the state and show people it's not just a flat, boring place. To help promote my photography. The entire site is powered by my photography. To sell a book I'm planning to publish later this year/early next year of Kansas images. To help increase sales of photography prints of my work. What I'm worried about is the number of backlinks I have going from the Kansas site to my photography site. Not to mention every image is hosted on my photography domain (no need to upload to two domains when one can serve the same purpose). I'm currently linking back to my site on most pages via a little "Like the photos? Buy a print" link in the top right corner. In addition, when users get to the website's map, all photo listings click back to a page on my photography site where they can purchase prints. And the main navigation also has a link for "Photos" that takes them to my Kansas photo galleries on my photography website as well. The question I have: Is it really bad SEO-wise to have anywhere from 1 to 10+ backlinks on every page from one domain (kansasisbeautiful.com) linking back to mickeyshannon.com? Would I be better served moving all of the content from kansasisbeautiful into a subdirectory on my photography site (mickeyshannon.com/kansas/) and redirecting the entire domain there? I haven't actually launched this website yet, so I'm trying to make the right call before pushing it to the public. Any advice would be appreciated!

    | msphoto
    0

  • I recently swapped my domain from www.davescomputers.com to www.computer-help.com. Originally www.computer-help.com was 301 redirecting to www.davescomputers.com... however, my long-term goal is to eventually rebrand my business, so I decided to utilize the other domain by swapping the main domain. Is consistent blogging the best way to get Google to re-index the entire website? My focus has been on quality posts and sharing them with various social profiles I created.

    | DavidMolnar
    0

  • I am considering redirecting my domain name from www.nyc-officespace-leader.com to www.metro-manhattan.com. My company name is Metro Manhattan Office Space, Inc., so the new domain will be more consistent with our identity. The Metro domain was registered with GoDaddy five years ago but has only been used for email and for forwarding (entering www.metro-manhattan.com will forward visitors to www.nyc-officespace-leader.com). What is the likelihood that redirecting to the metro-manhattan.com domain will result in a drop in traffic and ranking? I asked this question a year ago and the answers were mixed. But one year is an eternity for Google. I am hoping that redirects work better now, and that if this is implemented correctly there will be no ranking/traffic/domain authority loss. Thoughts? Thanks, Alan

    | Kingalan1
    0

  • I was on a site today, and as I scrolled down and viewed the other posts that were below the top one I read, I noticed that each post below the top one had its own unique URL. I have not seen this before and was curious whether this method of infinite scrolling is SEO friendly. Will Google's spiders scroll down and index these posts below the top one? The URLs of these lower posts, by the way, were the same URLs that would be seen if I clicked on each of these posts. Looking at Google's preferred method for infinite scrolling, they recommend something different: https://webmasters.googleblog.com/2014/02/infinite-scroll-search-friendly.html . Welcome all insight. Thanks! Christian

    | Sundance_Kidd
    0

  • Recently I have been promoting custom long-form content development for major brand clients. For UX reasons we collapse the content so only 2-3 sentences of the first paragraph are visible. However, there is a "read more" link that expands the entire content piece.
    I have believed that the search bots would have no problem crawling, indexing and applying a positive SEO signal for this content. However, I'm starting to wonder. Is there any evidence that the Google search algorithm could possibly discount or even ignore collapsed content?

    | RosemaryB
    1
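
    A quick note on the mechanics, since answers tend to hinge on it: the pattern described usually looks like the hypothetical markup below, where the full text is present in the initial HTML and only hidden visually (content injected by JavaScript after the click is a different, riskier case).

        <!-- Hypothetical "read more" pattern: the full long-form text ships in
             the initial HTML, so crawlers receive it even though users see a teaser. -->
        <div class="long-form">
          <p>First two or three teaser sentences...</p>
          <div id="rest" hidden>
            <p>The rest of the long-form content...</p>
          </div>
          <a href="#" onclick="document.getElementById('rest').hidden = false; return false;">Read more</a>
        </div>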

  • Hi, everybody. In the Moz ranking tool, for one of our clients' accounts (the client sells sports equipment), there is a trend where more and more of their landing pages are product pages instead of category pages. The optimal landing page for the term "sleeping bag" is of course the sleeping bag category page, but Google is sending searchers to a product page for a specific sleeping bag. What could be the critical factors that make the product page more relevant than the category page as the landing page?

    | Inevo
    0

  • Hi everyone, I've looked for an answer to this but I can't find one. Hopefully someone can help! I have a new client that is a builder. They currently have a .co.uk domain (e.g. businessname.co.uk). Would it help them if the website was businessname.builders instead? Thanks, Alex

    | WebsiteAbility
    0

  • Hey there Mozzers, if you have a website, for example www.example.com, and you wanted to target Australia and the UK and you owned the .com.au and .co.uk, would it be OK if everything redirected to the .com? I know that having the .com.au is a signal for Google, but the redirection is causing me trouble. Would it make a huge difference if everything redirected to the .com version of the site?

    | AngelosS
    0

  • Hi Mozers, I'm using unique content in the short description area, and it displays on the pages next to the product photo, which is great as it is. But is adding an informational description that repeats on every product page going to hurt us in SEO? A. See here an actual product (flagged for thin content in OSE).
    B. This is how I would like to set up each product page to improve them: see here a sample product with additional information/content.
    Here's my question: would setting my product pages to the B version be considered duplicate content by Google?

    | melinmellow
    0

  • Hi everyone, Had a bit of a concern today: my weekly report has come through and my crawl issues have skyrocketed by over 400! It says my metas and titles are missing, but when I check through the site manually they all seem to still be there; I'm getting the same problem when I use Screaming Frog to crawl the site. I would really appreciate an explanation from someone as to why this is happening, as I am quite confused about the situation. Thank you people. Charlie. Our website is www.homelogic.co.uk 🙂

    | MintySEO
    0

  • Hello, We have set up breadcrumbs on some of our pages (example: https://www.globecar.com/en/car-rental/locations/canada/qc/montreal/airport-yul) for testing purposes, and for some reason they are still not showing up on Google: http://screencast.com/t/BSHQqkP69r6F Yet when I test the page with Google's Structured Data Testing Tool all is good: http://screencast.com/t/Fzlz3zae Any ideas? Thanks, Karim

    | GlobeCar
    0
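
    For reference, a minimal BreadcrumbList in JSON-LD for a page like the one above looks like the sketch below (the names and the shortened two-step trail are illustrative). Note that markup passing the Structured Data Testing Tool is no guarantee of display: breadcrumb rich snippets are applied at Google's discretion and can take a while to appear.

        <script type="application/ld+json">
        {
          "@context": "https://schema.org",
          "@type": "BreadcrumbList",
          "itemListElement": [
            { "@type": "ListItem", "position": 1, "name": "Locations",
              "item": "https://www.globecar.com/en/car-rental/locations" },
            { "@type": "ListItem", "position": 2, "name": "Montreal Airport (YUL)",
              "item": "https://www.globecar.com/en/car-rental/locations/canada/qc/montreal/airport-yul" }
          ]
        }
        </script>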

  • Usually, foreigners use a local computer (system language) on a local Google, but search in their native language for topics related to their native language. It's a mess for us, and for Google it seems. For example, if a Venezuelan person living in Paris, France, on a French AZERTY computer (not literally... 🙂 there is no such thing as a French computer), on Google.fr, searches for a Venezuelan arepa recipe using the keyword "receta arepa" (in Spanish, of course), he is expecting the results to be in Spanish. How could we optimize our pages with hreflang or other tricks to rank as high as possible with that configuration? This is obviously just an example, but it represents the problem I am facing.

    | rootsalad
    0
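
    A minimal hreflang sketch for this situation, assuming hypothetical Spanish and French versions of the same recipe page: hreflang targets the language of the page (optionally plus a region), so the Spanish version can be served to Spanish-language searchers even on Google.fr. Each variant must carry the full set of annotations, including a reference to itself.

        <link rel="alternate" hreflang="es" href="https://example.com/es/receta-arepa/" />
        <link rel="alternate" hreflang="fr" href="https://example.com/fr/recette-arepa/" />
        <link rel="alternate" hreflang="x-default" href="https://example.com/receta-arepa/" />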

  • I have merged domains before and it went rather smoothly following the Moz guide: https://moz.com/blog/save-your-website-with-redirects. I've got a new challenge ahead of me though, in that a client is buying the blog subdirectory associated with another domain. So it's the blog only, not the complete domain, and therefore a change of address for a site section doesn't exist. I believe the course of action will be the same, except we'll just skip the change-of-address step since the original owner wants to maintain the TLD. Part of the contract is that we'll get the content, which will be ported over to our domain, and he'll maintain the 301s as requested and in perpetuity. Our domain is not brand new and has some credible links. Has anyone encountered a transition of a partial domain before? Thanks for your help/suggestions.

    | seoaustin
    0
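
    On the seller's side, the 301s for just the blog section don't need a rule per URL; a single pattern covers the lot. A hypothetical .htaccess sketch (the domains and the /blog/ path are stand-ins):

        # Permanently redirect only the /blog/ section, path preserved,
        # to the buying site; the rest of the domain is untouched.
        RewriteEngine On
        RewriteRule ^blog/(.*)$ https://buyer-domain.com/blog/$1 [R=301,L]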

  • Hi, I want to find out how important people think it is to have the keyword as the first word in the meta title. Is this something that would even make a difference?

    | BeckyKey
    0

  • Hi guys, I have a client site that I've recently come onboard with that was published late last year, not really optimized for anything, and in a moderately (but not very) competitive search space. In early April we optimized the home page and a couple of other pages, and we have since built about 5-6 (high-quality, partial-match) links to it; a press release was done in the middle of last month. The only other thing we did was change the site from non-www to www and set this as the preferred domain in Search Console. Over 6 weeks since that all began, we're still not on the radar at all for any of our main keywords - nowhere. The only thing we are really ranking for is our brand name, but with the wrong (press release, not home!) page, and it's bouncing a lot. All of the pages seem to be indexed, and we are ranking for one other (inconsequential) keyword, but 99 is the highest it has reached. An SEO friend told me to build some citations, but this is not a local business, nor are we trying to rank locally. Can anyone please suggest why it might be taking so long, and what else I could try? I imagine more links will help, but results from our outreach are hard to predict, so if there were another safe link type that could help me figure out ASAP whether this domain is in trouble, that would be ideal. Thanks very much in advance for any help you can provide. Ulla

    | Ullamalm
    0

  • Hello Moz World! I have been reading up on robots.txt files, and I understand the basics. I am looking for a deeper understanding of when to deploy particular tags, and when a page should be disallowed because it will affect SEO. I have been working with a software company that has a News & Events page which I don't think should be indexed. It changes every week and is only relevant to potential customers who want to book a demo or attend an event, not so much to search engines. My initial thinking was that I should use a noindex/follow tag on that page, so the page would not be indexed but all the links on it would be crawled. I decided to look at some of our competitors' robots.txt files: Smartbear (https://smartbear.com/robots.txt), b2wsoftware (http://www.b2wsoftware.com/robots.txt) & labtech (http://www.labtechsoftware.com/robots.txt). I am still confused about what type of tags I should use, and how to gauge which set of tags is best for certain pages. I figure a static page is pretty much always good to index and follow, as long as it's public. And I should always include a sitemap file. But what about a dynamic page? What about pages that are out of date? Will this help with soft 404s? This is a long one, but I appreciate all of the expert insight. Thanks ahead of time for all of the awesome responses. Best Regards, Will H.

    | MarketingChimp10
    0
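
    One mechanical point worth separating out: robots.txt controls crawling, while the noindex/follow behavior described above is set with a meta robots tag in the page's HTML. A minimal sketch for the News & Events page:

        <!-- In the <head> of the News & Events page: keep it out of the index,
             but let the crawler follow the links on it. -->
        <meta name="robots" content="noindex, follow" />

    This only works if the page is not disallowed in robots.txt; a blocked page is never fetched, so the tag is never seen.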

  • The websites that use the Yotpo review solution can display product galleries like this: //imgur.com/4dHUh7O (original source page: http://skibox.fr/fr/veste-de-pluie-dynastar-long-shell.html). Every product in the gallery generates a link to https://yotpo.com, such as https://yotpo.com/go/eAaQNjJh. This generates a huge number of links detected in Google Search Console (GWMT) for yotpo.com, and every one of those links 301 redirects to a page of the website using the Yotpo review solution. Example: https://yotpo.com/go/eAaQNjJh redirects to http://skibox.fr/fr/batons-de-ski-leki-worldcup-lite-slalom-4683.html?#.VymNdr5_TwY It seems to be similar to shortened URL links (which are legitimate), but I am not sure about the influence of this; what do you think? Is this really influencing (negatively) the (potential) rankings of https://www.yotpo.com subdomain pages? What would you recommend doing?

    | KobyYotpo
    0

  • We're looking at potentially creating a robots.txt with 1,450 lines in it. This will remove 100k+ pages from the crawl that are all old pages (I know, the ideal would be to delete/noindex, but that's not viable unfortunately). The issue I'm thinking of is that a large robots.txt will either stop the robots.txt from being followed or will slow our crawl rate down. Does anybody have any experience with a robots.txt of that size?

    | ThomasHarvey
    0
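
    Not an answer on the size limit itself, but wildcard patterns (which Google and Bing support) can often collapse a file like this dramatically. A sketch with hypothetical patterns, assuming the 100k+ old pages share some path or parameter structure:

        # robots.txt sketch: a few wildcard rules can replace
        # hundreds of explicit Disallow lines.
        User-agent: *
        Disallow: /old-section/
        Disallow: /*?legacy-param=
        Disallow: /*-archive.html$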

  • I could not find any articles or mentions of this online, and I am wondering if it has to do with the website being an "m-dot" website and not responsive. Any thoughts would be appreciated!

    | accpar
    0

  • Hello to everyone. In the last 2 weeks my website emorroidi.imieirimedinaturali.it has been behaving strangely in the SERPs: it disappears for the keywords it ranked for, then reappears, and so on. Here's the chronicle of the last few days:
    12/6: message in GWT: "Improvement of the visibility of the website in search."
    12/6: the website disappears for all the keywords it ranked for.
    16/6: the website reappears for all the keywords it ranked for, with some keywords higher in the rankings.
    18/6: the website disappears for all the keywords it ranked for.
    22/6: the website reappears for all the keywords it ranked for.
    24/6: the website disappears for all the keywords it ranked for...
    I can't explain this situation. Could it be a penalty? What kind? Thank you.

    | emarketer
    0

  • My question is regarding some difficult URL-structure questions in an online real estate marketplace. Our problem is that our customers' search behavior is very broad, but their intent very narrow. For IRL examples go to objektia (dot) se. Example: "Lease commercial space Stockholm" is a usual search query, wherein the user searches for the broad category (commercial space) in the geography of Stockholm. The problem is that their intent is actually much more specific, since: commercial space === [office, retail, industrial, storage, properties]. I have previously asked the forum for help regarding the placement of products in our URL hierarchy, and I got some good answers. We chose to go the route of alternative #3, i.e. placing our products (real estate listings) directly beneath their respective category (neighborhoods): https://moz.com/community/q/placement-of-products-in-url-structure-for-best-category-page-rankings Basically we chose to have the following URL structure: Structure: domain.se/category/subcategory/product Example: domain.se/Stockholm/suburb-of-stockholm/specific-listing-12 Now the question is, how do we deal with the space-type modifier in our URL structure? Nobody wants to see retail space when they are after office space, so our current search page solution (category page) is the following: Structure: domain.se/space-type/neighborhood/sub-neighborhood All space types: domain.se/commercial-space/neighborhood/sub-neighborhood Specific space type: domain.se/office-space/neighborhood/sub-neighborhood Now, the problem with our current solution, in combination with our intent to move our product pages into this hierarchy, is that every product page will be (and is today) linking towards its specific type category. Our internal link network would be built around type categories that are extremely relevant from a UX standpoint, but almost worthless (surprisingly) from an organic-traffic standpoint. Also, every search page (category page) for each space type would be competing for the same broad search phrase. The alternative is to place the type modifier at the end of the URL: Category page, type at the end: domain.se/neighborhood/sub-neighborhood/type Listing page (product page), type at the end: domain.se/neighborhood/sub-neighborhood/street-address/type/listing-12

    | Viktorsodd
    0

  • Hello, We want to get a top-notch company to look at us for 4-5K. We don't need SEO; we've got plenty of motion through the press and word of mouth, but if an all-around agency were to give good advice, I could get them some time with our CEO. How do I get the best for only 4-5K? We may continue with services or it may just be a one-time thing. Who should I contact? Bob

    | BobGW
    0

  • Hi, I'm trying to find a sitemap generator/plugin that I can point my client to. My client is using Magento, and is one of the largest sports stores in Norway (around 20,000 products). I've heard there's one that can set the <priority> according to page views, sold units, and other relevant parameters, and that also takes care of the other elements in the sitemap.xml. Any good recommendations out there? 🙂

    | Inevo
    0
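
    For reference, the <priority> element in question sits on each <url> entry of the sitemap; a generator of the kind described would compute the value per URL from page views or units sold. A hypothetical fragment:

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>https://example-store.no/ski/some-popular-ski</loc>
            <lastmod>2016-05-01</lastmod>
            <changefreq>weekly</changefreq>
            <priority>0.9</priority> <!-- high: strong page views / unit sales -->
          </url>
        </urlset>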

  • Hello Moz World, So, I'm trying to wrap my head around all of the different robots.txt files. I decided to dive into a site like Twitter and look at their robots.txt. And now I'm super confused. What are they telling the search engines with /hashtag/*src=? Why don't they just use: User-agent: * Disallow: But instead they address each search engine. Is there any benefit to this? Thanks for all of the awesome responses!!! B/R Will H.

    | MarketingChimp10
    0
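
    For what it's worth, per-engine sections work like the sketch below: a crawler obeys only the most specific User-agent group that matches it and ignores the rest, which is the main reason to address engines individually (the rules here are illustrative, not Twitter's actual file):

        # Googlebot follows its own group and skips the catch-all below.
        User-agent: Googlebot
        Disallow: /hashtag/*src=

        # Every other crawler falls through to this group.
        User-agent: *
        Disallow: /search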

  • Hello, we migrated a domain onto a new WordPress site over a year ago. We redirected (with the plugin Simple 301 Redirects) all the old URLs (.asp) to the corresponding new WordPress URLs (non-.asp). The old pages are still indexed by Google, even though when you click on them you are redirected to the new page. Can someone tell me reasons why they would still be indexed? Do you think it is hurting my rankings?

    | phogan
    0

  • Hi everybody, I have a client that used to rank very well in 2014. They launched an updated URL structure in early January 2015, and since then they rank very low for most of their keywords (except the brand keywords). I started working with them early this year and tried to understand what happened, but they have no access to their old website and I can't really compare. I tried the standard optimisation methods but nothing seems to work. I have a feeling they have been penalised by Google, probably a Panda penalty, but their Webmaster Tools account does not show any penalties under manual actions. Does Google impose penalties that are not shown in Webmaster Tools? If so, is there a way I can find out what the penalties are and what is wrong exactly, so we can start fixing it? The website is for a recruitment agency and they have around 400 jobs listed on it. I would love to share the link to the website but I don't believe the client would be happy with that. Thank you in advance.

    | iQi
    0

  • Hi guys, hope you're well. I have a problem with my new website. I have 3 pages with the same content: http://example.examples.com/brand/brand1 (good page) http://example.examples.com/brand/brand1?show=false http://example.examples.com/brand/brand1?show=true The good page has rel=canonical and it is the only page that should appear in search results, but Google has indexed all 3 pages... I don't know what I should do now, but I am thinking of 2 possibilities: Remove the filters (true, false), leave only the good page, and show a 404 page for the other pages. Update robots.txt with a disallow for these parameters and remove those URLs manually. Thank you so much!

    | thekiller99
    0
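
    For reference, the canonical setup described means serving the same tag on all three variants, as in the sketch below. One caveat before choosing the robots.txt option: once the parameter URLs are disallowed, Google can no longer crawl them to see the canonical, so the tag plus a manual removal (or the URL Parameters tool) is usually the cleaner route.

        <!-- Served identically on /brand/brand1, ...?show=false and ...?show=true -->
        <link rel="canonical" href="http://example.examples.com/brand/brand1" />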

  • Hello! We have a website with various city tours and activities listed on a single page (http://vaiduokliai.lt/). The list changes depending on filtering (birthday in Vilnius, bachelor party in Kaunas, etc.). The URL doesn't change; content changes dynamically. We need to make the URL visible for each category, then optimize it for different keywords (for example, "city tours in Vilnius" for a list of tours and activities in Vilnius, with an appropriate URL like /tours-in-Vilnius). The problem is that activities overlap very often in different categories, so there will be a lot of duplicate content on different pages. In such a case, how severe could the penalty for duplicate content be?

    | jpuzakov
    0

  • Hi, I have several questions about starting a new domain due to Penguin. The site is: http://bajajlaw.com. Quick backstory: This site was hit every time Penguin rolled out. No clean-up was done until October 2015. At that time, I took over the project. My efforts include: (1) Remove'em, (2) manual removal, and (3) the Disavow Tool. The HP went from being at around #50 for the target KW (San Diego criminal defense attorney) to about #25. It never really moved higher than that. However, I redid the content for the internal pages (DV, theft crimes, etc.) and they are all ranking fairly well (first page or top of the 2nd). In short, the penalty only seems to affect the HP, not the internal pages. Instead of waiting for Penguin to roll out, the client wants to move forward with a new domain. My questions are as follows: 1. Can I use the same content for the internal pages and 301 from the old internal pages to the new? 2. Should I 301 from the old to the new domain for the HP, or not? 3. If I do a 301 from an internal page to a new internal page, does that have the same effect as doing a 301 from the old HP to the new HP? I have read various opinions on this topic. I'd appreciate feedback from anyone who has experience doing this sort of thing. Thanks. P.S. I'm inclined to wait for P4 to roll out, but given that nobody seems to know when that might be, it's hard for me to advise the client to keep waiting for it.

    | mrodriguez1440
    0

  • I'm working through duplicate content issues. The tracking code or session ID in the URL is being recognized as a different page than the original. Example: www.example.com is duplicate content with www.example.com?_nk=x&ad=y&_ga=z, which is tied to a marketing campaign. If my setup in the URL Parameters tool is set to Effect = None, Crawl = Representative URL, then do I: 1. Miss all the traffic being driven to the ?_nk page?
    2. With a representative URL, there would still be two indexed listings: the .com and the .com?_nk... right? Neither is good. Redirecting all of the URLs is not an option because there are hundreds that would need to be redirected. And I also don't want to slow down page load time with excessive redirects, which has been the case when adding 100+ redirects for the recent website migration we did.

    | johnnybgunn
    0

  • We possibly have internal links on our site that point to 404 pages as well as links that point to old pages. I need to tidy this up as efficiently as possible and would like some advice on the best way to go about this.

    | andyheath
    0

  • Using Tag Assistant (a Google Chrome add-on), we have found that the site's pages have GA code (also see screenshot 1). However, when we used Screaming Frog's filter feature (Configuration > Custom > Search > Contain/Does Not Contain, see screenshot 2), SF displayed several URLs (maybe all) of the site under 'Does Not Contain', which means that in SF's crawl, the site's pages have no GA code (see screenshot 3). What could be the problem? Why does SF state that there is no GA code in the site's pages when, in fact, there is code according to Tag Assistant/Manager? Please give us steps/ways to fix this issue. Thanks!

    | jayoliverwright
    0

  • We all know that more and more people are increasing the number of different categories that eCommerce sites have. Say, for example, you have over 3,000 different products, all categories contain unique text at the top, all of the categories link to each other (so loads of internal linking), and no two categories contain the exact same products. My question is this: is there ever a stage at which you could have created too many categories? Alternatively, do you think you should just keep creating categories based on what our customers search for?

    | the-gate-films
    1

  • This is our primary sitemap: https://www.samhillbands.com/sitemaps/sitemap.xml We have about 750 location-based URLs that aren't currently linked anywhere on the site: https://www.samhillbands.com/sitemaps/locations.xml Google is indexing most of the URLs because we submitted the locations sitemap directly for indexing. Thoughts on that? Should we just create a page that contains all of the location links and make it live on the site? Should we remove the locations sitemap from separate indexing... because of duplicate content?
    | # | Sitemap | Type | Processed | Items | Submitted | Indexed |
    |---|---------|------|-----------|-------|-----------|---------|
    | 1 | /sitemaps/locations.xml | Sitemap | May 10, 2016 | Web | 771 | 648 |
    | 2 | /sitemaps/sitemap.xml | Sitemap index | May 8, 2016 | Web | 862 | 730 |

    | brianvest
    0

  • Hi guys, this drives me nuts. I hear all the time that any time value is exchanged for a link, it technically violates Google's guidelines. What about real organizations, chambers of commerce, trade groups, etc. that you are a part of that have online directories with do-follow links? On one hand, people will say these are great links with real value outside of search and great for local SEO... and on the other hand, some hardliners are saying that these technically should be no-follow. Thoughts???

    | RickyShockley
    0

  • We're working on developing mobile pages using the dynamic serving method, and we are planning to make only a number of important pages (not the whole site) mobile friendly. To keep the user experience consistent, the new mobile site will only have internal links to pages that are mobile friendly. Question: if an existing non-mobile page ranks #1 in the mobile SERPs today, but will not have a mobile-friendly version and will not be linked from within the mobile-friendly site, will there be any impact on its ranking? Assumptions: Google's mobile/smartphone bots will not see a link to this page. The page will still be accessible to Google's desktop bots.

    | tomchu
    0
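
    As background, dynamic serving is signalled with the Vary HTTP response header; that header is what tells Googlebot to crawl a URL with its smartphone user agent as well as its desktop one. A minimal, server-agnostic sketch of the response:

        HTTP/1.1 200 OK
        Content-Type: text/html; charset=utf-8
        Vary: User-Agent

    Pages served without the header are assumed to return the same HTML to every user agent.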

  • We have 700 city pages on our site. We submitted them to Google via https://www.samhillbands.com/sitemaps/locations.xml, but they have only indexed 15 so far. Yes, the content is similar on all of the pages... thoughts on getting them to index the remaining pages?

    | brianvest
    0

  • We recently launched a new site on https, and I'm seeing a few errors in the SERPs with our meta descriptions as our pages are starting to get indexed. We have the correct metadata in our code, but it's being output in Google differently. Example: http://imgur.com/ybqxmqg Is this just a glitch on Google's side, or is there an obvious issue anyone sees that I'm missing? Thanks guys!

    | Brian_Owens_1
    0

  • I'm running a casting site with job listings. On the site there is also an archive with the jobs from the past 10 years or so. I'm experiencing lots of duplicate titles, like "Actors needed for commercial, Copenhagen". It's really hard (impossible) to make sure that all these titles are unique, since a lot of similar jobs come in. So, my question is:
    Any ideas on how to get rid of this? I have some ideas myself, but I don't know if they are good:
    1. Add a job ID number to every job offer, like "Actors needed for commercial, Copenhagen (34343)".
    2. Add a "noindex" to the job offers which have expired (since people cannot apply for them anymore anyway), i.e. basically all the archived jobs.
    3. Add the date posted to the title, like "Actors needed for commercial, Copenhagen (15 April 2016)".
    Hope you guys can help.
    /Kasper

    | KasperGJ
    0

  • I have a very basic question on managing categories in WordPress. We have an Android website, and we cover news, rumors, and tips and tricks about new devices. We have been creating categories for the new devices, or at least for the popular ones which are launched every year, and linking to them internally in the hope that it would improve the pages' authority and ranking. For example, we have a category page for the Moto X, another one for the Moto X (2014), and one more for the Moto X (2015). One of the reasons for creating a category was to ensure that it is easier for readers to get information about a particular device, rather than going to a category page that has information about all the models. However, the problem with this strategy, we're now realizing, is that we have to build page authority for each new category page from scratch, which can take time. So we are thinking of reusing the same category for multiple models, i.e. reusing the Moto X category page for the Moto X (2016). However, we are not sure if it would be the right approach, as we would be linking to the same category page with different anchor texts. So while it would be good to reuse a page rather than rebuild page authority from scratch, would we be diluting the authority for the main keyword by using it for different models? I would love to hear your thoughts on how we should be handling categories and internal links in this case.

    | Gautam
    0

  • Good day to you, Mozers. I have a website that sells a certain product online which, once bought, is delivered to a point of sale where the client's car gets serviced. This website has a shop, products, and informational pages that are duplicated by the number of physical PoS. The organizational decision was that every PoS was supposed to have its own little site that could be managed and modified. Examples: every PoS could have a different price on its product; some of them have more services available and some may have fewer, but the content on these service pages doesn't change. I get over a million URLs that are, supposedly, all treated with canonical tags pointing to their respective main page. The reason I use "supposedly" is because verifying the logic they used behind the canonicals is proving to be a headache, but I know, and I've seen, a lot of these pages using the tag. E.g.: https://mysite.com/shop/ <-- https://mysite.com/pointofsale-b/shop https://mysite.com/shop/productA <-- https://mysite.com/pointofsale-b/shop/productA The problem is that I have over a million URLs being crawled, when really less than a tenth of them may have organic traffic potential. My questions are:
    For products, I know I should tell them to put the URL as close to the root as possible and dynamically change the price according to the PoS the end user chooses; or even redirect all shops to the main one and only use that one. I need a short-term solution to test/show whether it is worth investing in development to correct all these useless duplicate pages. Should I use robots.txt and block off parts of the site I do not want Google to waste its time on? I am worried about indexation, accessibility, and crawl budget being wasted. Thank you in advance.

    | Charles-O
    1
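
    As a short-term test of the robots.txt idea raised above, a sketch along these lines (the /pointofsale- prefix is a hypothetical stand-in for however the PoS copies are actually pathed) would keep crawlers out of the duplicates while the main shop stays crawlable. The trade-off: blocked URLs are no longer fetched, so the canonical tags they carry stop being seen.

        # Hypothetical robots.txt: stop crawling every per-point-of-sale copy
        # (robots.txt rules are prefix matches), saving crawl budget for the main shop.
        User-agent: *
        Disallow: /pointofsale-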

  • I'm designing a hosted product my clients will integrate into their websites; their end users would access it via my clients' customer-facing websites. It is a product my clients pay for which provides a service to their end users, who would have to log in to my product via a link provided by my clients. Most clients would choose to incorporate this link prominently on their home page and site nav.
    All clients will be in the same vertical market, so their sites will be keyword-rich and related to my site. Many may even be .orgs and .edus. The way I see it, there are three main ways I could set this up within the product.
    I want to know which is most beneficial, or if I'm missing anything.
    1: They set up a subdomain at their domain that serves content from my domain. product.theirdomain.com would render content from mydomain.com's database.
    product.theirdomain.com could have footer and/or other no-follow links to mydomain.com with target keywords. The risk I see here is having hundreds of sites with the same target keyword linking back to my domain.
    This may be the worst option, as I'm not sure whether the nofollow will help, because I know Google considers this kind of link to be a link scheme: https://support.google.com/webmasters/answer/66356?hl=en
    2: They link to a subdomain on mydomain.com from their nav/site.
    Their nav would include an actual link to product.mydomain.com/theircompanyname
    Each client would have a different "theircompanyname" link.
    They would decide and/or create their link method (graphic, presence of alt tag, text, what text, etc.).
    I would have no control aside from requiring them to link to that URL on my server.
    3: They link to a subdirectory on mydomain.com from their nav/site.
    Their nav would include an actual link to mydomain.com/product/theircompanyname
    Each client would have a different "theircompanyname" link.
    They would decide and/or create their link method (graphic, presence of alt tag, text, what text, etc.).
    I would have no control aside from requiring them to link to that URL on my server.
    In all scenarios, my marketing content would be set up around mydomain.com both as static content and a blog directory, all with SEO-attractive URL slugs. I'm leaning towards option 3, but would like input!

    | emzeegee
    0

  • When we do a search for our brand, we get the following results in google.com.au (see image attachment). As outlined in red, there are listings in Google that result in 404 Page Not Found URLs. What can we do to get Google to recrawl, or to ensure that these broken URLs are no longer listed in Google? Thanks for your help here!

    | Gavo
    0

  • Hello, We have a blog, and at the end of each blog post (and from the sidebar) we link to one main product page (tagged with a particular query string). Google will see all of these internal links from every blog post pointing back to this page. Do you think this could cause a problem, and should these links be nofollowed? I think Google will detect that this is a kind of "navigation", as the code will be the same across all webpages. Above all, I think making them nofollow would be worse, because it might trigger some sort of PageRank-sculpting algo filter, if that still exists. Thanks, Conrad

    | conalt
    0

  • Hey there Mozzers! I have a question about global SEO. I have a website that has multiple TLDs (.com.au, .co.uk, .com, etc.). Each of these redirects depending on the user's location. Where should my link building be focused? What global SEO techniques do you suggest?

    | AngelosS
    0

  • Hey Moz, I'm a rather experienced SEO who just encountered a problem I have never faced. I am hoping to get some advice or be pointed in the right direction. I just started work for a new client. Really great client and website; nicer than most in design/content. They will need some rel=canonical work, but that is not the issue here. The traffic looked great at first glance: 131k visits in April. The Google Analytics Acquisition overview showed 94% of the traffic as organic. When I dug deeper and looked at the organic sources, I saw that Google was 99.9% of it. Normal enough. Then I looked at the time on site and my jaw dropped: 118,454 organic new users from Google stayed on the site for only 3 seconds. There is no way that the traffic is real. It does not match what Google Webmaster Tools, Moz, and Ahrefs are telling me. How do I stop a service that is sending fake organic Google traffic?

    | placementLabs
    0

  • Hey everybody! It's been a while since I've been on the boards! I am reworking a site and have been looking into its Redirection plugin. I personally tend to lean towards just using .htaccess because, well, why not. However, when looking deeper into the plugin I found myself a little confused by its redirection wording. RewriteRule ^/products/landing-page-october-2015/$ /products/special-education-news-october-2015/ [R=301,L] Is that the same thing as a classic Redirect 301?

    | HashtagHustler
    0
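
    Functionally, yes: with the [R=301,L] flags, mod_rewrite sends the same permanent redirect a classic Redirect 301 does. The practical difference is that Redirect (mod_alias) matches a plain URL-path prefix while RewriteRule matches a regex. A side-by-side sketch of the equivalent forms:

        # mod_alias form: plain prefix match, no regex.
        Redirect 301 /products/landing-page-october-2015/ /products/special-education-news-october-2015/

        # mod_rewrite form: regex match; R=301 makes it permanent, L stops
        # further rewrite processing for this request.
        RewriteEngine On
        RewriteRule ^products/landing-page-october-2015/$ /products/special-education-news-october-2015/ [R=301,L]

    One caveat: in a .htaccess file the leading slash is stripped before RewriteRule patterns are tested, so the plugin's ^/products/... form (written for server-config context) would not match there; the sketch above uses the .htaccess form.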

  • Hi experts, I own a site for casting jobs (Site 1) and a site for selling paintings (Site 2). For a long time, I've had a link at the bottom of Site 1 linking to Site 2 (basically: "Partner link:" followed by a link to Site 2). Site 1 is for me the only important site, since it's where I'm making my monthly revenue. I added the link 5 years ago or so to try to boost Site 2. My questions are:
    1. Is it somehow bad for SEO for Site 1, since the two sites have nothing to do with each other and are basically just both owned by me?
    2. Would it make sense to link from Site 2 to Site 1 instead?

    | KasperGJ
    0

  • Hi everyone! We recently 410'ed some URLs to decrease the number of URLs submitted and hopefully increase our crawl rate. We had some dynamically generated sub-URLs for pagination that are shown as 404s in Google. These sub-URLs were canonical to the main URLs and not included in our sitemap. Ex: We assumed that if we 410'ed example.com/url, then the dynamically generated example.com/url/page1 would also 410, but instead it 404'ed. Does it make sense to go through and 410 these dynamically generated sub-URLs, or is it not worth it? Thanks in advance for your help! Jeff

    | jeffchen
    0
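
    If it does turn out to be worth doing, the sub-URLs don't need a rule each; one mod_rewrite pattern can send a 410 for the page and its pagination together. A hypothetical Apache sketch matching the example.com/url example:

        # Return 410 Gone for the removed page and any /pageN sub-URL of it.
        # The G flag forces a 410 response; "-" means no substitution is made.
        RewriteEngine On
        RewriteRule ^url(/page[0-9]+)?/?$ - [G,L]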
