Questions created by prima-253509
-
Considering Switch to old Domain - Any Bad Karma?
So here is the issue. I am working with a company that used to have a branded domain. Then they split the domain into two separate keyword-rich domains and tried to change the branding to match. This made for a really long brand name that is difficult to actually rank for, as it is made up mostly of high-traffic key terms, and it also created brand confusion because all of the social accounts still operate under the old brand name.

We are considering a new brand initiative: going back to the original brand name, as it better meets our business objectives (they still get traffic from branded searches under the old brand), and to the old branded web domain.

My question is whether there is any added risk in going back to an old domain that has been forwarded to the new domain for the past 2 years. I know the risks and problems of a domain name change, but I am not as certain about the added complication of moving back to an old domain and essentially reversing the flow of 301s. Any thoughts? Cheers!
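One practical precaution, before and after the flip, is to verify where each important URL on both domains actually ends up, so that reversing the 301s doesn't leave loops or chains behind. Below is a minimal sketch of such a check using only the Python standard library; the domain in the example is a placeholder, not the actual site.

```python
import http.client
from urllib.parse import urlparse, urljoin

def follow_redirects(url, max_hops=10):
    """Walk a redirect chain one hop at a time, flagging loops."""
    seen, chain = set(), []
    for _ in range(max_hops):
        if url in seen:
            chain.append((url, "LOOP"))
            break
        seen.add(url)
        parts = urlparse(url)
        Conn = http.client.HTTPSConnection if parts.scheme == "https" else http.client.HTTPConnection
        conn = Conn(parts.netloc, timeout=10)
        conn.request("HEAD", parts.path or "/")
        resp = conn.getresponse()
        status = resp.status
        location = resp.getheader("Location")
        conn.close()
        chain.append((url, status))
        if status in (301, 302, 307, 308) and location:
            url = urljoin(url, location)  # Location may be relative
        else:
            break
    return chain

# Placeholder domain: confirm each key page 301s cleanly, with no loops or long chains
for hop, status in follow_redirects("http://old-brand-domain.example/some-page"):
    print(status, hop)
```

Running it against a list of the most-linked pages on both domains, once before and once after the reversal, would catch any redirect loops introduced by flipping the 301s.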
Branding | | prima-2535090 -
Tool for tracking actions taken on problem URLs
I am looking for tool suggestions that assist in keeping track of problem URLs, the actions taken on them, and the tracking and testing of a large number of errors gathered from many sources.

What I want is to be able to export lists of URLs and their problems from my current set of tools (SEOmoz campaigns, Google WM, Bing WM, Screaming Frog) and import them into a type of centralized DB that will let me see all of the actions that need to be taken on each URL, while at the same time removing duplicates, since each tool finds a significant number of the same issues.

Example case: SEOmoz and Google identify URLs with duplicate title tags (example.com/url1 & example.com/url2), while Screaming Frog sees that example.com/url1 contains a link that is no longer valid (so it terminates in a 404). When I import the three reports into the tool, I would like to see that example.com/url1 has two issues pending, a duplicated title and a broken link, without duplicating the entry that both SEOmoz and Google found.

I would also like to see historical information on each URL: whether I have written redirects to it (to fix a previous problem), or whether it used to be a broken page (i.e. a 4XX or 5XX error) and is now fixed. Finally, I would like to not be bothered with the same issue twice. As Google is incredibly slow at updating their issues summary, I would like to not import duplicate issues (so the tool should recognize that the URL is already in the DB and that the issue has been resolved).

Bonus for any tool that uses the Google and SEOmoz APIs to gather this info for me. Bonus bonus for any tool that is smart enough to check issues as they come in and mark them as resolved (for instance, if a URL is reported with a 403 error, the tool would check on import whether it still resolves as a 403; if it does, it would be added to the issue queue, and if not, it would be marked as fixed).

Does anything like this exist? How do you deal with tracking and fixing thousands of URLs and their problems, plus the duplicates created from using multiple tools? Thanks!
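For what it's worth, the core of the centralized DB described above is small enough to sketch. Here is a minimal, hypothetical version in Python with SQLite; the CSV column names ("url", "issue") and export filenames are assumptions, and each tool's actual export would need its own mapping onto this schema.

```python
import csv
import sqlite3

conn = sqlite3.connect("seo_issues.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS issues (
        url    TEXT NOT NULL,
        issue  TEXT NOT NULL,
        source TEXT NOT NULL,
        status TEXT NOT NULL DEFAULT 'open',
        UNIQUE (url, issue)  -- the same issue reported by several tools is stored once
    )
""")

def import_report(path, source):
    """Load one tool's CSV export; already-known (url, issue) pairs are ignored."""
    with open(path, newline="") as fh:
        for row in csv.DictReader(fh):
            conn.execute(
                "INSERT OR IGNORE INTO issues (url, issue, source) VALUES (?, ?, ?)",
                (row["url"], row["issue"], source),
            )
    conn.commit()

# Hypothetical export files from two of the tools mentioned above
import_report("seomoz_export.csv", "seomoz")
import_report("google_wm_export.csv", "google")

# All open actions for one URL, across every tool that reported it
for issue, source in conn.execute(
    "SELECT issue, source FROM issues WHERE url = ? AND status = 'open'",
    ("http://example.com/url1",),
):
    print(issue, source)
```

The UNIQUE constraint plus INSERT OR IGNORE handles the dedupe across tools, and the status column is where the re-check-on-import logic would hang: a pre-import HTTP check could flip rows to 'fixed' instead of re-queuing them.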
Moz Pro | | prima-2535090 -
Google Hiding Indexed Pages from SERPs?
Trying to troubleshoot an issue with one of our websites and noticed a weird discrepancy. Our site should only have 3 pages in the index: the main landing page with a contact form and two policy pages. Yet Google reports over 1,100 pages (that part is not a mystery, I know where they are coming from... multi-site installations of popular CMSs leave much to be desired in actually separating websites). Here is a screenshot showing the results of the site: command: http://www.diigo.com/item/image/2jing/oseh

I have set my search settings to show 100 results per page (the maximum). Everything is fine until I get to page three, where I get the standard "In order to show you the most relevant results, we have omitted some entries very similar to the 122 already displayed." But wait a second: I clicked on page three, yet now there are only two pages of results and the number of results reported has dropped to 122. http://www.diigo.com/item/image/2jing/r8c9

When I click on "show omitted results" I do get some more results, and the returned count jumps back up to 1,100. However, I only get three pages of results, and when I click on the last page the number of results returned changes to 205. http://www.diigo.com/item/image/2jing/jd4h

Is this a difference between indexes? (The same thing happens when I turn instant search back on: it shows over 1,100 results, but when I get to the last page of results it changes to 205.) Is there any other way of getting this info? I am trying to go in and identify how these pages are being generated, but I have to know which ones are showing up in the index for that to happen. Only being able to access 1/5th of the pages indexed is not cool. Anyone have any idea about this or experience with it?

For reference, I was going through with SEOmoz's excellent toolbar and exporting the results to CSV (using the Mozilla plugin). I guess Google doesn't like people doing that, so maybe this is a way to protect against scraping by only showing limited results in the site: command. Thanks!
Moz Pro | | prima-2535090 -
Disqus integration and cloaking
Hey everyone, I have a fairly specific question on cloaking and whether our integration with Disqus might be viewed as cloaking. Here is the setup. We have a site that runs on Drupal, and we would like to convert the comment handling to Disqus for ease of use. However, when JavaScript is disabled, the nice comment system and all of the comments from Disqus disappear. This obviously isn't good for SEO, but the user experience with Disqus is way better than with the native comment system.

So here is how we are addressing the problem. With Drupal we can sync comments between the native comment system and Disqus. When a user has JavaScript enabled, the containing div for the native comment system is set to display:none, hiding the submission form and all of the content, which is instead displayed through the Disqus interface. However, when JavaScript is not enabled, the native comment form and the comments are available to the user.

Could this be considered cloaking by Google? I know they do not like hidden divs, but it should be almost exactly the same content being displayed to the user (depending on when the last sync was run). Thanks for your thoughts, and if anyone is familiar with a better way to integrate Drupal and Disqus, I am all ears. Josh
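One way to sanity-check the setup is to fetch a page the way a non-JavaScript client would and confirm the synced native comments are present in the raw HTML. A minimal sketch using only the Python standard library; the URL and the comment text are placeholders:

```python
import urllib.request

# Placeholders: a real article URL and a snippet of a comment known to be synced
url = "http://example.com/some-article"
expected_comment = "text of a comment that exists on this article"

# Fetch the raw HTML, roughly what a non-JavaScript crawler sees on its first pass
html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")

if expected_comment in html:
    print("Native comment present in source: crawlers can see the content.")
else:
    print("Comment missing from raw HTML: only the JS-rendered Disqus copy exists.")
```

If the comments show up in the source this way, the hidden div is serving the same content to users and crawlers alike, which is the scenario being asked about above.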
White Hat / Black Hat SEO | | prima-2535090 -
Homepage outranked by sub pages - reason for concern?
Hey all, trying to figure out how concerned I should be about this. So here is the scoop; would appreciate your thoughts.

We have several eCommerce websites that have been affected by Panda, due to content from manufacturers and a lack of original content. We have been working hard to write our own descriptions and are seeing an increase in traffic again. We have also been writing blogs since February and are getting a lot of visits to them.

Here is the problem: our blog pages are now outranking our homepage when you type in site:domain-name. Is this a problem? Our home page does not show up until you are 3 pages in. However, when you type just our domain name into Google as a search, it does show up in position one with sitelinks under it. This is happening across both of our sites.

Is this a cause for concern, or just natural due to our blogs being more popular than our homepage? Thanks! Josh
Technical SEO | | prima-2535090 -
What tools do you use to submit a site to local yellowpages?
Hey all, two-part question for you. Do you use any tools to automatically submit websites to local yellow pages (example: http://business.intuit.com/directory/marketing/100_syndication_sites.jsp)? If so, which one and why? Are there any dangers to doing it this way? It seems that this might save a lot of time and be incredibly helpful for managing your brand profile pages in a centralized location. Some of the tools I am seeing also incorporate brand monitoring (which I know you can do through a variety of tools). Anyways, thoughts? Comments? Tips?
Branding | | prima-2535090 -
Google Analytics now showing social signals
Looking through Google Analytics today, I noticed that there is a section under top content that shows the number of Facebook likes and shares, tweets, Diggs, Delicious bookmarks, etc. Anyone else seeing this? [staff note: see answers, this came from a Chrome extension]
Social Media | | prima-2535090 -
Google Analytics Benchmarking Newsletter: How does your site perform?
With Google recently releasing benchmarking data, I am curious as to what you all see across the various types of website niches that you work with (eCommerce, news, blogs, services, small business, etc.), and how SEO'd websites compare with this "raw" data provided by Google.

We have one medium-size (12,000 products) strictly eCommerce website that has a bounce rate of 37% and an average time on site of 5:20, while two other medium-size eCommerce/blog sites have bounce rates of 57% and 59%, with average times on site of 2:37 and 2:30 respectively. Finally, I manage a website for a local small business that provides business and home cleaning services. This site has a bounce rate of 45% and a 1:40 average time on site.

How do your sites perform in these areas? Is it typical to see this great a disparity between strict eCommerce websites and sites that are both informational and transactional in nature? What about other kinds of websites? Cheers!
Search Behavior | | prima-2535091 -
Bing Update?
With all of the talk about Panda V2, I am just curious as to whether Bing has also updated their algorithm in the past month. I have seen some reports of the number of indexed pages dropping in half on June 10th, as reported in Bing's Webmaster Tools. We experienced that drop on our websites as well, along with a significant decrease in rankings due to some high-performing pages dropping out of their index. So, just wondering if anyone else has seen this or has any info on an algorithm update rolled out by Bing. Cheers, Josh
Algorithm Updates | | prima-2535090 -
What is the effect of a proxy server replicating a site on SEO
I have heard of PPC companies that set up a proxy server to replicate your site so that they can use their own tracking methods for their reports. What effect, if any, does this have on a site's SEO?
Paid Search Marketing | | prima-2535091 -
Tool for scanning the content of the canonical tag
Hey all, question for you. What is your favorite tool/method for scanning a website for specific tags? Specifically (as my situation dictates now) for canonical tags? I am looking for a tool that is flexible, hopefully free, and highly customizable (for instance, one where you can specify the tag to look for).

I like the concept of using Google Docs with the importXML feature, but as you can only use 50 of those commands at a time, it is very limiting (http://www.distilled.co.uk/blog/seo/how-to-build-agile-seo-tools-using-google-docs/). I do have a campaign set up using those tools, which is great, but I need something that returns a response faster and can get data from more than 10,000 links.

Our CMS unfortunately puts out some odd canonical tags depending on how a page is rendered, and I am trying to catch them quickly before they get indexed and cause problems. Eventually I would also like to be able to scan for other specific tags, hence the customizability concern. If we have to write a VB script to get it into Excel, I suppose we can do that. Cheers, Josh
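In case it helps, a short script can do this kind of scan without a paid tool. Here is a minimal sketch in Python using only the standard library; the URL list is a placeholder, and a real run over 10,000+ links would want concurrency, timeouts, and error handling:

```python
import urllib.request
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Collects the href of every <link rel="canonical"> on a page."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and (a.get("rel") or "").lower() == "canonical":
            self.canonicals.append(a.get("href"))

urls = ["http://example.com/page1", "http://example.com/page2"]  # placeholder list

for url in urls:
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    parser = CanonicalParser()
    parser.feed(html)
    # Zero canonicals, several, or one pointing somewhere unexpected all deserve a look
    print(url, parser.canonicals)
```

The print could just as easily be a csv.writer dump, which gets the results into Excel without the VB script, and the handle_starttag check is the customizable part: swap in any tag/attribute combination to scan for something else.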
Moz Pro | | prima-2535090 -
How is link juice split between navigation?
Hey all, I am trying to understand link juice as it relates to duplicate navigation.

Take, for example, a site whose main navigation is contained in dropdowns with 50 links (fully crawlable and indexable). In the footer of the page that navigation is repeated, so you have a total of 100 links with the same anchor text and URLs. For simplicity's sake: will the link juice be divided among those 100 links and passed to the corresponding pages, or does the "first link rule" still apply, so that only half of the link juice is passed?

What I am getting at is this: if there were only one navigation menu and the page were passing 50 link juice units, then each of the subpages would get passed 1 link juice unit, right? But if the menu is duplicated, then the available link juice is divided by 100, so only 0.5 units are being passed through each link. However, because there are two links pointing to the same page, is there a net of 1 unit?

We have several sites that do this for UX reasons, but I am trying to figure out how badly this could be hurting us in page sculpting and passing juice to our subpages. Thanks for your help! Cheers.
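To make the arithmetic concrete, here is the question expressed as a toy calculation. Both branches are the assumptions being asked about, not confirmed Google behavior:

```python
# Toy model of the two scenarios described above; neither is confirmed Google behavior.
page_juice = 50.0    # hypothetical "link juice units" the page can pass
unique_targets = 50  # distinct pages linked from the navigation
total_links = 100    # header nav + footer nav: the same 50 targets, linked twice

baseline = page_juice / unique_targets   # 1.0 unit per page with a single menu

# Scenario A: juice splits evenly across all 100 links and both copies count.
net_a = (page_juice / total_links) * 2   # 0.5 per link, 1.0 net per target

# Scenario B: "first link rule" - the footer duplicate passes nothing,
# but the split is still over all 100 links.
net_b = page_juice / total_links         # 0.5 net per target, half the baseline

print(baseline, net_a, net_b)
```

Scenario A leaves each subpage exactly where a single menu would, while Scenario B halves what each subpage receives, which is the gap the question is trying to pin down.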
Web Design | | prima-2535090