Technical Site Questions
-
When I view a Google cache of our site, I see two menus. Our developers say that's because the second one is the mobile menu. Is that correct? When I look at other sites that have mobile rendering, they only have one menu visible. Also, Google Webmaster Tools reports at least twice the number of internal links per page that there should be. Are the two issues connected?
Secondly, when I run a spider test through http://tools.seobook.com/general/spider-test/ it shows all the "behind the scenes" text, e.g. font names, portals, sliders, margins. "Font size px" is shown 17 times with a density of 2.15%. Surely this isn't correct, as Google will think these are my keywords!?
My site is www.over50choices.co.uk
Thanks
Ash
-
My mistake. I must have been up too late and started seeing double. I can't reconcile what I saw then with what I see now.
-
Hi Travis, I think there is already a robots.txt rule in place to block crawlers from the login pages in my sitemap. Is that what you mean by the login issue?
With regards to the source code, I know all sites have it, but I don't know why my site shows it in the SEOBook spider test when it doesn't show the code for any other site.
What duplicate pages are you referring to?
Thanks
Ash
-
Hi Chris, I don't know if it's impacting my rankings, but that's what I'm trying to establish, i.e. my rankings do move up and down and most are on page 2. If these technical issues were resolved, would it improve how Google reads my site?
-
You should probably have them noindex nofollow the login link. My crawler picked up tons of them.
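(To sketch what that would look like in practice: two pieces are typically involved, a nofollow on the link itself and a robots meta tag on the login page so it doesn't get indexed. The URL and markup below are hypothetical, not taken from the actual site.)

```html
<!-- On pages that link to the login (hypothetical markup): -->
<a href="/login.aspx" rel="nofollow">Login</a>

<!-- In the <head> of the login page itself: -->
<meta name="robots" content="noindex, nofollow" />
```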
I just see a white background when I check cache on google.co.uk.
Keyword density is a myth, don't worry about it.
The "behind the scenes" text is the source code. It could likely be done better, but there's no getting around having source code. (I won't touch a .NET site, so I'm not trying to sell you anything.)
Also, there appear to be some duplicate content concerns (e.g. Page.aspx vs. page.aspx).
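(Since IIS serves paths case-insensitively, Page.aspx and page.aspx can both resolve and get crawled as separate URLs. A quick, hypothetical way to spot case-variant duplicates in a list of crawled URLs:)

```python
from urllib.parse import urlsplit, urlunsplit

def case_key(url):
    """Normalize a URL so case-variant duplicates collapse to one key.

    Scheme and host are case-insensitive by definition; the path is
    lowercased here because IIS serves it case-insensitively, which is
    exactly what creates Page.aspx / page.aspx duplicates.
    """
    s = urlsplit(url)
    return urlunsplit((s.scheme.lower(), s.netloc.lower(),
                       s.path.lower(), s.query, s.fragment))

def case_duplicates(urls):
    """Group crawled URLs that differ only by letter case."""
    groups = {}
    for u in urls:
        groups.setdefault(case_key(u), []).append(u)
    return [group for group in groups.values() if len(group) > 1]
```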
There's definitely more to this, but I should probably get a couple hours of sleep before the day job.
-
Hi Ash,
I think you have reason to doubt your code, but not being a programmer, I can only suggest you have someone else give you a second opinion. It is odd that your pages do not render in Google's cached version unless you view it as text only. Also, if you use the SEOBook tool, the HTML coding isn't included in the report for other domains the way it is for yours. As for the dual nav menu, that's another programming issue that seems unusual. These issues may stem from the implementation of your mobile site, but again, I'm not an authority on that. Do you see any ranking symptoms you think may be attributable to this?
Related Questions
-
Two sites with same content
Hi Everyone, I am running two listing websites. Websites A and B are marketplaces. Website A: approx 12k listing pages. Website B: approx 2k pages from one specific brand. The entire 2k listings on website B also exist on website A with the same URL structure, just a different domain name. The header and footer change a little, but the body is the same code. The listings on website B are all partners of a specific insurance company, and this insurance company pays me to maintain their website. They also look at the organic traffic going into this website, so I cannot robots-block or noindex it. How can I be as transparent as possible with Google? My idea was to apply a canonical on website B (the insurance partner website) pointing to the corresponding listing on website A, which would show that the best version of the product page is on website A. So for example: www.websiteb.com/productxxx would have a canonical pointing to www.websitea.com/productxxx, and www.websiteb.com/productyyy would have a canonical pointing to www.websitea.com/productyyy. Any thoughts? Cheers
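(A cross-domain canonical of the kind described would go in the head of each website B listing page; a minimal sketch using the placeholder domains from the question:)

```html
<!-- In the <head> of www.websiteb.com/productxxx (placeholder URLs from the question): -->
<link rel="canonical" href="http://www.websitea.com/productxxx" />
```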
Intermediate & Advanced SEO | | Evoe0 -
Site Structure - Is it ok to Keep current flat architecture of existing site pages and use silo structure on two new categories only?
Hi there, I have a flat site structure like this, and it ranks quite well for its niche: site.com/red-apples.html, site.com/blue-apples.html. The site is branching out into new but related lines of business. Is it ok to keep the existing site architecture as above while using a silo structure just for the two new, different but related businesses? E.g. site.com/meat/red-meat.html, site.com/fish/ocean-trout.html. Thanks for any advice!
Intermediate & Advanced SEO | | servetea0 -
Questions About Link Detox
Greetings: In April of 2014 an SEO firm ran a link removal campaign (identified spammy links and uploaded a disavow). The overall campaign was ineffective: Moz domain rank has fallen from about 30 to 24 in the last year and traffic is 20% lower. I purchased a basic package for Link Detox and ran a report today (see enclosed) to see if toxic links could be contributing to our mediocre rankings. As a novice I have a few questions for you regarding the use of Link Detox:
- We scored a domain-wide detox risk of 1,723. The site has referring root domains with 7,113 links to our site. 121 links were classified as high audit priority, 56 as medium audit priority. 221 links were previously disavowed, and we uploaded a spreadsheet containing the names of the previously disavowed links. We had Link Detox include an analysis of no-follow links, as they recommend this. Is our score really bad? If we remove the questionable links, should we see some benefit in ranking?
- Some of the links we disavowed last year are still linking to our site. Is it worthwhile to include those links again in our new disavow file?
- Prior to filing a disavow, we will request that webmasters remove the offending links. Link Detox offers a package called Superhero for $469.00 that automates the process. Does this package effectively help with the entire process of writing and tracking the removal requests? Do you know of any other good alternatives?
- A feature called "Boost" is included in the Link Detox Superhero package. It is supposed to expedite Google's processing of the disavow file. I was told by the staff at Link Detox that with Boost, Google will process the disavow within a week. Do you have any idea if this claim is valid? It would be great if it were true.
- We never experienced any manual penalty from Google. Will uploading a disavow help us under the circumstances?
Thanks for your feedback, I really appreciate it! Alan
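(For reference, the disavow file itself is just a plain text file uploaded through Google's disavow tool, with one entry per line; a minimal sketch with made-up example domains:)

```text
# Spammy links identified in the April 2014 audit (hypothetical examples)
# Disavow every link from an entire domain:
domain:spammy-directory.example
# Or disavow a single page:
http://link-farm.example/some-page.html
```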
Intermediate & Advanced SEO | | Kingalan10 -
I have 2 Questions
What if we do the interlinking on the exact keywords? Does this come under spam techniques? For example, on http://blog.payscout.com/automotive-merchant-services/ I interlink the exact keyword. Also, can we use the same image 2-3 times on the same website with different alt tags? For example, http://packforcity.com/what-to-wear-in-new-orleans-in-january/ and http://packforcity.com/what-to-wear-in-san-francisco-in-october/ use the same image with different alt tags.
Intermediate & Advanced SEO | | AlexanderWhite0 -
Reindexing a site with www.
We have a site that has a mirror, i.e. www.domain.com and domain.com. There is no redirect; both URLs work and show pages, so basically the site has 2 sets of URLs for each page. We have changed it so that domain.com and all associated pages 301 redirect to the right URL with www, i.e. domain.com/about 301s to www.domain.com/about. In the search engines, domain.com is the version indexed, and the only www page indexed is the homepage. I checked the robots.txt file and nothing is blocking the search engines from indexing both the www and non-www versions of the site, which makes me wonder why only one version got indexed and how the client avoided a duplicate content issue. Secondly, is it best to get the search engines to unindex domain.com and resubmit www.domain.com for the full site? We are definitely staying with www.domain.com, NOT domain.com, so we need to find the best way to get the site indexed with www and remove the non-www version. Hope that makes sense, and I look forward to everyone's input.
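(For what it's worth, on an Apache server the non-www to www 301 is commonly done in .htaccess with mod_rewrite; a sketch, assuming Apache with mod_rewrite enabled, which is an assumption since the question doesn't say what the site runs on, and using domain.com as the placeholder from the question:)

```apache
# Permanently (301) redirect every non-www URL to its www equivalent
RewriteEngine On
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
```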
Intermediate & Advanced SEO | | JohnW-UK0 -
Content question about 3 sites targeted at 3 different countries
I am new here, and this is my first question. I was hoping to get help with the following scenario: I am looking to launch 3 sites in 3 different countries, using 3 different domains. For example, the .com for the USA, the .co.uk for the UK, and a slightly different .com for Australia, as I could not purchase the .com.au because I am not a registered business in Australia. I am looking to set the geographic target in Google Webmaster Tools. So for example, I will set the .com to the USA only; with the .co.uk I won't need to set anything; and I will set the other Australian .com to Australia. Now, initially the 3 sites will be "brochure" websites explaining the service that we offer. I fear that at the beginning they will most likely have almost identical content. However, in the long term I am looking to publish unique content for each site, almost on a weekly basis, so over time they would have different content from each other. These are small sites to begin with, so each site in its "brochure" form will have around 10 pages; over time it will have hundreds of pages. My question, or my worry, is: will Google look negatively at the fact that I have the same content across 3 sites, even though they are specifically targeted to different countries? Will it penalise my sites?
Intermediate & Advanced SEO | | ryanetc0 -
Is this ok for content on our site?
We run a printing company, and as an example the grey box (at the bottom of the page) is what we have on each page: http://www.discountbannerprinting.co.uk/banners/vinyl-pvc-banners.html We used to use this but tried to get most of the content onto the page itself, and we now want to add a bit more in-depth information to each page. The question I have is: would a 1200-word document be ok in there and not look bad to Google?
Intermediate & Advanced SEO | | BobAnderson0 -
How come this site does so well?
Hi Guys, it's bugging the crap out of me why this site does so well: http://www.stagedinburgh.com/ When I look at its link profile it's so weak and terrible, plus many of the links come from sites they own. Somehow the site outranks many others for search terms like edinburgh stag party, edinburgh stag do, edinburgh stag weekends. They seem to only have links from 13 domains, and they aren't great. What am I missing?
Intermediate & Advanced SEO | | PottyScotty0