Why won't the Moz plugin's "Analyze Page" tool read data on a BigCommerce site?
-
We love our new BigCommerce site; just curious as to what the hang-up is.
-
I know several developers, but the main concern is the platform, BigCommerce. I am not offering feedback on the platform itself, but the first decision you need to make is whether you are committed to sticking with BigCommerce.
If you wish to keep the site built on BigCommerce, my recommendation would be to seek out a developer who specifically has experience working with that platform. There are tons of developers and companies who are all too willing to accept any web development work. You want a specialist who can say "I have built dozens of BigCommerce sites; that's mainly what I do."
-
Thanks Ryan. As I'm not a developer, I wouldn't have known how to troubleshoot this. I had suspicions that things were not all good, as I noticed some slow page load speeds.
So basically, my client's developer hacked up the code very nicely.
Know any developers interested in getting involved with this project? Seems like I'll need to advise my client to fire yet another developer.
Best, Stephen
-
The Analyze Page function works fine on BigCommerce sites. I checked a couple of other sites and it worked perfectly. For example, http://tricejewelers.com/ is a BigCommerce site.
The difference I see on the particular site you shared is it has the largest number of coding errors I have ever seen on a web page. http://validator.w3.org/check?uri=http%3A%2F%2Fwww.asseenontvfrenzies.com%2Fyonanas%2F&charset=%28detect+automatically%29&doctype=Inline&group=0&user-agent=W3C_Validator%2F1.2
When I try to use Analyze Page in Firefox, it hangs. When I use Chrome, I see results, but they are for the social plugins, not the page itself. I suspect the root issue is the coding errors. For a more definitive answer you can open a ticket with the help desk: help@seomoz.org.
Good luck.
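For a quick local smoke test before running the full W3C validator, a rough tag-balance check can flag grossly broken markup. A minimal Python sketch (this only catches mismatched open/close tags; it is nowhere near the validator's full rule set):

```python
from html.parser import HTMLParser

# Tags that never take a closing tag and so are excluded from balancing.
VOID_TAGS = {"br", "img", "hr", "meta", "link", "input", "area",
             "base", "col", "embed", "source", "track", "wbr"}

class TagBalanceChecker(HTMLParser):
    """Collects mismatched or unclosed tags while parsing."""

    def __init__(self):
        super().__init__()
        self.stack = []      # currently open tags
        self.problems = []   # human-readable findings

    def handle_starttag(self, tag, attrs):
        if tag not in VOID_TAGS:
            self.stack.append(tag)

    def handle_startendtag(self, tag, attrs):
        pass  # self-closing tags (<br/>) are balanced by definition

    def handle_endtag(self, tag):
        if tag in VOID_TAGS:
            return
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.problems.append(f"unexpected </{tag}>")

def check_markup(html):
    """Return a list of tag-balance problems (empty list means none found)."""
    checker = TagBalanceChecker()
    checker.feed(html)
    checker.close()
    # Anything still open at end-of-input is also a problem.
    checker.problems.extend(f"unclosed <{t}>" for t in checker.stack)
    return checker.problems
```

A clean page returns an empty list; a page with crossed or unclosed tags returns one finding per problem. For the definitive error list, the W3C validator linked above is still the tool to use.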
-
Can you share the link to the page?
Analyze Page does not work if a page is not fully loaded. I have experienced issues in that regard, but then I refresh the page and it works fine.
Related Questions
-
Redirects and sitemap aren't showing
We had a malware hack and spent 3 days trying to get Bluehost to fix things. Since they made changes, 2 things are happening: 1. Our .xml sitemap cannot be created (https://www.caffeinemarketing.co.uk/sitmap.xml); we have tried external tools. 2. We had 301 redirects from the http (www and non-www versions) and the https (non-www version) throughout the whole website to https://www.caffeinemarketing.co.uk/ and subsequent pages. Whilst the redirects seem to be happening, when you go into tools such as https://httpstatus.io, every version of every page is a 200 code only, whereas before there were 301 redirects showing. Have Bluehost messed things up? Hope you can help, thanks
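The 200-vs-301 check described here can be encoded as a small script. A sketch in Python, with a hypothetical canonical host (substitute your own domain); the actual status and Location header for each variant would come from a HEAD request made without following redirects (e.g. curl -I), which this sketch deliberately leaves out:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical canonical scheme/host for illustration only.
CANONICAL_SCHEME = "https"
CANONICAL_HOST = "www.example.co.uk"

def expected_target(url):
    """The canonical URL that every other variant should 301 to."""
    parts = urlsplit(url)
    return urlunsplit((CANONICAL_SCHEME, CANONICAL_HOST,
                       parts.path, parts.query, ""))

def redirect_ok(url, status, location):
    """True if this variant is either the canonical URL (200) or a 301 to it.

    `status` and `location` come from fetching `url` WITHOUT following
    redirects; this function only encodes the pass/fail rule.
    """
    parts = urlsplit(url)
    if parts.scheme == CANONICAL_SCHEME and parts.netloc == CANONICAL_HOST:
        return status == 200
    return status == 301 and location == expected_target(url)
```

Run against each http/https, www/non-www variant of a page: the canonical version should be the only 200, and everything else a single 301 straight to it. All variants answering 200 (as httpstatus.io is reporting here) means the redirects are genuinely gone, not just hidden by the tool.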
Technical SEO | Caffeine_Marketing
-
If I'm using a compressed sitemap (sitemap.xml.gz), that's the URL that gets submitted to webmaster tools, correct?
I just want to verify that if a compressed sitemap file is being used, then the URL that gets submitted to Google, Bing, etc., and the URL that's used in robots.txt, indicate that it's a compressed file, for example "sitemap.xml.gz". Thanks!
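For reference, that is how it works: search engines accept gzipped sitemaps, and both the webmaster-tools submission and the robots.txt Sitemap directive should point at the .gz file itself. A hedged example robots.txt line (example.com is a placeholder for your own domain):

```
Sitemap: https://www.example.com/sitemap.xml.gz
```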
Technical SEO | jgresalfi
-
Can't manage to get our site back in the rankings
My URL is: http://tinyurl.com/nslu78 Hi, I really hope someone can help, because my site seems to have been penalized since last year. We are not SEO experts but doctors, and because we wanted to do things in a white-hat way, we handed our SEO strategy (on-site and off-site) to the best US SEO agencies, and now we are penalized. We were ranking on the 1st page for 15 keywords, and now we aren't even in the first 10 pages. I know that our sector is suspicious, but we are a real laboratory and our site is 100% transparent. I understand that a lot of people may think we are all the same, but this is not true: we are not a virtual company that doesn't even show its name or address. We show name, address, phone number, fax, email, chat service, VAT number, everything, so please help us. We have spent 3 months analysing every paragraph of the Google guidelines to see if we were violating some rule such as hidden text, link schemes, redirections, keyword stuffing, malware, duplicate content, etc., and found nothing except little things, but maybe we are not good enough to find the problem. In 3 months we have gone from 85 toxic links to 24, and from 750 suspicious links to 300. We have emailed and called all the webmasters of each site several times to try to get as many links as possible deleted. We have sent Google a big spreadsheet with all our results and attempts to delete those bad links. We have then sent a reconsideration request explaining all the things that we have verified on-site and off-site, but it seems that it didn't work, because we are still penalized. I really hope someone can see where the problem is.
Thank you
Technical SEO | andromedical
-
Webmaster Tools "Links to your site" history over time?
Is there a way to see a history of the "links to your site" number? I've seen a lot of posts here from people saying "I just saw a big drop in my numbers." I don't look at this number enough to be that familiar with it. Is there a way to see if Google has suddenly chopped our numbers? I've poked around a little, but not found a method yet. Thanks, Reeves
Technical SEO | wreevesc
-
Google doesn't rank the best page of our content for keywords. How do we fix that?
Hello, We have a strange issue, which I think is due to legacy. Generally, we are a job board for students in France: http://jobetudiant.net (jobetudiant == studentjob in French). We rank quite well (2nd or 3rd) on "Job etudiant <city>", with the right page (the one that lists all job offers in that city), so this is great. Now, for some reason, Google systematically puts another of our pages in front of that one: the page that lists the job offers in the 'region' of that city. For example, check this page: the first link is a competitor, the 3rd is the "right" link (the job offers in Annecy), but the 2nd link is the list of jobs in Haute-Savoie (the 'département', equivalent to a county) in which Annecy is located. That's annoying. Is there a way to indicate to Google that the 3rd page makes more sense for this search? Thanks
Technical SEO | jgenesto
-
Same URL in "Duplicate Content" and "Blocked by robots.txt"?
How can the same URL show up in the SEOmoz Crawl Diagnostics "Most common errors and warnings" in both the "Duplicate Content" list and the "Blocked by robots.txt" list? Shouldn't the latter exclude it from the first?
Technical SEO | alsvik
-
Page rank 2 for home page, 3 for service pages
Hey guys, I have noticed with one of our new sites that the home page is showing PageRank 2, whereas 2 of the internal service pages are showing as 3. I have checked with both Open Site Explorer and Yahoo backlinks, and there are by far more links to the home page, all quality and relevant directory submissions and blog comments. The site is only 4 months old; I wonder if anyone can shed any light on why 2 of the lesser-linked pages are showing higher PR? Thanks 🙂
Technical SEO | Nextman
-
We have been hit with the "Doorway Page" penalty, fixed the issue, and got a message that we still do not meet the guidelines.
I have read the FAQs and checked for similar issues: YES / NO
My site's URL (web address) is: www.recoveryconnection.org
Description (including timeline of any changes made): We were hit with the Doorway Pages penalty on 5/26/11. We have a team of copywriters and a fast-working dev department, so we were able to correct what we thought the problem was: "targeting one keyword per page" and thin content (according to Google). Plan of action: consolidate "like" keywords/content onto the pages that were getting the most traffic, and 404 the pages with thin content that were targeting a single keyword each. We submitted a board-approved reconsideration request on 6/8/11 and received the second message (below) on 6/16/11. NOTE: The site was originally designed by the old marketing team, who were let go, and we are the new team trying to clean up their mess. We are now resorting to going through Google's general guidelines page. Help would be appreciated. Below is the message we received back.
"Dear site owner or webmaster of http://www.recoveryconnection.org/, We received a request from a site owner to reconsider http://www.recoveryconnection.org/ for compliance with Google's Webmaster Guidelines. We've reviewed your site and we believe that some or all of your pages still violate our quality guidelines. In order to preserve the quality of our search engine, pages from http://www.recoveryconnection.org/ may not appear or may not rank as highly in Google's search results, or may otherwise be considered to be less trustworthy than sites which follow the quality guidelines. If you wish to be reconsidered again, please correct or remove all pages that are outside our quality guidelines. When such changes have been made, please visit https://www.google.com/webmasters/tools/reconsideration?hl=en and resubmit your site for reconsideration. If you have additional questions about how to resolve this issue, please see our Webmaster Help Forum for support. Sincerely, Google Search Quality Team"
Any help is welcome. Thanks
Technical SEO | LVH