Amazing results even after 1 week!
-
Have to say, I'm pretty impressed with Moz. This is my first full week of membership and wow, have I seen some great increases in my site stats! Hopefully this isn't just a blip and it will continue for weeks and months to come.
- Authority has jumped from 27 to 34
- Google page one results jumped from 7 to 13
- Traffic increased by 12%
- Solved duplicate content issues
- Started a proactive social media campaign
I could go on and on, but I can't say enough positive things about the services provided here. An investment well worth making, and one that's already paying for itself.
The goal for the next few weeks is to improve domain authority from 34 to 40+. I've been using long-tail phrases for my articles, which has been tremendously beneficial.
One query: even though the domain authority has moved from 27 to 34, I don't appear to have gained any extra backlinks. Perhaps I'm misunderstanding this metric?
The other query is that there are hundreds of backlinks pointing to my domain (I provide an open source CMS, so I know the links are there), but none of these links appear to be counting towards my authority. Is there a way I can submit these pages to the index on their behalf?
Cheers, Lee
-
Hi Marijn,
Thanks for responding. I guess I got the wrong impression of how Moz works; I thought the weekly updates were instantaneous and provided an "as it is now" perspective of authority, so that's handy to know. Thanks for sharing.
I have a query about updates (in relation to the Google index) that you might be able to help me with, only if you have the time of course.
I provide software that displays a footer link, I know for sure which sites are using my software and that a little under half retain the footer link.
A best-case guess of the number of links pointing to my main domain is in the order of 400k (maybe even double that), because it varies per site depending on the number of pages. One site has 10k pages (probably the largest), but the others range from 100 to a couple of thousand pages, all of which contain a link back.
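As an aside, one way to spot-check which of the known sites actually retain the footer link is a script along these lines. This is a minimal sketch: the domain, site names, and HTML are all made up, and in practice you would fetch each site's pages over HTTP rather than use inline samples.

```python
TARGET = "example-cms.org"  # hypothetical stand-in for my CMS domain

def contains_backlink(html: str, target: str) -> bool:
    """Rough check: does the page HTML contain an href to the target domain?"""
    html = html.lower()
    return (f'href="http://{target}' in html
            or f'href="https://{target}' in html)

# Stand-ins for pages fetched from sites known to run the CMS.
fetched_pages = {
    "site-a.example": '<footer><a href="https://example-cms.org">Powered by ExampleCMS</a></footer>',
    "site-b.example": "<footer>Footer link removed by the site owner</footer>",
}

retained = [site for site, html in fetched_pages.items()
            if contains_backlink(html, TARGET)]
print(f"{len(retained)}/{len(fetched_pages)} sampled sites retain the footer link")
```

Running this over a sample of known installs would give a retention rate to multiply against the total install base, rather than guessing at the 400k figure.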
The problem is that my domain authority doesn't seem to reflect this number of links. One would think even new sites would pass a little link juice, or is it possible that Google is discounting most of them, or some selection of them?
I was thinking of compiling a list of the sites that use my software, with links, so that Google will crawl them, but I've read that this can affect authority negatively?
FYI, Moz says there are only 35k links, which is less than a tenth (at best) of what's out there.
Best wishes, Lee
-
I doubt your metrics really jumped within a week because you're using Moz. Let's still give them some credit, but Moz updated its metrics last week, so the work that raised your authority was probably already done 1.5 months ago.
Besides that, there is unfortunately no way to tell Moz about links you've got that aren't in the list at the moment. You have to trust Moz and hope these links make it into their index on the next update.
Hope this helps!