Hi Brian,
Just sent you a copy of the guidelines I downloaded yesterday from Search Engine Land.
Yeah, it shouldn't be like this, but it's probably due to the plugin you're using. The best move would be to kick off the new year in style and start using the WordPress SEO plugin by Yoast. It will automatically take care of updating your sitemaps whenever you post new content. On top of that, you can also filter out individual posts if needed.
Hi Phillipp,
You almost got me with this one, but it's fairly simple. In your question you're pointing at the robots.txt of your HTTP pages. But it's mostly your HTTPS pages that are indexed, and if you look at that robots.txt file it's pretty clear why these pages are indexed: https://www1.swisscom.ch/robots.txt. All the pages that are indexed match one of your Allow statements, which take precedence over the blanket Disallow. Hopefully that gives you enough insight to fix your issue.
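To illustrate how that precedence works (the paths below are made up, not taken from your actual file), a robots.txt like this blocks everything except the explicitly allowed sections, because the longer, more specific Allow rules win over the blanket Disallow:

User-agent: *
# These specific Allow rules take precedence over the Disallow below
Allow: /en/products/
Allow: /de/products/
Disallow: /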
Hi Raymond,
I seriously hope they meant this in a different context than just saying PHP is bad for SEO, because it's absolutely not true. Almost 20% of the web runs on WordPress, which is nothing more than PHP, and besides that, millions of other sites use PHP as well. So I wouldn't worry about this advice, and if they literally said this, look for another SEO company, as they're probably not worth the risk.
Hey!
Yeah, you forgot to close some objects and arrays. This at least validates, but please make sure your objects are correct:
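(The corrected snippet from the original reply isn't preserved here; below is a generic sketch with made-up values showing what balanced objects and arrays look like: every { has a matching } and every [ a matching ].)

{
  "store": "Example Shop",
  "products": [
    { "name": "First product", "price": 9.99 },
    { "name": "Second product", "price": 4.99 }
  ]
}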
Using your keyword twice is definitely not what I would call keyword stuffing. But I would make sure you create quality content for this post, as the EMD (exact match domain) could get you in trouble someday if you don't.
Hi Carl,
Ouch, this is probably not a use case you ever wanted to fix for your clients. I would suggest filing a DMCA request based on the copyright of the texts/images used on your client's site. This will, hopefully, at least remove the copied site from Google's index.
Filing such a request can be done here: http://support.google.com/bin/static.py?hl=en&ts=1114905&page=ts.cs
Good luck!
Hi Eric,
The user agent of SEOmoz is rogerbot.
Probably. It depends on the quality of those links, but if you already have to ask the question, the answer in most cases is yes.
I would check the service providers first, just to know for sure whether they're all coming from the same provider. You can check this in the Audience > Technology > Network report in the left-hand menu of Google Analytics. If you see the same network and browsers being used, I would use a filter (only if you're really determined/100% sure that it's bot traffic) to get them completely out of your Google Analytics view.
Hi,
What you should realize is that a drop of 1 point in Domain Authority isn't the end of the world; you were probably already at the low end of the score you had. What likely happened is that you gained some additional links (or maybe lost a couple) that have a lower domain authority themselves, so the average DA feeding into yours decreased a bit. As this number is just one way of telling you what's going on, I wouldn't recommend panicking over it; the data that Google has is more robust and isn't based on a single number like DA. I've seen many cases where DA dropped by 5-10+ points and traffic/business metrics weren't affected at all.
Martijn.
For anyone coming into this thread later: I got in touch with the site owner, and it turned out that the GA tracking code was missing on the thank-you page.
Hi Dan,
Yes there is, it's called campaign tracking and it's a feature of Google Analytics. You add a couple of extra parameters to your URL to identify the specific source, medium, and campaign. There is a great URL Builder that can easily generate the URLs for you, which can be found here.
For example, your new URLs would look something like this:
http://www.example.com/bedrift/netthandel/?utm_source=sitename&utm_medium=referral&utm_campaign=banner1
Happy analyzing!
No, it won't help you at all, as it's not a valid extension that they will use. What you can do is link to the HTML sitemap from multiple pages on your site, so you provide an efficient way for Google to access it and use it to crawl the other pages on your site.
Imagine you're in the Facebook application and you open an article you see in the news feed: that's the Safari browser embedded within the application. The likelihood of the user bouncing is very high, as you usually only engage with the link you clicked.
Hi Jason,
I wouldn't worry about changing this at all. In the end, the 50K limit that has been put on sitemaps is an arbitrary one, so if you keep your sitemaps well under that it doesn't really change anything. The files themselves are not a ranking factor; they're used to make search engines aware of URLs they might not otherwise discover on the site, and to notify them of URLs that have been updated (through the lastmod element). So changing it to 15K shouldn't harm you.
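For reference, a sitemap entry is as simple as this (placeholder URL and date); the lastmod element is what signals that a URL has been updated:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/some-page/</loc>
    <lastmod>2016-01-15</lastmod>
  </url>
</urlset>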
Martijn.
What? This is the first time I've ever heard this; it sounds like a total scam to me.
Hi Jorge,
If you could send me your email address by PM, I'll make sure you receive the handbook. Since I've received more questions about this, that would be the best option.
@sammecooper If you're trying to rank for 'training and engagement', then you likely want to include that in the title tag instead of making it too descriptive. Google still puts quite some value on it. In this case, I'd try to find a way to make it both descriptive and enticing to click.
It's not a complete way of preserving the shares, pins & likes for all of your content, but in this post Mike King explains how to still get the numbers for your old pages: http://searchenginewatch.com/article/2172926/How-to-Maintain-Social-Shares-After-a-Site-Migration
Hi Neil,
Wow, this was a pretty hard question, but I've (hopefully) got a useful answer for you:
The metric Organic Searches, which you found while building a custom report in Google Analytics, is defined as: "The number of organic searches that happened within a session. This metric is search engine agnostic.".
This differs a bit from the way non-paid search traffic is measured: "Organic campaigns can come from an unpaid search engine results link, a referral from another website (such as a blog) and direct traffic.".
I'm not quite sure, but in my opinion the difference lies in the way sessions are defined for both types, and possibly also in the way 'unpaid' is defined for both metrics.
Hopefully this helps you further!
Hi Atul,
Data from Google Webmaster Tools can be found in the new version of Google Analytics. Follow this menu structure to get to the data: Traffic Sources > Search Engine Optimization > Queries. The data there is provided by Google Webmaster Tools.
It could take a couple of hours before your data shows up.
Happy analyzing!
No, unfortunately this data is not available via the API at all.
What you could do is check how the message was sent to the user and over what connection. It could say, for example, that it was sent using the Chrome browser or another application that supports this.
Great answer by Tom already, but I want to add that images and other types of content, which are mostly not included in sitemaps by default, could also be among the indexed 'pages'.
Hi David,
Welcome to the grey area of SEO. According to Google's rules, you're not allowed to mark up your reviews with the proper schemas when you don't show the complete reviews about your company or service. So as long as you don't show them, you can tell Google whatever kind of reviews you have, since nobody is going to verify that it's really the case. So indeed, Google is just accepting the reviews at face value. They've been approving them automatically for about a year and a half now, and if you ask me it hasn't increased the quality of the results.
In this case they marked up the text with the rich snippet data for reviews; more info can be found here. There's not much you can do about this. You could try to file a spam report, but that's really a long shot.
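For context, markup along these lines is what produces those review stars. This is a generic schema.org sketch with made-up names and numbers, not the markup from the actual site:

<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Product</span>
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.8</span> out of
    <span itemprop="bestRating">5</span> based on
    <span itemprop="ratingCount">132</span> reviews.
  </div>
</div>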
Hope this helps!
Hi Iris,
The 'issue' for rel canonicals is just a notice that Moz found a rel canonical on your site, so it's not saying there's an actual problem. But if you want to be sure, post the URL and we'll take a look.
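For context, the tag Moz is flagging looks like this (placeholder URL); it simply tells search engines which version of a page is the preferred one to index:

<link rel="canonical" href="http://www.example.com/preferred-page/" />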
No, currently that's not possible. Although I would also like to be able to do that ;-).
Hi,
The answer is that it really depends. SquareSpace is not something I would recommend in itself for a 'bigger' business, as it doesn't provide the flexibility to do the custom things that a 'normal' website will. But in this case I'm doubting that matters. It's mostly going to depend on the goals they have and the competition in their domain. If this is an industry with a lot of competition, you'll have to do a ton of work to get a SquareSpace site ranked better, as you can't work on most technical SEO issues and must rely on having good enough content on the pages. That is possible with a SquareSpace site and shouldn't require a redo of their site.
However, if the space is crowded and they'll have to do a ton of work to make this business rank in their industry/niche, then it can sometimes be wise to just rebuild the site, as it could make things easier: easier to update, easier to add new content, and easier to handle certain implementations. If that's not the case, I wouldn't put my money on it. In the end you can probably make a relatively easy business case: what's the expected impact, and what additional costs would a new site bring?
Martijn.
Hi Sander,
I'd say no, it's not really bad. It will probably have a small impact on the load time of your site, but besides that it enables you to do most of these things via a tag manager, which gives you more flexibility.
Hi Byron,
There are a couple of tools I know of that could help you out.
Hi,
It can be found in Google Webmaster Tools and can be used to fetch a page to see what Google sees when it visits your page.
I hope you don't use that for e-commerce tracking, as it doesn't contain the e-commerce tracking code. To enable e-commerce tracking you need a couple of JS functions. All the documentation you'll need on this topic can be found here: https://developers.google.com/analytics/devguides/collection/analyticsjs/ecommerce
ga('require', 'ecommerce'); comes first and loads the GA e-commerce library.
ga('ecommerce:addTransaction', {
'id': '1234', // Transaction ID. Required.
'affiliation': 'Acme Clothing', // Affiliation or store name.
'revenue': '11.99', // Grand Total.
'shipping': '5', // Shipping.
'tax': '1.29' // Tax.
});
This is where you fill in the overall details of the transaction, so it's a summary. In the next step you add the needed data for the individual items within the transaction.
ga('ecommerce:addItem', {
'id': '1234', // Transaction ID. Required.
'name': 'Fluffy Pink Bunnies', // Product name. Required.
'sku': 'DD23444', // SKU/code.
'category': 'Party Toys', // Category or variation.
'price': '11.99', // Unit price.
'quantity': '1' // Quantity.
});
The last command, ga('ecommerce:send');, sends the transaction to GA.
Yes, I would definitely redirect the images to the new site as well, so that any link juice they have is preserved once they're redirected.
Hi Jordan,
I would only stay away from tags related to frames and iframes, as they're not good for SEO. Besides that, you'll be OK with every kind of tag, including the span tag. It's ridiculous that somebody would argue that using these tags is bad for SEO, as they provide a lot of opportunity to style elements within an element.
Hi Ignitas,
To get some insight into which URLs are linking to a 404 page, the best approach would be to export your Crawl Diagnostics to CSV. These files contain the URLs linking to your pages.
Hope this helps a bit! But I would still suggest to the Moz team to include this feature within the PRO tools.
Hi,
It really depends. So far they're really focusing on bringing article- and news-type posts to AMP, and currently it's the same with Facebook Instant Articles. All the development around both seems to be focused on this for at least the next 3-6 months, I'd say. I'm assuming at some point they'll switch to supporting more, though.
Hi Alex, although I'm starting to doubt myself, I'd say the issue is in the values of your hreflang tags. The values use ISO codes, and I'm pretty sure they're separated with a dash instead of an underscore.
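For example (placeholder URLs), the language-region values should look like en-gb rather than en_GB:

<link rel="alternate" hreflang="en-gb" href="http://www.example.com/en-gb/" />
<link rel="alternate" hreflang="de-ch" href="http://www.example.com/de-ch/" />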
Hi,
The answer is that neither is accurate. The best answer on this one was given by Ryan Kent in a previous thread about this issue: "If you want the most comprehensive link data, it is best to use numerous backlink tools and combine the data." - Ryan.
I would definitely suggest combining the data, as neither tool is fully accurate on its own.
I wonder if your metrics really jumped up this past week because you're using Moz. Let's still give them some credit, but they updated their Moz metrics last week, so the work to raise your authority was probably already done a month and a half ago.
Next to that, there is unfortunately no way to tell Moz about the links you have that aren't in the list at the moment. You'll have to trust Moz and hope for the best that these links will be in their index on the next update.
Hope this helps!
Hi Robin,
No, it's probably not an issue for SEO. I'm making an educated guess that you're on WordPress, as hentry usually produces some of these warnings there. What you should do to fix it is add a couple of extra classes to the specific elements for these cases.
Hi Beth,
Did you check whether any of the URLs might be relative as well? It feels like one of your outgoing links might be missing the http:// part, which makes crawlers think the URL is relative.
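As a made-up example, a crawler treats the first link below as relative because the protocol is missing, so it resolves to something like http://www.yoursite.com/page/www.example.com instead of the external site:

<a href="www.example.com">missing http://, treated as relative</a>
<a href="http://www.example.com">absolute, works as intended</a>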
Martijn.
Hi,
Yes! They're still important. By themselves they won't give you any ranking boost; they simply make it easier for search engines to crawl your pages, so they'll find all the pages on your site more easily. If you're already using WordPress you can easily set up your plugin to provide you with sitemaps. Don't forget to add a line for them to your robots.txt, and definitely make sure you submit them in Google Search Console and Bing Webmaster Tools.
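That robots.txt line is simply this (with your own domain in place of example.com):

Sitemap: http://www.example.com/sitemap.xml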
Martijn.
Yes, you should be able to just list the different URLs per language. You can read more about that in Google's documentation here: https://support.google.com/webmasters/answer/2620865?hl=en
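As a minimal sketch with placeholder URLs, each url entry in the sitemap lists its language alternates via xhtml:link (note the extra xhtml namespace on urlset):

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/english/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/english/" />
    <xhtml:link rel="alternate" hreflang="de" href="http://www.example.com/deutsch/" />
  </url>
</urlset>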
Hi,
Always ask yourself: would it be natural for them to link to a page on my site with that kind of anchor text? If the answer is yes, because the link text is the same as the product name, then it's OK. Still, I would always be very cautious with this: taking it the wrong way and trying to get all your links with anchor text fully related to your page is what we used to call rich anchor text spam.
Hope this helps!
Agree with Keszi, you'll eventually run into issues with the copyright on the pictures, either from the celebrity or from the owner of the picture.
I would say yes. If you make sure you're redirecting the right pages to the working URLs on the other site, you should be fine. Also make sure the sitemaps are updated, as Google won't like redirects within a sitemap.
Are you normalizing it somehow? Some of the other available charts are showing increases too, but not on a daily basis anymore. At some point the new reality should become the baseline, right?
Hey Taylor,
You can do it from an incognito window; that would probably be slightly better than comparing through a logged-in window, but it probably won't make a big difference. In the end you're still giving away a ton of signals about your browser and who you are (basically the same between the two). It would probably be better to use a rank tracker that can do this; that way you'll have fewer personalization issues.
Hi Clojo,
Just like any other backlink, the links from Reference and Ask.com will have value, but I doubt they will make the actual difference between your rankings and your competitors'. As you've noticed, their Domain Authority is very high, but in the end they also link to hundreds of thousands of other sites, which eventually decreases the value of a link from them. I would focus on getting more links from relevant sources, and I wouldn't consider these two sites among them, although getting them would be a nice-to-have.
Martijn.