Guys & gals, does anyone know if urllist.txt is still used?
-
I'm using a tool that generates urllist.txt, and from looking around the SEO forums it seems that Yahoo used to use this file. What I'd like to know is: is it still used anywhere, and should we have it on the site?
-
Thanks for the advice. We already create and submit an XML sitemap to Google, so that wasn't the question. Would there be any benefit in also creating the urllist.txt file?
-
I would just use a sitemap.xml file instead for Google, Bing and Yahoo. You can then submit the sitemap.xml file within Google Webmaster Tools and Bing Webmaster Tools (which covers Yahoo). You can easily create an XML sitemap at http://www.xml-sitemaps.com/
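For reference, urllist.txt was simply a plain text file with one URL per line, whereas sitemap.xml wraps each URL in a small block of XML. A minimal sketch of a sitemap (the example.com URLs and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-01-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/contact/</loc>
  </url>
</urlset>

The generators mentioned above produce this format for you, so there's no need to hand-write it; the point is just that sitemap.xml carries the same list of URLs (plus optional metadata) that urllist.txt used to.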
Related Questions
-
Issue with GA tracking and Native AMP
Hi everyone,

We recently pushed a new version of our site (winefolly.com), which is completely AMP-native on WordPress (using the official AMP for WordPress plugin). As part of the update we also switched over to https. In hindsight we probably should have pushed the AMP version and the HTTPS change in separate updates. As a result of the update, the traffic in GA has dropped significantly despite the tracking code being added properly. I'm also having a hard time getting the previous views in GA working properly. The three views are:

Sitewide (shop.winefolly.com and winefolly.com)
Content only (winefolly.com)
Shop only (shop.winefolly.com)

The sitewide view seems to be working, though it's hard to know for sure, as the traffic seems pretty low (around 10 users at any given time) and I think it's mostly just picking up the shop traffic. The content-only view shows maybe one or two users and often none at all. I tried a bunch of different filters to only track the main site's content views, but in one instance the filter would work, then half an hour later it would revert to no traffic. The filter is set to custom > exclude > request URI with the following regex pattern:

^shop.winefolly.com$|^checkout.shopify.com$|/products/.|/account/.|/checkout/.|/collections/.|./orders/.|/cart|/account|/pages/.|/poll/.|/?mc_cid=.|/profile?.|/?u=.|/webstore/.

Testing the filter, it strips out anything not related to the main site's content, but when I save the filter and view the updated results, the changes aren't reflected. I did read that there is a delay in filters being applied and that only a subset of the available data is used, but I just want to be sure I'm adding the filters correctly. I also tried setting the filter to predefined > exclude > host equal to shop.winefolly.com, but that didn't work either. The shop view seems to be working, but its tracking code is added via Shopify, so it makes sense that it would continue working as before.

The first thing I noticed when I checked the views is that they were still set to http, so I updated the URLs to https. I then checked the GA tracking code (which is added as a JSON object in the Analytics setting of the WordPress plugin). Unfortunately, while GA seems to be recording traffic, none of the GA validators seem to pick up the AMP tracking code (added using the amp-analytics tag), despite the JSON being confirmed as valid by the plugin. This morning I decided to try a different approach and add the tracking code via Google Tag Manager, as well as adding the new https domain to Google Search Console, but alas no change.

I spent the whole day yesterday reading every post I could on the topic but was not able to find a solution, so I'm really hoping someone on Moz will be able to shed some light on what I'm doing wrong. Any suggestions or input would be very much appreciated.

Cheers,
Chris (on behalf of WineFolly.com)

Technical SEO | winefolly
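For anyone comparing setups, the amp-analytics markup for Google Analytics generally follows the pattern below; this is only a sketch of the standard component, not the exact output of the AMP for WordPress plugin, and UA-XXXXX-Y is a placeholder for the real property ID:

<!-- component script in the page head -->
<script async custom-element="amp-analytics" src="https://cdn.ampproject.org/v0/amp-analytics-0.1.js"></script>

<!-- tracking config in the page body -->
<amp-analytics type="googleanalytics">
  <script type="application/json">
  {
    "vars": { "account": "UA-XXXXX-Y" },
    "triggers": {
      "trackPageview": {
        "on": "visible",
        "request": "pageview"
      }
    }
  }
  </script>
</amp-analytics>

If the validators don't see anything like this in the rendered AMP page source, the plugin's JSON setting may not be making it into the output at all, which would explain the missing pageviews.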
Use keywords that have another keyword in them for another link
Hi, I have these two links: A1 & A2, and the keywords for them are: "pest control" for A1 and "pest control service" for A2. Is Google smart enough to differentiate these two and rank the exact page for each keyword accordingly, or does Google count the "pest control" keyword for the A2 link as well? Please help me with this issue. The same question applies to "termite inspection" vs "termite inspections Arizona"! Many thanks, Shervin

Technical SEO | Shervin
Video & Graph That Lazy Load
Hi, Product pages on our site have a couple of elements that are lazy loaded / loaded after user action. Apart from images, which are a widely discussed topic in lazy loading, in our case videos & price graphs are lazy loaded. For videos we do something that Amit Agarwal recommended here: http://labnol.org/internet/light-youtube-embeds/27941/ - we load a thumbnail and a play button over it. When a user clicks that play button, the video embed from YouTube is loaded. However, we are not sure if Google gets that, and since the whole thing is under an H3 tag, will we a) lose the benefit of putting a relevant video there, or b) send any negative signals for only loading an image thumbnail under an H3 tag? We also have a price graph that lazy loads and is not seen when you view a cached version of our page on Google. Are we losing credit (in Google's eyes) for that content on our page? Sample page which has both the price history graph & video: http://pricebaba.com/mobile/apple-iphone-6s-16gb Appreciate your help! Thanks

Technical SEO | Maratha
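For context, the light-embed approach linked above boils down to rendering a thumbnail with a play button and only injecting the real YouTube iframe on click. A rough sketch of that idea (VIDEO_ID, class names and markup are illustrative, not the exact code from the article or from pricebaba.com):

<div class="lazy-video" data-video-id="VIDEO_ID">
  <img src="https://i.ytimg.com/vi/VIDEO_ID/hqdefault.jpg" alt="Video title">
  <button class="play" aria-label="Play video">Play</button>
</div>

<script>
document.querySelectorAll('.lazy-video').forEach(function (el) {
  el.addEventListener('click', function () {
    var iframe = document.createElement('iframe');
    iframe.src = 'https://www.youtube.com/embed/' + el.getAttribute('data-video-id') + '?autoplay=1';
    iframe.setAttribute('allowfullscreen', '');
    el.innerHTML = '';        // drop the thumbnail and play button
    el.appendChild(iframe);   // swap in the real embed
  });
});
</script>

Because the iframe only exists after a click, Google generally sees just the thumbnail markup, which is why questions like the one above about losing the video's relevance come up.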
Unique use of nofollow tag
Love the community here. I just had a quick question about using noindex, nofollow. We are a car dealership group that uses a website provider (Cobalt). Since they provide the website, they are the only ones with access to remove pages etc. We can add pages, but only they can remove them. There are some pages we need to have removed, but according to them they are unable to remove them (I think the manufacturer might mandate having some pages). Anyway, some of these pages literally have nothing on them, and there isn't really any useful content we could add to them. So we are using noindex on them to ensure that they stay out of search indices, but I am wondering if we should also use nofollow on them. If I understand nofollow correctly, it just means search engines won't follow the links on the page, and for most of these pages the only links on them are the navigation. Since we don't plan on adding any content to these pages and we can't remove them, should we use noindex and nofollow as a way to "remove" them from the site as much as we can?

Technical SEO | Murdock_Auto_Group
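For what it's worth, both directives live in a single meta tag in the head of each thin page; this is the generic form rather than anything specific to the Cobalt platform:

<!-- noindex keeps the page out of search results; nofollow tells crawlers
     not to follow the links (here, mostly site navigation) on the page -->
<meta name="robots" content="noindex, nofollow">

Since the navigation is reachable from every other page anyway, adding nofollow here mainly just stops these thin pages from passing any link equity on.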
WordPress rel next & previous for SEO
Hi, I have implemented this function in my WordPress theme. However, I can only get the rel="prev" tag to show up. Does anyone have an idea?

function rel_next_prev() {
    global $paged;
    // assuming the output lines (lost in the forum formatting) were the usual <link> tags built with get_pagenum_link()
    if ( get_previous_posts_link() ) { ?>
        <link rel="prev" href="<?php echo get_pagenum_link( $paged - 1 ); ?>" />
    <?php }
    if ( get_next_posts_link() ) { ?>
        <link rel="next" href="<?php echo get_pagenum_link( $paged + 1 ); ?>" />
    <?php }
}
add_action( 'wp_head', 'rel_next_prev' );
?>

Technical SEO | SEOhughesm
Using a single sitemap for multiple domains
We have a possible duplicate content issue based on the fact that we have a number of websites run from the same code base across .com / .co.uk / .nl / .fr / .de and so on. We want to update our sitemaps alongside using the hreflang tags to ensure Google knows we've got different versions of essentially the same page to serve different markets. Google has written an article on tackling this: https://support.google.com/webmasters/answer/75712?hl=en but my question remains whether having a single sitemap accessible from all the international domains is the best approach here, or whether we should have individual sitemaps for each domain.

Technical SEO | jon_marine
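For context, Google's article describes listing each page's language/country alternates inside the sitemap itself, so the same entries can live in one file or be duplicated per domain. A minimal sketch with placeholder domains (example.com / example.co.uk / example.nl stand in for the real sites):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/widgets/</loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="http://www.example.com/widgets/" />
    <xhtml:link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/widgets/" />
    <xhtml:link rel="alternate" hreflang="nl" href="http://www.example.nl/widgets/" />
  </url>
  <!-- repeat the <url> block for the .co.uk and .nl URLs, each listing the same set of alternates -->
</urlset>

Note that for Google to accept cross-domain entries, each domain in the sitemap normally has to be verified in the same Webmaster Tools account.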
Has anyone used a company to help promote their site
Hi, I receive around ten emails a day claiming they can help get your site into the top ten in Google. Now, I know most are a load of rubbish, but I am just wondering if anyone has used any of these companies for a new site or an old site. I am about to launch a new site after Xmas and I am wondering if any of these companies are worth looking at to help promote the new site instead of doing all the groundwork myself. Would love to know your thoughts.

Technical SEO | ClaireH-184886
.htaccess problem using POST method
Hi guys, I'm after some help with trying to achieve the following:

1. Canonicalise to http://www.
2. Remove index.php from the root and subfolders.

I have the .htaccess code below, which seemed to work fine, but the URLs use the POST method and this isn't working with the rewrites. Can anyone please advise as to what I am doing wrong? As you can probably guess, .htaccess isn't my strongest SEO discipline! The code I have is:

# http:// to http://www.
RewriteEngine on
RewriteCond %{HTTP_HOST} ^mydomainexample.com
RewriteRule (.*) http://www.mydomainexample.com/$1 [R=301,L]

# /index.php to /
Options +FollowSymLinks
DirectoryIndex index.php
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index.php\ HTTP/
RewriteRule ^index.php$ http://www.mydomainexample.com/ [R=301,L]

# Subdirectory /index.php to /
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /([^/]+/)index.(php|html|htm?)[#?]?
RewriteRule ^(([^/]+/))index.(php|html|htm?)$ http://www.mydomainexample.com/$1 [R=301,L]

Just to add to this, I have found the following, which I think is what I need to restrict it to GET:

RewriteCond %{THE_REQUEST} ^GET.*index\.php [NC]
RewriteRule (.*?)index\.php/*(.*) /$1$2 [R=301,L]

Thank you in advance for any suggestions as to how I may put this code together. Trevor

Technical SEO | TrevorJones
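For what it's worth, one common way to keep POST requests (form submissions) out of rules like these is to add an explicit method condition in front of each redirect; this is a sketch of that idea using the standard REQUEST_METHOD variable, not a tested drop-in for this site:

# Only canonicalise GET requests so POSTs are left untouched
RewriteCond %{REQUEST_METHOD} ^GET$
RewriteCond %{HTTP_HOST} ^mydomainexample.com
RewriteRule (.*) http://www.mydomainexample.com/$1 [R=301,L]

Because a RewriteCond only applies to the single RewriteRule that follows it, the same REQUEST_METHOD line would need to be repeated in front of the index.php rules as well (or handled via the ^GET pattern on THE_REQUEST, as in the snippet you found).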