Is there a way to get a list (backlink profile) of all tiny URLs that point to my site or a competitor's site?
-
I have noticed that the major backlink profile tools, such as OSE or GWM, do not show tiny URLs in their link lists. If there is a service that shows all the tiny URLs pointing to your site, can someone please share it?
It has already been proven that tiny URLs do pass link juice. With that said, if there is no way to find all the tiny URLs that point to a site, wouldn't it be a great strategy to create all my backlinks with tiny URLs to mask my profile from competitors?
Thanks!
-
You can get a pretty complete list from Open Site Explorer by changing some of the filter settings:
- Set **Show** to **only 301**
- Set **links from** to **only external**
- Set **pages** to **pages on this root domain**
Because URL shorteners like TinyURL and bit.ly point to their destinations via 301 redirects, filtering to 301s should surface a significant portion of your inbound short URLs.
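If you want to double-check what a given shortener actually returns, you can request a short link without following redirects and inspect the response yourself. Here is a minimal Python sketch using the third-party requests library (the short link shown is a placeholder, not a real inbound link):

```python
import requests

def inspect_short_url(short_url):
    """Fetch a short URL without following redirects so we can see
    the status code and target the shortener itself returns."""
    resp = requests.head(short_url, allow_redirects=False, timeout=10)
    return resp.status_code, resp.headers.get("Location")

# Placeholder short link -- substitute one pointing at your site.
status, target = inspect_short_url("https://tinyurl.com/example")
if status == 301:
    print(f"301 (permanent) redirect to {target} -- should pass link juice")
else:
    print(f"Returned {status}; not a permanent redirect")
```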
Related Questions
-
How does nudity on a site affect search results?
One of my clients sells lingerie, and with it being lingerie there are a fair few photos of bottoms and some exposed breasts. Gosh. Anyway, I know how this affects AdWords campaigns - Google classifies the site as 'adult' and your ads don't show anywhere. I also know how it affects image searches. However, how does it affect text searches? Are rankings demoted because of nudity? I've worked with a clothing site with some nudity on it before and it didn't affect it, but I would love to hear from anyone with specific experience of this. Thanks
Industry News | neenor
-
Has anybody used Yext or Universal Business Listings as an automated approach to getting clients into the many directories? If so, does it work? Or does Google penalize sites for using these automated services?
I'm trying to figure out if using either Yext or Universal Business Listings is worth it. They have reseller programs for SEO agencies. I'm curious what other SEO folks think of these services, as I'm considering using one of them to automate and save time for clients. If you go to Yext.com or universalbusinesslistings.org you can see them. Thanks
Industry News | SOM24
-
Can't seem to submit my XML sitemap to Baidu, can someone help?
Hello Mozzers and SEMs, I'm sure all of you know and understand the importance of submitting your sitemap to search engines. One search engine in particular is Baidu. For some reason, I can't find a way to submit my link www.mysite.com/sitemap.xml. For Google, Bing, and Yandex it was easy, but Baidu is giving me problems. Don't tell me to "wait until they crawl your site" - I have over 1,000 pages of unique content that those other three search engines found because of my sitemap; Baidu can't be that slow. Their robots only found 147 pages. 😞 If you take a look at the image attachment, you can see why I'm stuck - I can't read Chinese! (And it's an image, so I couldn't translate it.) Has anybody had any luck submitting their .xml link to Baidu? Can someone walk me through it? Let me know! Shawn
Industry News | Shawn124
-
SEO Risks for redirecting sites
Hey Everyone, I've tried searching for this question, but am not exactly sure what keywords to search for so I'm probably missing the resources if they already exist... My client has had duplicated sites for years, and after multiple penalizations of those sites I was finally able to convince him to consolidate them into a "mega-site". Currently, he has a main domain, a geo-subdomain for each office location under the main domain, and a geo-domain for each office location. We plan on redirecting each geo-domain to the corresponding geo-subdomain. So, the final result will be one main domain, and a sub-domain for each office location. I'm looking for any information regarding tracking SEO data after the redirects are in place, how to guard against potential drops in SERPs, what's the smartest strategy to implement, etc... My client is very sensitive to his sites' SEO data, so if anyone has any SEO-related advice regarding redirecting sites it would be greatly appreciated! Thank you!
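One detail worth pinning down before a move like this: each geo-domain should 301 its URLs path-for-path to the matching geo-subdomain, not just to the subdomain's homepage. A minimal Python/Flask sketch of that mapping (the domain names are hypothetical, not the asker's actual setup) might look like this:

```python
from flask import Flask, redirect, request

app = Flask(__name__)

# Hypothetical mapping of old geo-domains to their new geo-subdomains.
DOMAIN_MAP = {
    "denver-example.com": "denver.example.com",
    "austin-example.com": "austin.example.com",
}

@app.before_request
def redirect_old_geo_domains():
    host = request.host.split(":")[0]  # strip any port number
    target = DOMAIN_MAP.get(host)
    if target:
        # Preserve the path (and query string) so every old URL maps to
        # its exact counterpart; 301 signals a permanent move.
        path = request.full_path if request.query_string else request.path
        return redirect(f"https://{target}{path}", code=301)
```

Path-for-path redirects are what let rankings for deep pages carry over; redirecting everything to a homepage tends to get treated as a soft 404.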
Industry News | Level2Designs
-
Are WordPress sites being dinged by Google? I've read a few articles claiming this.
I read a couple of "SEO"-related articles claiming that sites built in WordPress are going to be dinged by Google, because Google sees WordPress sites as simple to make and as having a higher potential to be "spammy". Is there any truth to this? Your thoughts? I do give "thumbs up" and "best answer" marks and appreciate receiving thumbs up myself... Thanks
Industry News | JChronicle
-
Subdomains seen as one site
Earlier this month Google announced that sub-domains will now be treated as one site. At first I thought this was good news, as I like to use sub-domains for separation of categories and the like. But what about links from one sub-domain to the other? They used to be external links; now they are internal links. If you don't have many external links, I would say that the cross-sub-domain links would have been important; if you have a lot of external links, then the flow of link juice would be of more benefit. I think overall it is a good thing. Does anyone have any opinions about this, or know of any writings on the subject since this announcement? http://googlewebmastercentral.blogspot.com/2011/08/reorganizing-internal-vs-external.html
Industry News | AlanMosley
-
What is the best method for getting pure JavaScript/AJAX pages indexed by Google for SEO?
I am in the process of researching this further, and wanted to share some of what I have found below. Anyone who can confirm or deny these assumptions or add some insight would be appreciated.

Option 1: If you're starting from scratch, a good approach is to build your site's structure and navigation using only HTML. Then, once you have the site's pages, links, and content in place, you can spice up the appearance and interface with AJAX. Googlebot will be happy looking at the HTML, while users with modern browsers can enjoy your AJAX bonuses. You can use Hijax to help AJAX and HTML links coexist, and meta nofollow tags etc. to prevent the crawlers from accessing the JavaScript versions of the page. Currently, webmasters create a "parallel universe" of content: users of JavaScript-enabled browsers see content that is created dynamically, whereas users of non-JavaScript-enabled browsers, as well as crawlers, see content that is static and created offline. In current practice, "progressive enhancement" in the form of Hijax links is often used.

Option 2: To make your AJAX application crawlable, your site needs to abide by a new agreement, which rests on the following: the site adopts the AJAX crawling scheme, and for each URL that has dynamically produced content, your server provides an HTML snapshot, which is the content a user (with a browser) sees. Often, such URLs will be AJAX URLs, that is, URLs containing a hash fragment, for example www.example.com/index.html#key=value, where #key=value is the hash fragment. An HTML snapshot is all the content that appears on the page after the JavaScript has been executed. The search engine indexes the HTML snapshot and serves your original AJAX URLs in search results. To make this work, the application must use a specific syntax in the AJAX URLs (call them "pretty URLs"). The search engine crawler temporarily modifies these "pretty URLs" into "ugly URLs" and requests those from your server. A request for an "ugly URL" indicates to the server that it should not return the regular web page it would give to a browser, but an HTML snapshot instead. When the crawler has obtained the content for the modified ugly URL, it indexes that content, then displays the original pretty URL in the search results. In other words, end users always see the pretty URL containing a hash fragment. A diagram summarizing the agreement is in the Getting Started Guide.

Make sure you avoid this: http://www.google.com/support/webmasters/bin/answer.py?answer=66355

Here are a few example pages that are mostly JavaScript/AJAX: http://catchfree.com/listen-to-music#&tab=top-free-apps-tab and https://www.pivotaltracker.com/public_projects. This is what the spiders see: view-source:http://catchfree.com/listen-to-music#&tab=top-free-apps-tab

These are the best resources I have found regarding Google and JavaScript:
http://code.google.com/web/ajaxcrawling/ (step-by-step instructions)
http://www.google.com/support/webmasters/bin/answer.py?answer=81766
http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content

Some additional resources:
http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html
http://www.google.com/support/webmasters/bin/answer.py?answer=35769
Industry News | webbroi
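To make the pretty-URL/ugly-URL handshake in Option 2 concrete, here is a minimal Python sketch of the translation the crawler performs. One caveat on the example above: in Google's specification the scheme applies to hash fragments that begin with "#!", so "#key=value" would need to be "#!key=value" to opt in.

```python
from urllib.parse import quote

def pretty_to_ugly(url):
    """Translate an AJAX 'pretty URL' (hash-bang fragment) into the
    'ugly URL' a crawler requests under the AJAX crawling scheme."""
    if "#!" not in url:
        return url  # page has not opted into the scheme
    base, fragment = url.split("#!", 1)
    sep = "&" if "?" in base else "?"
    # The fragment is carried in the _escaped_fragment_ query parameter.
    return f"{base}{sep}_escaped_fragment_={quote(fragment, safe='=')}"

print(pretty_to_ugly("http://www.example.com/index.html#!key=value"))
# -> http://www.example.com/index.html?_escaped_fragment_=key=value
```

When your server sees _escaped_fragment_ in the query string, it returns the HTML snapshot instead of the JavaScript page, and Google shows the original pretty URL in its results.
-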
Schedule for Panda / Farmer Update rollout for non-English sites?
Hey Mozzers, I just read about the global rollout of Google's Panda/Farmer Update for English queries, e.g. at Search Engine Land. Has anyone read anything about when Google plans to roll out the update for non-English-speaking countries, i.e. queries in other languages? I googled a bit but couldn't find anything, not even speculation. Cheers, Frank
Industry News | FranktheTank-47497