I want to rank with this page: http://www.servicesarab.com/%D9%86%D9%82%D9%84-%D8%B9%D9%81%D8%B4-%D8%A7%D9%84%D9%83%D9%88%D9%8A%D8%AA/ (the URL path is Arabic for "furniture moving Kuwait").
Related Questions
Our free tool ranks well for checking whether adult sites are down. Will that hurt our Google rankings for normal sites?
Hi all, we rank for searches like "is youporn down" because we provide a free tool that checks whether a website is up or down: https://downforeveryoneorjustme.com/youporn. I'm worried that ranking for these adult searches is hurting our rankings for queries like "is reddit down". Thoughts? I'd appreciate some input!
White Hat / Black Hat SEO | bwb
Unlisted (hidden) pages
I just had a client say they were advised by a friend to use 'a bunch of unlisted (hidden) pages'. Isn't this seriously black hat?
White Hat / Black Hat SEO | muzzmoz
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst offender (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts the other sites' performance. The problem is that I want a solution that 1) is centrally managed for all sites (per-site administration takes too much time), 2) takes total server load into account instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for a single bot. In my opinion, user traffic should always be prioritized above bot traffic.

I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS can centrally manage robots.txt for all sites at once, bots read it per site and per bot, so that doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No single option solves all three of my problems.

Now I've come up with a custom-coded solution that dynamically serves 503 HTTP status codes to a portion of the bot traffic. Which portion, and for which bots, can be calculated at runtime from the total server load at that moment. So if a bot makes too many requests within a certain period (or breaks whatever other rule I invent), some of its requests are answered with a 503 while others get content and a 200. A rough sketch of the idea follows below.

The remaining question is: will dynamically serving 503s have a negative impact on SEO? It will delay indexing speed/latency, but slow server response times also hurt rankings, which is even worse than indexing latency. I'm curious about the experts' opinions...
White Hat / Black Hat SEO | internetwerkNU
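A minimal sketch of the kind of load-aware 503 throttling described above, written here as a Python WSGI middleware. The bot user-agent list, the load threshold, and the Retry-After value are assumptions for illustration; they are not from the original post, and the poster's actual CMS stack may not be Python at all.

```python
"""Rough sketch: answer a portion of bot requests with 503 when the server is busy.

Assumptions (not from the original post): a WSGI stack, a 1-minute load-average
threshold, and a hard-coded list of bot user-agent tokens.
"""
import os

BOT_TOKENS = ("bingbot", "ahrefsbot", "googlebot")  # assumed list of crawlers to throttle
LOAD_THRESHOLD = 4.0        # assumed: 1-minute load average above which bots get throttled
RETRY_AFTER_SECONDS = 300   # polite hint telling the crawler when to come back


def is_bot(user_agent: str) -> bool:
    """Crude user-agent check; this is where per-bot rules would go."""
    ua = user_agent.lower()
    return any(token in ua for token in BOT_TOKENS)


def throttle_bots(app):
    """WSGI middleware: serve 503s to bots while machine-wide load is high."""
    def wrapper(environ, start_response):
        user_agent = environ.get("HTTP_USER_AGENT", "")
        load_1min = os.getloadavg()[0]  # total server load, not one site's traffic
        if is_bot(user_agent) and load_1min > LOAD_THRESHOLD:
            start_response(
                "503 Service Unavailable",
                [("Retry-After", str(RETRY_AFTER_SECONDS)),
                 ("Content-Type", "text/plain")],
            )
            return [b"Server busy, please retry later."]
        # Normal users (and bots under low load) get the regular response.
        return app(environ, start_response)
    return wrapper
```

Because the decision reads the machine-wide load average rather than any single site's traffic, one copy of this middleware in front of every site would cover points 1) and 2) from the question, and the user-agent check is where per-bot rules for point 3) would hang.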
Rank drop ecommerce site
Hello, we're going to get an audit, but I would like to hear some ideas on what could be causing our ranking drop. There are no warnings in GWT. Last summer we deleted 17 or so blogs (which had no backlinks pointing to them and existed simply for easy links), thinking they weren't white hat and that we had to start eliminating them. At the same time, we removed a few sitewide paid links that were really strong. With all of this deletion our keywords started to drop: our main keyword went from first to third/fourth. The keywords dropped a couple of spots immediately, and then, with no further deletions, they have kept slowly dropping over the last seven months or so. Right now we are at the bottom of the first page for that main keyword, and other keywords look similar.

We have 70 linking root domains, of which:
- 15 are blogs with no backlinks that were created simply for easy links. We haven't deleted them all yet because of the immediate ranking drop when we deleted the last batch.
- One PR5 site links to our home page throughout its lists of resources for people in different US states. It doesn't look like a standard paid-link site, but it carries many paid links across its pages.
- One PR4 site shows our logo next to another paid-link logo at the bottom of one of its pages.
- There are 2 other paid links from two PR4 sites that look editorial. Those sites also carry other paid links to other websites, but all the links on those 2 sites look editorial.

That's all the bad stuff. Other things that could be causing the drop:
- Our breadcrumbs are kind of messed up. A lot of subcategory pages rel=canonical to main categories in the menu. We did this because we had categories that were exactly the same, so you drill down on a category page and end up on a main category. To the average user it seems perfectly fine.
- Our on-site SEO still has a few pages that repeat words in the titles and h1 tags several times (especially our #1 main keyword), e.g. titles like "running shoes | walking shoes | cross-training shoes", where a word is repeated 2 or 3 times. A few pages are also more keyword-stuffed in the content than we would like: just a couple of paragraphs, but 2 keywords appear in them three times each, as exact matches rather than variations. We've also still got a few URLs stuffed with about 3 different keywords.
- We may have many 404 errors (due to some mistakes we made with the URLs in our cart). If Google hasn't deindexed them all, we could have dozens of 404s on important category pages, but nothing is showing up in GWT and our sitemap does not include any broken links.
- Google seems confused about our branding. I'm adding branding to the on-site SEO, but right now Google often shows keywords as our brand when it rewrites the way the title tag is displayed in the search results.
- We don't link out to anyone.

We have lots of content, almost no duplicate content, and some very comprehensive, authoritative articles. Your thoughts on what to do to get our rankings back up? (A small sketch for double-checking those possible category-page 404s follows this question.)
White Hat / Black Hat SEO | BobGW
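On the possible 404s that GWT is not reporting: since the sitemap is clean but the worry is about broken links sitting on important category pages, one way to check is to pull the links off those pages and test their status codes directly. A rough sketch in Python, with a placeholder domain and a hypothetical list of category URLs; it is only a spot check, not a full crawl, and not something suggested in the original post.

```python
"""Rough sketch: find links on key category pages that return 404.

The domain and category URLs below are placeholders. GWT showing nothing
does not guarantee these links are clean, so this checks them directly.
"""
from html.parser import HTMLParser
from urllib.parse import urljoin
import urllib.error
import urllib.request

SITE_ROOT = "https://www.example.com"          # hypothetical domain
CATEGORY_PAGES = [                             # hypothetical important pages
    "https://www.example.com/category/shoes/",
]


class LinkCollector(HTMLParser):
    """Collect every href found in <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)


def status_of(url: str) -> int:
    """HEAD-request a URL and return its HTTP status code."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code


if __name__ == "__main__":
    for page in CATEGORY_PAGES:
        with urllib.request.urlopen(page) as resp:
            html = resp.read().decode("utf-8", errors="replace")
        collector = LinkCollector()
        collector.feed(html)
        for href in collector.links:
            target = urljoin(page, href)
            # Only test internal links; report the ones that 404.
            if target.startswith(SITE_ROOT) and status_of(target) == 404:
                print(f"404 on {page}: {target}")
```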
Help with E-Commerce Product Pages
Hi, I need to find the best way to put my products on our e-commerce website. I have researched and researched, but I thought I'd gather a range of ideas here. Basically I have the following fields:
- Product Title
- Product Description
- Product Short Description
- SEO Title
- Focus Keyword(s) (this is a feature of our CMS)
- Meta Description

The problem is that we have a lot of duplicate content, e.g. 10 Armani polos where each one is a different colour but the model number is the same. I don't want to miss out on rankings because of this. What would you say is the best way to handle it? My idea is this:
- Product Title: Armani Jeans Polo Shirt Blue
- Product Description: Armani Jeans Polo Shirt in Blue. Made from 100% cotton. Armani Jeans Polo with Short Sleeves, Pique Collar and Button Up Collar. Designer Boutique Menswear are official stockists of Armani Jeans Polos.
- Short Description: Blue Armani Jeans Polo
- SEO Title: Armani Jeans Polo Shirt Blue MA001 | Designer Boutique Menswear
- Focus Keywords: Armani Jeans Polo Shirt
- Meta Description: Blue Armani Jeans Polo Shirt. Made from 100% cotton. Designer Boutique Menswear are official stockists of Armani Polos.

What are people's thoughts on this? I would then run the same format across each of the different colours (a rough sketch of that idea follows below). Another question is about the Product Title and SEO Title: should these be exactly the same? And does it matter whether I put the colour at the beginning or end of the title? Any help would be great.
White Hat / Black Hat SEO | YNWA
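A minimal sketch of the "same format across each colour" idea from the question, in Python: one template function fills the proposed fields per variant, so every colour of a model gets structurally identical but colour-specific copy. The function and field names are my own for illustration; the example values are taken from the question.

```python
"""Rough sketch: build the asker's proposed SEO fields for each colour variant.

The product data and brand name come from the example in the question;
the function and dictionary keys are assumptions for illustration.
"""

BRAND_SUFFIX = "Designer Boutique Menswear"


def build_seo_fields(name: str, colour: str, model: str, material: str) -> dict:
    """Apply the same field template across every colour of one model."""
    title = f"{name} {colour}"
    return {
        "product_title": title,
        "product_description": (
            f"{name} in {colour}. Made from {material}. "
            f"{BRAND_SUFFIX} are official stockists of {name}s."
        ),
        "short_description": f"{colour} {name}",
        "seo_title": f"{title} {model} | {BRAND_SUFFIX}",
        "focus_keywords": name,
        "meta_description": (
            f"{colour} {name}. Made from {material}. "
            f"{BRAND_SUFFIX} are official stockists of {name}s."
        ),
    }


if __name__ == "__main__":
    # Same model, different colours: identical structure, colour-specific copy.
    for colour in ("Blue", "Red", "White"):
        fields = build_seo_fields(
            name="Armani Jeans Polo Shirt",
            colour=colour,
            model="MA001",
            material="100% cotton",
        )
        print(fields["seo_title"])
```

Running it for Blue, Red and White also makes the duplicate-content concern visible: each variant's copy differs only by the colour token.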
Removing Poison Links w/o Disavow
Okay, so I've been working for many months at undoing former black-hat SEO tactics for this domain. Our main keyword is now falling down the rankings like crazy no matter how many relevant, quality links I bring to the domain, so I'm ready to take action today.

There is one inner page whose title exactly matches the keyword we are trying to rank for. Let's call it "inner-page.html". This page has nothing but poison links with exact-match anchor phrases pointing at it; the good links I've built all point at the domain itself. So what I want to do is change the URL of this page and let all of the current poison links 404. I don't trust the disavow tool and feel this will be a better option. I'm going to change the page's URL to "inner_page.html", in other words simply switch the hyphen to an underscore. How effective do you think this will be at 404ing the bad links, and does anybody out there have experience with this method? As always, I'll keep you all posted on what happens; it should be an interesting experiment at least. (A small sketch for verifying the 404 behaviour follows this question.)

One thing I'm worried about is the traffic sources. We seem to have a ton of direct traffic coming to that page, and I don't really understand where it comes from or why. Does anybody have insight into direct traffic to inner pages? There's no reason for current clients to visit, and potential clients shouldn't be returning so often, yet "direct" is something like our number 2 or 3 traffic source. Am I shooting myself in the foot here? Here we go!
White Hat / Black Hat SEO | jesse-landry
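If you do go the rename-and-404 route, it is worth confirming that the old URL returns a plain 404 with no redirect while the new URL serves a 200. A small sketch in Python; the domain is a placeholder and the two paths mirror the hyphen-to-underscore rename described in the post.

```python
"""Rough sketch: confirm the renamed page 404s the old poison-link target.

The domain is a placeholder; the two paths mirror the hyphen/underscore
rename described in the post.
"""
import urllib.error
import urllib.request

OLD_URL = "https://www.example.com/inner-page.html"  # the poison links point here
NEW_URL = "https://www.example.com/inner_page.html"  # the renamed page


def final_status(url: str) -> tuple[int, str]:
    """Return (status code, final URL) so an accidental redirect is visible."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status, resp.geturl()
    except urllib.error.HTTPError as err:
        return err.code, err.geturl()


if __name__ == "__main__":
    for label, url in (("old", OLD_URL), ("new", NEW_URL)):
        status, final_url = final_status(url)
        print(f"{label}: {status} -> {final_url}")
    # Expected outcome: old -> 404 with no redirect, new -> 200.
```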
Can someone explain how a site with no DA, links, MozTrust or MozRank can rank #1 in the SERPs?
I do SEO for a legal site in the UK, and one of the keywords I'm targeting is 'Criminal Defence Solicitors'. If you search this term on Google.co.uk, the site www.cdsolicitors.co.uk comes top, yet in my MozBar it has 0 links, 0 DA, etc. I noticed it at the top a few weeks ago and thought something spammy was going on; I assumed that if I was patient Google would remove it, but it still hasn't. Can someone explain how it is top of the SERPs? I've never seen this before. Thanks
White Hat / Black Hat SEO | TobiasM
Are paid reviews gray/black hat?
Are sites like ReviewMe or PayPerPost white hat? Are followed links allowed within the post? Should I use those services, or cold-contact high-authority sites within my niche?
White Hat / Black Hat SEO | 10JQKAs