Maybe I'm missing something, but...
Unzip the file on your computer first, then upload the extracted folder containing the files and subfolders to the themes folder.
PA = Page Authority and DA = Domain Authority. These are metrics SEOmoz use to "emulate" PageRank (emulate may not be the best choice of words).
You can access these metrics via the SEOmoz toolbar or Open Site Explorer.
Hi Yannick,
The time can vary hugely depending on the competitiveness of the terms you're targeting.
A general rule of thumb I use: wait a month, then check Open Site Explorer and look at PA and DA for improvements. If these metrics have improved, you should see some positive SERP movement.
regards
Aran
Hi, as far as I'm aware, a sitemap is exactly that: a map of one site. Sitemaps shouldn't be used to map URLs across multiple domains.
To avoid duplicate content, use canonical references.
Use a separate sitemap for each domain.
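If editing the page templates is awkward, one hedged option on Apache (assuming mod_headers is enabled; Google also accepts rel="canonical" sent as an HTTP header) is to emit the reference from .htaccess. The filename and URL here are hypothetical:

    # Hypothetical sketch, assuming Apache with mod_headers enabled.
    # Point the duplicate page at its canonical URL via an HTTP header.
    <Files "duplicate-page.html">
      Header set Link "<http://www.example.com/canonical-page/>; rel=\"canonical\""
    </Files>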
Hi Damon,
Get the SEOmoz Toolbar; it has a function to highlight nofollow links. Very handy!
http://www.seomoz.org/seo-toolbar
Regards
Aran
Tables are designed to display tabular data, not to be used for site design layout.
Though Google doesn't penalise you for it, it's an ancient way of coding which may result in a 'low quality' signal.
As Marcus and EGOL mention, it's best to avoid using tables for layout, but they are fine for presenting information which genuinely requires a tabular format.
Regards
Aran
Personally, I wouldn't stop it being indexed. It's not like you're being spammy with the on-page links.
P.s. awesome website, really love the photography on the background images.
#1 definitely, one big site is easier to optimise, promote and manage than a series of smaller sites.
With lots of smaller sites, link building is a huge job where each link only benefits the site it points to, whereas with one big site every link benefits the domain as a whole.
The possibilities are endless now.
Check out http://bit.ly/m9pmc6 for some inspiration.
I assume you have used some kind of rewrite on the server to change URLs to / rather than .aspx. This could be utilising some kind of redirect to move traffic from .aspx to /.
Firstly, I'd check how the URLs are rewritten (if they are). Unfortunately my knowledge of Windows server configs is limited, however on Linux I'd be checking my .htaccess file and the redirect/rewrite rules.
Once you have ascertained how the redirect is happening and solved the issue, I'd implement the correct 301 redirects.
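If it turns out you're on Linux, a minimal .htaccess sketch for the 301 side might look like this (on Windows/IIS the equivalent would live in web.config with the URL Rewrite module):

    # Minimal sketch, assuming Apache with mod_rewrite enabled.
    # 301 any old .aspx request to its extensionless equivalent,
    # e.g. /name-of-page.aspx -> /name-of-page/
    RewriteEngine On
    RewriteRule ^(.+)\.aspx$ /$1/ [R=301,L]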
I'd like to ask why you opted for name-of-page/ rather than name-of-page.aspx?
Hi,
Making Facebook apps for iframes is really simple, just like creating any webpage; you just have to take into account the width limitations of the iframe. You essentially host the page on your own site and point the iframe at the appropriate page.
Check out this resource: http://bit.ly/jFpqVX
This should get you started.
Good luck, and I'd love to see it when it's done!
Cheers
Aran
As EGOL and iNet state, using Flash for a homepage isn't a great idea.
You will need links outside of the Flash to ensure crawlers find deeper pages.
If you do optimise sub-level pages over the homepage, then the fancy homepage Flash becomes inert, as most visitors will land on sub-pages, bypassing the homepage.
Include Flash by all means, but make it a page element, not the entire page.
jQuery, on the other hand, is awesome and, if done correctly, will not hinder your SEO.
You need to utilise some regex, though I'm unsure of the exact pattern needed to match the stray ">.
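As a rough sketch (assuming the site runs on Apache and the bad URLs end in a literal "> once decoded), something like this in .htaccess could 301 those requests back to the clean URL while the offending links get fixed:

    # Hypothetical sketch: 301 any request whose path has picked up
    # a stray "> suffix back to the clean URL. \x22 is the regex
    # escape for the double-quote character.
    RedirectMatch 301 ^/(.+)\x22>$ /$1

Treat it as a stopgap; fixing the links at the source is still the proper cure.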
In future, offer a ready-made embed snippet; that way bloggers etc. will always get the right URL, and you can control the anchor text too.
Hi,
I think it's unlikely that you will get a penalty for using display:none in this way. Also, Google will index the content from the product description. If you have the Web Developer Toolbar in Firefox, try turning CSS off and looking at the page for a basic Googlebot view of it. You can see that all the 'tabbed' content that is normally hidden is shown. This may also highlight some other issues on the page (hint hint)...
Cheers
Aran
You are probably experiencing this error because of the "> in the URL you are trying to redirect.
It looks like the crawl has highlighted this as an error because it found a malformed link pointing to the Paper Airplane Flight page, one with an extra "> after the link URL (which is why the anchor text shows up as ">paper airplane flight").
You need to find the offending link on the site and fix it. That will solve the error.
If you cannot find duplicate content anywhere, then it sounds like great content. Allow the comment.
I notice the Page Speed score of your equity site was 60/100. A super quick fix for this is Gzip compression in .htaccess (if on Linux; not so quick on Windows). A little more complex would be caching rules. Both will have a great beneficial effect on page speed, and hopefully SERPs too.
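For the Gzip side, a minimal sketch (assuming Apache with mod_deflate enabled) can be as little as this in .htaccess:

    # Minimal sketch, assuming Apache with mod_deflate enabled.
    # Compress the text-based responses that benefit most from Gzip.
    <IfModule mod_deflate.c>
      AddOutputFilterByType DEFLATE text/html text/css text/plain
      AddOutputFilterByType DEFLATE application/javascript application/x-javascript
    </IfModule>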
Best of luck. Just noticed you're in Lancaster! Just down the road from me in Chorley!
Speeding up the page load time will not do any harm, but for small drops in SERPs I generally carry out a competitive analysis of the top 5 or 10 competitors.
Agreed, very questionable. Maybe he's saving on hosting fees!
In the current scenario any inbound links to your friend's site will be building the Domain Authority etc. of the FunktionalDesignStudios domain. Thus, yes, he is boosting his own domain. Technically speaking, though, it will also be boosting your friend's site while it is served from a subdirectory of the funktionaldesign domain.
1. Try to avoid spamming keywords into your titles. Keep them short(ish), informative and helpful.
Maybe a title of "Why is my back sore? asked in Back & Neck Pain". This assumes you categorise questions, so the "Back & Neck Pain" bit is your category title.
2. Having a keyword in thousands of page titles is not going to be of any significant benefit to your site and its rankings.
3. I would gauge success by simply applying your new-found titling technique to the next few questions you process, then following their performance and comparing it to previous questions.
Keep in mind that your existing 1700 pages may already rank well for phrases optimised under your current titling system, so perhaps apply the new technique to new questions only.
Good luck
Hi Mike,
It appears you're not on a shared IP, which is good (checked with http://www.axandra.com/free-online-seo-tool/shared-hosting-check.php).
Page Speed shows you could improve the overall speed fairly easily with Gzip or similar and some caching rules, which may be beneficial to the UX and SERPs.
I see you're at 3rd position for "equity release" today, not 7-8. Maybe the drop was a short-term fluctuation?
I'd recommend doing some competitive analysis of the competition to identify what they are doing; this allows you to be preventative with your SEO.
Cheers
Aran
As gmellak says, you may want to try the scrape approach. However, it's a dirty approach, and if you submit too many requests you will get CAPTCHAs.
Agreed. I wonder why your target audience appears to be Indian while your content is written in English?
Not sure that's the correct TinyURL you've posted! lol.
I'm just saying it's frowned upon, not downright illegal. As I said, there are ways around it.
I happen to know that Rank Tracker does use the Google API; after a few uses it requires manual input to pass the 'prove you're human' CAPTCHA.
As for SEOmoz, it's a good question. I have no idea how they track SERPs; probably a trade secret though!
Never used Marketwire, however I use PR Web regularly for syndicating clients' press releases. I like the service: I get a good spread of sites publishing the articles and a decent number of links (embedded in the article through PR Web's online editor).
If Google were removing the page from its index, you would have no SERP for the page at all.
It's more likely an issue with the site structure, URLs, redirects or similar... Can you give us the URL in question so we can look at it further?
Hi Mark,
Google doesn't take kindly to automated tools querying SERPs. You may find that if you make a tool for querying SERPs, it will only work a number of times before you get the "prove you're human" CAPTCHA.
In Google's TOS it says: You specifically agree not to access (or attempt to access) any of the Services through any automated means (including use of scripts or web crawlers).
There are ways around it, such as masking IPs and using several different API keys, however you would be breaking the TOS.
Personally, I'm in the business of keeping Google happy, so I'd steer clear.
Cheers
Aran
Some great answers here. I'd add that there are numerous websites around the web on which to advertise your competition; some offer paid advertising, others free, and most offer a followed link. There are quite a lot of low-quality sites, which I'd avoid, so check their mozTrust etc. prior to submitting your competition.
I've found Twitter a great medium for promoting competitions in the past. Choose a unique, related hashtag and tweet about it regularly.
Agreed, though Charles could use canonical tags to tell Google that the new pages are authoritative. This may take a while to be picked up, but it should prevent any detrimental effects from duplicate content.
It seems so unnatural to want to actually remove content when we spend so long striving to create awesome content!
You can use the meta robots tag as you mentioned in your question; this will prevent search engines indexing the pages. Unfortunately, we also need to tackle the human side of the issue: if anyone links to the article, the link will eventually result in a 404 page.
There is nothing wrong with a 404 page; they serve an important purpose. Since your articles are not around very long and are not being indexed by search engines, I see no reason not to simply leave the 404 in place.
Ensure you have a custom 404 which is an informative and helpful resource rather than a simple '404 Page Not Found' message. Use the 404 to direct the visitor to a category-level page related to the topic of the article, and offer a simple list of links to various parts of the site that may be of interest.
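Wiring up the custom page is a one-liner if you're on Apache (a minimal sketch; /404.html is a hypothetical path for your helpful error page):

    # Minimal sketch, assuming Apache. /404.html is a hypothetical
    # path to your custom, helpful 404 page.
    ErrorDocument 404 /404.html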
Check out these SEOmoz articles:
www.seomoz.org/blog/personalizing-your-404-error-pages
www.seomoz.org/blog/are-404-pages-always-bad-for-seo
Hope this helps.
Would it be possible to 'archive' articles after the 1-6 month period?
The archive could just be a database flag that keeps the articles from appearing in the article index, thus keeping the same URL but not clogging up the main site with hundreds of links to expired articles.
Hi,
Try Google Insights (http://www.google.com/insights/search/): you can search over various time frames, get a graph visualising traffic levels over time, and download the data as a CSV for Excel.
Cheers,
Aran
Yeah, I'm assuming that Google's view in Analytics is post-search, thus measuring time after the search...
I might be completely wrong (don't sue, kill or hurt me!), however I couldn't find a better answer from a more reputable source!
Hi, your question led me on an interesting trip through Google help, revisiting the basics of Analytics.
Google define the Bounce Rate as:
"Bounce rate is the percentage of single-page visits or visits in which the person left your site from the entrance (landing) page" - Google Webmaster Help http://bit.ly/gPPNPj - This makes me think of single page sites, do they have 100% bounce rate?
Heres a look at how Google Analytics performs it calculations. http://www.google.com/support/analytics/bin/answer.py?hl=en&answer=77234
What I took from this is that all visitor times are taken and averaged, bounce or not.
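If that's right then, as a worked example with made-up numbers: three visits recording 0 seconds (a bounce, since there is no second pageview to time against), 120 seconds and 240 seconds would average out to 120 seconds of time on site.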
Cheers
Aran
If your site is heavy on images, you can always ensure they are all correctly optimised. This will ensure minimum file size and thus quicker download times.
I always prefer the pipe for the following reasons, though I don't think it has any specific SEO value.
a) it's reader-friendly
b) it's a natural separator
c) as Seth says below, it looks cool!
d) Whats good for SEOmoz is good for me!
I agree with Roger, though I did initially struggle with Super Cache. In the end I opted to manually configure Gzip compression and caching rules in my .htaccess file.
Check out this page which has some info on Cache and Gzip
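For the caching side, a minimal sketch (assuming Apache with mod_expires enabled; adjust the lifetimes to suit how often your assets change):

    # Minimal sketch, assuming Apache with mod_expires enabled.
    # Tell browsers to cache static assets instead of re-fetching them.
    <IfModule mod_expires.c>
      ExpiresActive On
      ExpiresByType image/jpeg "access plus 1 month"
      ExpiresByType image/png  "access plus 1 month"
      ExpiresByType text/css   "access plus 1 week"
    </IfModule>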
Can you tell us the keywords you suspect are sending traffic from paid ads, and also your URL?
You'll probably find that you'll get keyword cannibalisation, with multiple pages all jockeying for the same key phrases.
Possibly a big and risky job, but could you not rewrite the URLs to include the category name rather than the cat ID?
/Alfa-romeo-147-sport-brakes-en
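If you're on Apache, a hypothetical mod_rewrite sketch could map a friendly URL like that onto the existing cat-id handler (the script name and parameter here are pure assumptions; the real ones depend on your platform):

    # Hypothetical sketch: serve the friendly URL from the existing
    # category script. "products.php" and "slug" are assumed names.
    RewriteEngine On
    RewriteRule ^([A-Za-z0-9-]+)-en/?$ /products.php?slug=$1 [L,QSA]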
Without seeing the site and checking out the current structure, it's hard to say exactly how I would structure it. Can you post a link?
Cheers
Aran
I always validate HTML on sites I'm working on, particularly if it has been coded by a third party. My reasons for doing so are a careful balance between ensuring spiders can crawl the page without bumping into hideous HTML errors, and ensuring the website is accessible on as many devices/browsers as possible.
If the webpage doesn't adhere to standards, it could indicate issues with viewing the pages correctly in the myriad of browsers and devices out there. So there's a user experience issue to consider too.
The simplest way to my knowledge is to use the Google site: operator. Simply type site:www.yourdomain.co.uk into the Google search box. The results this search brings back will show all the pages Google has indexed for your website.
You can also use cache:www.yourdomain.co.uk to see what Google is holding in its cache; clicking the Cached link in a listing will show when the page was last indexed.
In my personal experience, the one thing I've found that makes conversions shoot up is when delivery information (delivery cost in particular) is readily available, or, even better, when delivery is free.
And as Richard says above, keep checkout a simple process, but in addition to that, make sure the checkout process does NOT require registration on the site.
Hi,
Working in SEO, I always cringe when a designer mentions Flash. However, my phobia of Flash is old school and maybe I need to get with the times. Google will crawl textual content in Flash, including anchor text. It will also follow links in Flash and presumably applies the same 'link juice' factors that it would normally apply to links.
Here's an article on Flash and crawling textual content: http://bit.ly/dDXYoc
Hope this helps.
Aran