You could use a CDN - I personally use MaxCDN. This would host your JS/CSS/Image/PDF types of files and deliver them to the user faster than your normal web server will. Here's how to connect MaxCDN to Magento if that's the CMS you're on: https://www.maxcdn.com/one/tutorial/magento-cdn/
Posts made by KaneJamison
-
RE: Use cookie-free domains
-
RE: Will background images get indexed?
On a sidenote, if you want to prevent indexation of them, you can disallow those image URLs (or the specific folder holding them) in robots.txt.
On the flipside, if you want to get them indexed, you can manually link to the images and submit an image sitemap to Google with just those particular images. Doing both is better than just one.
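For reference, minimal sketches of both approaches (the folder path, page URL, and filenames here are hypothetical):

```
# robots.txt - block crawling of a background-image folder
User-agent: *
Disallow: /assets/backgrounds/
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://example.com/gallery/</loc>
    <image:image>
      <image:loc>http://example.com/images/background-photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```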
-
RE: Will background images get indexed?
Quick anecdotal reference - for the 1 site I just checked (which allows CSS to be crawled), no they are not indexed.
However, Google crawls CSS and can load those images when you Fetch And Render inside of GWT I believe, so theoretically they can index them.
Next question is what would cause them to do so (eg inbound links to that image file), but I don't have an answer for that.
-
RE: Best way to find the best keywords to write Q&A
All of the answers above are a good start.
I would also suggest trying keywordtool.io for every celebrity name you want to talk about, and see what types of things people are looking for about those celebrities. Ubersuggest will perform this task as well.
-
RE: SEO best practices for embedding content in a map
Not a problem - would love to see the finished version once you complete it.
-
RE: SEO best practices for embedding content in a map
Hey Eric,
You've got a deep one here with a few different things going on. Let me start with some observations and then walk you through the direction I would take if this were my project:
- The content on that example you gave is all HTML that's crawlable. So that page is getting indexed properly.
- If you were to reduce the amount of content in the left section, and swap it with a button leading to the blog post, Google shouldn't have any problem indexing those links to the pages which have more content. In that sense, your map page would be no different than a blog archive page, with titles and teasers leading to a complete post.
- Let's pretend for a second that we want to go with that solution, but we don't want users to have to leave the page to read the full content when they click the button. Then we'd want to display the content in a way where we know it won't get indexed. We should be able to override that link and load the content into a popup instead of actually loading the page. If it gets displayed in a popup modal, that would be a nice experience without leaving the page. An iframe should ensure it's not indexed as content on the page, though you'd have to play with how it's sized and positioned. You could also load the content in with Javascript, though Google is more likely to index that properly than they used to, and I can't recall which particular methods are non-indexable.
- Your next point was regarding users sharing the proper URL. You can hardcode the share buttons to the URL that is appropriate for them to share. domain.com/map#snorkelmaui would be a good URL to make the map flow down to the Snorkel Maui business listing, and domain.com/map/businesses/snorkel-maui/ would be more like the URL of the individual article that is separate from the map but which can be loaded in a modal. This page would probably have some kind of "back to the master map" button or functionality to lead users back to that full map page experience.
- Your other point was regarding users not visiting the correct page, and it therefore ranking poorly. This isn't a big deal. If it's getting indexed properly and has internal links flowing from the popular and (let's hope) well-linked map page, then it should rank just as well as any other URL on the site with internal links.

Option B: If you want to get really advanced and avoid the separate page experience, you could use some kind of AJAX pushState() approach to change the URL while they're looking at the modal, and restore it when they exit the modal. The downside here is that if they refreshed the page they wouldn't see the map experience, they'd see the static page version. You could also take this pushState approach and use it to create a single-page experience that has multiple URLs without leaving the page, where each individual page is rankable on its own. These two blog posts should set you down the right path if you choose that option.
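A minimal sketch of that pushState() idea, just to show the mechanics (the URL patterns and function names here are hypothetical, based on the example URLs above):

```javascript
// Hypothetical URL helpers for the two URL forms discussed above:
// the map's fragment URL and the standalone article URL for a listing.
function mapUrl(slug) {
  return '/map#' + slug;                  // e.g. /map#snorkelmaui
}
function articleUrl(slug) {
  return '/map/businesses/' + slug + '/'; // e.g. /map/businesses/snorkel-maui/
}

// Swap the address-bar URL when the modal opens; restore it on close.
// Guarded so the helpers stay usable outside a browser.
function openModal(slug) {
  if (typeof history !== 'undefined' && history.pushState) {
    history.pushState({ slug: slug }, '', articleUrl(slug));
  }
  // ...fetch and display the article content in the modal here...
}
function closeModal() {
  if (typeof history !== 'undefined' && history.pushState) {
    history.pushState({}, '', '/map');
  }
}
```

The share buttons would then be hardcoded to articleUrl() for each listing, so shared links always point at the indexable page.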
I think that covers your concerns and lays out 2 options for you, but let me know where you still have questions.
-
RE: Embedding PDF previews and maintaining crawlability/link-equity.
That's the route I'd push for as well I think.
Agreed on experimentation. Please report back if you get a chance to test this. Perhaps choose a small number of PDFs on this site redesign and leave the link off of them?
-
RE: Flux in Bing/Yahoo search rankings?
If all of the traffic is one keyword, then a minor interface change could cause a big traffic change. Any new ad units or stuff like that showing up that wasn't there before in the SERP?
There's been flux lately from the Firefox partnership, but seems unlikely for you to notice that trickle down.
I assume you saw no change on mobile or tablet, just desktop?
-
RE: Flux in Bing/Yahoo search rankings?
That's all interesting. Didn't mean to suggest you needed to look at BWT (sounds like you have quite a bit already), I meant to reiterate that Benjamin should look there. I rarely spend any time in BWT so I can't say much about its response times and error rate, but I wouldn't be surprised if it has delays or takes awhile to sort out big site changes.
Hope it trends well for you and please report back if you see any developments. I'm curious what Benjamin sees on the backend of BWT as well.
-
RE: Flux in Bing/Yahoo search rankings?
I just scanned through 20-30 sites of various traffic/content models and I don't see anything major on any of them, but that doesn't mean you're not seeing something specific happening.
If I were you I would check into Bing Webmaster Tools data and see what they can tell you about recent changes to crawling or visibility to see if there's a signal there that you can look into further.
-
RE: Embedding PDF previews and maintaining crawlability/link-equity.
I haven't seen any studies with <embed> the way I have with <iframe>. <embed> is also used for video and Flash, but neither would be indexed the same way as a PDF, so it's hard to compare. The embed tag is pretty standardized, so I really doubt they wouldn't crawl it similarly. IIRC, in the ugly era of Flash it was proper to have a <noscript> {crawlable content here} </noscript> section after the <embed>, so that's one comparable situation, but that's due to the Flash itself not being crawled well. If it's not a hassle, I would add a text link to the PDF that says "download full PDF" or similar. If it is a hassle and takes longer than a couple hours, then it's a harder call. Similar thread that could be helpful: http://stackoverflow.com/questions/3686331/does-google-index-html-content-supplied-by-the-object-tag
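A minimal sketch of the embed-plus-link pattern (the filename and dimensions here are made up):

```html
<embed src="/docs/whitepaper.pdf" type="application/pdf" width="600" height="800">
<!-- Plain crawlable link so the PDF itself can still accumulate link equity -->
<p><a href="/docs/whitepaper.pdf">Download the full PDF</a></p>
```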
-
RE: SEO question regarding rails app on www.site.com hosted on Heroku and www.site.com/blog at another host
I don't understand how the Rails side of things work, but as long as the Wordpress URLs resolve to www.site.com/blog, and canonical tags reflect that, then you should be good.
For app companies where there won't be many links accumulated to the app itself, we often recommend running a complete Wordpress website at www.mysite.com, and running their actual app at app.mysite.com. This allows for easy content management by the marketing team, and marketing projects don't require lots of developer resources to get implemented. Not saying you should do this, but it can simplify things from a marketing & IT perspective. If there's a public facing version of the app, for example how Moz allows pages like https://moz.com/researchtools/ose/ to rank, then it can make more sense to run everything on a single subdomain.
-
RE: URL open with double domain names when click on visit URL link in Google Analytics
If you turn off that filter, this problem should go away. Google shows the hostname separately from the page path to simplify viewing, otherwise column widths would be huge in every report.
If you want to see the hostname in reports, you can add hostname as a secondary dimension, or you can create a segment for each hostname/subdomain, and that will show you one or the other.
Alternatively, you can just keep it how it is now, and stop using that popup button. No harm in that approach.
-
RE: Content Aggregation Site: How much content per aggregated piece is too much?
Ryan's resources look good regarding copyright & fair usage. If you're citing a paragraph or two and linking to the source, you're generally in the clear.
In terms of SEO, you'll want to be adding as much unique content to the page as you're citing if you plan on indexing the content. Here are examples of sites that do this curation approach well:
If you're just going to pull the content in directly from an RSS feed, or if you're just adding a sentence plus the quoted text and a link, then you're probably not adding enough value for the content to be worth indexing. I'd set the meta robots tag to "noindex, follow" in this case.
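That tag goes in the <head> of the aggregated pages:

```html
<meta name="robots" content="noindex, follow">
```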
-
RE: Duplicated content detected with MOZ crawl with canonical applied
If it's a period of 2 weeks and you're going to do it anyways, I would just make the new content and not go to the expense of setting up redirects and then taking them down, which can cause issues when you plan on recreating a URL.
-
RE: Duplicated content detected with MOZ crawl with canonical applied
I personally would not generate new language sections unless the content has been translated and localized on those pages. Right now your Spanish homepage has English content in the body, so I would view this as incomplete. Ideally you'd translate the entire page for those sections.
When you do that, you'll want to use hreflang, not canonicals, to indicate different versions of the same content.
So, my recommendation is (A) get rid of the Spanish content sections which would solve the duplication problem, or (B) finish translating the content and then install hreflang code, which would also solve the duplication problem.
Unfortunately I don't know of a good hreflang tool for Joomla specifically.
Let me know if that makes sense?
-
RE: Duplicated content detected with MOZ crawl with canonical applied
Also, if you decide to keep the /es/ section of the website then you'll need to look into hreflang instead of canonical tags, because /es/ and /en/ will not be duplicate content once they're translated.
Read this Q&A from Google for details - https://sites.google.com/site/webmasterhelpforum/en/faq-internationalisation#q20
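Each language version of a page would then reference both versions in its <head>, something like this (using your Salamanca page as the example; the exact English URL is a guess on my part):

```html
<link rel="alternate" hreflang="en" href="http://www.spain-internship.com/internships-in-salamanca" />
<link rel="alternate" hreflang="es" href="http://www.spain-internship.com/es/internships-in-salamanca" />
```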
-
RE: Duplicated content detected with MOZ crawl with canonical applied
Hey Jose,
If you have an /es/ subfolder then ideally you would be translating that content to Spanish, not canonicalizing that content back to the English version.
I can see from http://www.spain-internship.com/es/internships-in-salamanca that not all /es/ pages are translated - is this true across the entire website?
If you don't have any Spanish content, then you should just kill off the /es/ version entirely.
-
RE: Yoast seo title question
Hey Noah,
Ray's comment above is correct but might be unclear. Yoast handles this the way you want - you've got a theme issue. This code is wrong:
<?php if(!wp_title("", false)) { echo bloginfo('title'); } ?>
This tells Wordpress to output "page title - blog name" as the H1. Yoast isn't interfering with that. You'll want your H1 to say this:
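I can't see your full theme, but a typical fix is to output just the page title in the H1 (note that bloginfo() echoes on its own, so the echo in your theme's snippet is redundant anyway). A sketch, assuming the H1 is output inside The Loop:

```php
<?php /* inside The Loop - outputs just the current page/post title */ ?>
<h1><?php the_title(); ?></h1>
```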
-
RE: Tag Manager V2
So that statement appears to be referring to a GTM event tracking tag. The filter means that when someone is on a specific page and clicks on a specific link, it will send an event to GA. I believe "Click ID" is referring to the HTML id attribute of the element being clicked.
Send me the exact URL you're seeing that on and I can clarify further.
-
RE: Are there any tools to give a value STRICTLY for Quantity of Content on your website?
There's no good reason to measure your content purely by quantity, simply because there's no inherent reason to have 10 pages on a site versus 10 million. The "right" amount of content for any particular website will be entirely driven by your business model, your marketplace, and the actions of your audience.
If you're looking for a way to measure the value of each page of content you're producing against your actual revenue and business goals, then to do what you're asking, you need to have a website set up to perfectly track multi-touch attribution across the entire customer lifecycle, and that's a hard task for most business models.
For us mortals, GA is the simplest way to get closer to this. By assigning e-commerce revenue or goal values, GA will assign values to pages, allowing you to see the total sum value of that page. The content groups features of GA will allow you to analyze content in bucketed groups as well, rather than page by page. Using GA is limited by the goals you are tracking - if you don't attach a dollar value to a newsletter signup, then GA will never assign more value to a page that generates tons of newsletter signups.
Marketing automation/analytics tools like Hubspot, Mixpanel, and Kissmetrics have similar features allowing you to track the value of a piece of content.
So, setting up the right analytics environment is the best way to justify your efforts on a per page basis.
-
RE: Tag Manager V2
Here's a few of the better resources I've seen:
- http://www.lunametrics.com/blog/2014/10/15/google-tag-manager-refresh/
- http://www.optimizesmart.com/beginners-guide-google-tag-manager-v2/
- http://www.simoahava.com/analytics/auto-event-tracking-gtm-2-0/
- http://www.simoahava.com/gtm-tips/migrate-containers-to-new-ui/
You can also scan the changelog for GTM - the October 15th entry has "Learn More" links related to 2.0: https://support.google.com/tagmanager/answer/4620708?hl=en
-
RE: Correct Hreflang & Canonical Implementation for Multilingual Site
As a 2014 follow-up for anyone reading this thread: Google later released an "x-default" hreflang value that should make the self-referencing canonical question moot.
Read more at http://googlewebmastercentral.blogspot.com/2013/04/x-default-hreflang-for-international-pages.html
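The x-default value just marks the fallback page for users whose language doesn't match any listed version, e.g. (URLs here are placeholders):

```html
<link rel="alternate" hreflang="x-default" href="http://example.com/" />
<link rel="alternate" hreflang="en" href="http://example.com/en/" />
<link rel="alternate" hreflang="es" href="http://example.com/es/" />
```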
-
RE: I want to make changes, in my site's visual appearence
The visual design itself is not a major concern as far as ranking, however the underlying code changes can present an issue (or an opportunity) depending on the code quality of both the current and new designs.
This guide from SEER covers the essentials you need to be aware of during a redesign: http://www.seerinteractive.com/blog/seo-website-redesign-checklist
If you avoid dramatically changing the body content of the page, the URLs, the title tags, and the internal linking structure, those are the most common changes that would produce issues.
-
RE: When you can't see the cache in search, is it about to be deindexed?
Hey BCutrer,
Just wanted to make sure you'd seen a good solution to this and everything was deindexed properly?
I haven't heard anyone mention the lack of a cached version as a sign of deindexation about to occur, but would be curious if you still think that was the case. I would sooner guess that noarchive was placed on those pages.
-
RE: Is there any problem with my information structure?
Linking with #part1 won't cause issues for indexation or for how the site is structured. Fragments like that are commonly used for internal analytics and other purposes that don't refer back to id="" or name="" attributes on the page.
Regarding the rest of it, I'm afraid I'd have to see example code or the actual page to fully understand your question. If it's still an issue, feel free to leave additional code examples here and I can take a look.
-
RE: How can I get a list of every url of a site in Google's index?
If this is still an issue you're facing, have you checked the sitemap settings to see which page types are getting included? For example, a site with a few thousand tags that are not entered in the sitemap but not yet set to noindex could easily produce extra pages like this.
The next step is parameterization. Anything going on there with search URLs or product URLs? eg ?refid=1235134&q=search+term or ?prod=152134&variant=blue
If you really want to scrape through Google, take the URLs from your sitemap and scrape queries like "inurl:domain.com/a", "inurl:domain.com/b", "inurl:domain.com/c", etc. This should allow you to dive deeper into the sitemap to see what Google really has indexed. For subfolders with tons of URLs, like domain.com/product/a, you'll want to do the same thing at the subfolder level instead of the root.
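A quick sketch of generating that query list before feeding it to a scraper (domain.com stands in for the real site):

```javascript
// Generate one inurl: query per leading character, so each query
// returns a smaller, more complete slice of the index.
function inurlQueries(domain, path) {
  var chars = 'abcdefghijklmnopqrstuvwxyz0123456789'.split('');
  return chars.map(function (c) {
    return 'inurl:' + domain + path + c;
  });
}

var rootQueries = inurlQueries('domain.com', '/');
// first entry: inurl:domain.com/a

// For a big subfolder, run it again one level down:
var productQueries = inurlQueries('domain.com', '/product/');
```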
-
RE: How can I get a list of every url of a site in Google's index?
You can do that with a tool like Scrapebox or Outwit. Go slow; if you want to pull results quickly you'll need proxies to keep Google responding. As another commenter mentioned, it's probably against their TOS.
-
RE: URL mapping for site migration
Just to confirm mosquitohawk's comments, there's not a great way to do this other than sorting through the spreadsheet.
Hopefully URLs have distinct enough subfolders that you can break them out into sections easily.
-
RE: Procedure of ecommerce tracking installation and set-up using GTM
Here's a few resources you'll want to read:
- https://support.google.com/tagmanager/answer/4363363?hl=en&ref_topic=3002579
- https://support.google.com/tagmanager/answer/3002596?hl=en
If you scroll to the bottom of each page you should be able to find the directions translated for other languages.
Depending on what type of e-commerce platform you're on, you might be able to get the software to do some of this for you, like firing the on-page code.
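The on-page piece is usually a dataLayer push on the order confirmation page, roughly along these lines (all values below are placeholders for illustration; check the GTM docs above for the full field list):

```javascript
// Standard GTM transaction push for an order confirmation page.
// transactionId and transactionTotal are the required fields.
var dataLayer = dataLayer || [];
dataLayer.push({
  transactionId: '1234',
  transactionAffiliation: 'Web Store',
  transactionTotal: 38.26,
  transactionTax: 1.29,
  transactionShipping: 5.00,
  transactionProducts: [{
    sku: 'DD44',
    name: 'T-Shirt',
    price: 11.99,
    quantity: 1
  }]
});
```

Your GA transaction tag in GTM then fires off this data when the confirmation page loads.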
-
RE: I want to upgrade to Universal Analytics but already using GTM and I have few queries...
This means your GA installation is still on the old platform, and you will need to manually upgrade GA to Universal Analytics before you can begin the rest of the process.
The button you see on the page should only require a few clicks to start the process.
Once you upgrade, you can take as long as you want to upgrade the actual code on your website. The existing code should keep recording properly until you're ready to change it.
-
RE: I want to upgrade to Universal Analytics but already using GTM and I have few queries...
1) After logging in to Google Analytics, from the admin section I have to transfer the property to Universal Analytics, right?
Click over to "Tracking Code" and see if it says analytics.js or ga.js. If it says ga.js, you'll need to do this manually and you haven't been auto-upgraded.
2) I have to wait either 24 or 48 hours before retagging or making any changes in UA. So how can I know my property transferred successfully?
If I recall correctly, you can fire Classic GA code into an upgraded account, but you can't fire Universal code into an account that hasn't been upgraded. So - if you have to do the manual process - you want to let the process finish and roll out the new Universal code after it's done.
3) After the property is transferred to Universal Analytics, I have to configure the session timeout and campaign timeout periods via the Google Analytics Admin page. (By default, sessions end after 30 minutes and campaigns end after 6 months.) Is it okay if I don't change these settings?
You don't have to edit these unless you want a different default setting. Read more about these settings at https://support.google.com/analytics/answer/2795871?hl=en.
4) As of now in my analytics I have configured Google Adwords, Google Webmaster Tools, and Google Merchant Center. In analytics I have also set custom alerts, goals, funnels, enhanced link attribution, eCommerce, etc., but I have already added the Google Analytics code in Google Tag Manager, so do I have to make manual changes for all such things? Or will everything, i.e. goals, funnels, alerts, etc., be transferred automatically at the time of transferring the property?
Your connections to Adwords, Webmaster tools, and Merchant Center should be fine.
Your goals should be fine, so should your custom alerts.
For E-Commerce, you will need to check the "Enable Enhanced Ecommerce Features" checkbox under "More Settings" in your Tag Manager tag for GA. Enhanced Link Attribution no longer shows up on any of my GTM accounts, so I think they either removed it or made it the default. I can't say for sure, but a few posts in webmaster forums report the same uncertainty.
You may still need to do some custom programming on your transaction confirmation pages to track e-commerce transactions properly. They won't record automatically unless you're using a shopping cart/platform that fires them for you.
If you have any event tracking code on your site (eg file downloads, or outbound clicks), you can upgrade it to the Universal event tracking syntax, or you can kill them off, but you have to rebuild the event tracking in GTM if you do that. Either way is fine, but pick one or the other.
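For reference, the syntax change for a file-download event looks like this (stubbing both command queues so the two shapes are visible side by side; the category/action/label values are hypothetical):

```javascript
// Classic GA queues commands on the _gaq array:
var _gaq = _gaq || [];
_gaq.push(['_trackEvent', 'PDF', 'Download', 'pricing.pdf']);

// Universal Analytics uses the ga() command queue instead:
function ga() { (ga.q = ga.q || []).push(arguments); } // stub of the real snippet
ga('send', 'event', 'PDF', 'Download', 'pricing.pdf');
```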
5) In Google tag manager i have already configured following tags:
- a) Name - Google Adwords Conversion Tracking b) Type - Adwords Conversion Tracking c) Rules - order successful page
- a) Name - Google Analytics Page View Tracking b) Type - Classic Google Analytics c) Rules - All pages
- a) Name - Google Analytics Conversion Tracking b) Type - Classic Google Analytics c) Rules - order successful page
- a) Name - Twitter Conversion Tracking b) Type - Custom HTML tag c) Rules - order successful page
So at the time of transferring the property, will all of this be transferred automatically?
#1 and #4 should work fine if you have them set up correctly in GTM.
For both GA tags, you'll need to switch them to Universal manually, instead of Classic GA.
As mentioned above, for the GA conversion tracking tag, you might have to take care of some coding on the website to push sales data into the data layer for GTM, which can then be used to fire off the GA transaction tracking.
6) Also on the thank-you page of my website I have scripts for goal setting and Affiliator, so there is no relation between this and the migration, right?
Unfortunately I can't answer this one - leaving it for someone else.
===================
Read through these three guides a few times to make sure you have a full grasp of the conversion process - then get started:
-
RE: Interlinking from unique content page to limited content page
Both of those are good solutions - so choose one or the other. If you choose the keyword variation route, then make sure you go through and edit the content on each page to properly reflect the new focus.
-
RE: Interlinking from unique content page to limited content page
The issue is two pages on the same site trying to rank for the same keyword. They're going to be fighting against each other and confusing search engines.
It's better to either combine the pages (option 1), or to give them a separate target keyword (option 2). Option #2 is probably easier, but you'll still need to make the user-friendly page more search friendly, and vice versa. Option 1 is probably better if you can add search-friendly text content beneath the map portion of the page, and ditch page 2.
-
RE: Social media starter strategy
I'd go with the umbrella brand pages. Your reservations all make perfect sense, and managing content for 10 pages sounds miserable, plus they'll all take a ton of traction to get started vs a single page.
Most users are tolerant of a brand that has products they may not care about. Unless they're seriously different (like a plastics company selling kid's toys, garden equipment, and medical devices) it makes more sense to keep them together.
-
RE: Which internal page approach is better? Couponsite/Kohls OR Couponsite/Houston/Kohls
I doubt users will be searching by "houston kohls coupon" for coupons. Most Kohl's have the same coupons, so it makes sense to just search "kohls coupon".
With that in mind, I'd probably stick with the top level page and not do cities.
-
RE: Keeping Google Analytics Data when Moving to Subdomain
I would change the current Analytics profile to represent the subdomain. Then create a new profile for the top-level domain, assuming that it will be all new pages, URLs, and content.
So, hopefully this visual makes a little sense:
Current Profile ----------> New Profile
www.mydomain.com -----> sub.mydomain.com
N/A --------------------------> www.mydomain.com
-
RE: Product Descriptions & SEO
Good comments from the others here.
Focus on the unique description as the primary one shown to users (further up the page) unless the manufacturer's is better. You can also label the latter "Manufacturer's Description:" to differentiate it from the content you wrote. Some sites even hide it behind a tab that is crawlable but has to be clicked by the user, which is fine.
As Takeshi mentioned, the small amount of duplicated/templated text shouldn't be a problem as long as it's not 80% of the page content.
-
RE: Interlinking from unique content page to limited content page
I would make some slight variations from the two pages. For example, make page 1 "Seattle Homes For Sale" and page 2 "Seattle Home Listings". This avoids the issue of having two pages going after that same keyword and allows you to get more granular for the terms you want to rank for.
If both pages are almost identical content, then I would consider canonicals as a solution, but it doesn't sound to me like that's the case here.
-
RE: Link building with AddThis URL
Mike, this comment you made is correct:
"my understanding is that Google disregards everything after the "#" so there shouldn't be a duplicate content issue."
If you do somehow see one of these getting indexed in Google then you have an issue, but I have not seen this happen.
-
RE: Link building with AddThis URL
Quick correction here. ? indicates a URL parameter, # indicates a subsection of the same document.
-
RE: Re-classifying a Traffic Source in Google Analytics
I second Federico's suggestion about making sure installation is perfect sitewide. Martijn's comment is also correct - you can't edit anything in the past.
I have a couple more thoughts:
- If you control the links that are placed at indeed.ca, just add your own custom UTM tracking codes so that everything is categorized correctly.
- It is possible to redirect users coming in from a specific referral source (eg indeed.ca) before they load the page and hit the analytics code. So, you would redirect the user to the same URL that they requested, but you would add something like ?utm_source=Indeed.ca&utm_medium=referral&utm_campaign=Job%20Listing at the end of the URL. I honestly would not bother going to this trouble unless it's seriously screwing up your reporting.
- If you're currently looking at medium, you could try switching to channel reporting instead to see if it's still reported incorrectly.
- You could do a custom Channel Grouping for it. To be honest I don't do these much, but it was suggested within our office as a potential solution. You can learn more about them at http://www.seerinteractive.com/blog/new-google-analytics-channel-groupings and https://support.google.com/analytics/answer/1250116?hl=en.
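If you did want to go the redirect route, an Apache sketch would look something like this (untested, and the campaign value is simplified to avoid literal-percent escaping in .htaccess):

```apacheconf
RewriteEngine On
# Only touch visits referred by indeed.ca that don't already carry UTM tags
RewriteCond %{HTTP_REFERER} indeed\.ca [NC]
RewriteCond %{QUERY_STRING} !utm_source= [NC]
# Redirect to the same path with tracking parameters appended (QSA keeps any existing query string)
RewriteRule ^(.*)$ /$1?utm_source=indeed.ca&utm_medium=referral&utm_campaign=job_listing [R=302,L,QSA]
```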
-
RE: Why Custom Post Types Don't Get Ranked Well
There is nothing about custom post types that is inherently different from posts. The biggest issue is making sure you properly set up the site structure so that they are interlinked well with the rest of the site. Depending on your SEO plugin, make sure the right URLs are in your sitemap as well.
Other than that, you could be having issues with newer content taking longer to rank, but it's not an issue of custom posts vs. posts.
-
RE: Outbrain Select SEO Implications
Generally agreed with EGOL here. Not sure why you'd want this on your website, but if it's a single page that is noindexed and uncrawlable, I don't see a problem with it either. Just make sure it's "noindex, follow" so you're not blocking the page from passing any link equity it accumulates.
-
RE: In need of guidance on keyword targeting
I agree with Jared's comments - the current keyword layout of the pages is an effective one. There's probably tweaks that can be made but overall I don't see major issues with having concrete repair as a subpage.
This homepage is ranking on page 1 for "concrete repair west yorkshire". That may or may not be the ideal keyword, but I would focus on link building - run your way through this entire list of opportunities and see how that affects performance.
-
RE: Professional Content Copywriter
Hi Bossandy.
I don't want to list off names here publicly but happy to recommend a couple of the copywriters that we frequently hire for client work if you'd like to PM me.
-
RE: What if your content is getting social shares but no links?
I agree with Aaron here. Both elements are important. If there were quite a few more social shares then it would be easier to ignore the lack of links.
Do a test for a couple months - rather than posting weekly, post every other week and make the content more substantial - longer form text, lots of good formatting like h2/h3s, bullets, photos, etc. Target a higher value term. Also consider running a small Facebook ad ($15-25) or Twitter/LinkedIn featuring that post, and targeting it towards the exact type of demographics you want to see it.
I think you'll get a lot better return from this type of approach.
-
RE: Codeigniter - Controller and duplicate pages
I'm not familiar with CodeIgniter, but this isn't terribly different from how Wordpress and other PHP-based CMSs manage permalinks.
Currently you're simply forwarding shortened URLs (/contact/) to the actual URL (/site/contact/), which isn't ideal. It would be preferable to remove the base path (/site/) from the URL completely.
This guide (http://www.web-and-development.com/codeigniter-remove-index-php-minimize-url/) has a good rundown on how to control this in .htaccess. I believe that the sections titled "Removing 1st URL segment" and "Routing automatically" are going to be the ones that are applicable to your case, because you're trying to change the controller.
You should also take a look through the original documentation at http://ellislab.com/codeigniter/user-guide/general/urls.html.
You're also going to want to make sure that any canonicals being used on the page match the intended URL, and that any incorrect URLs (eg /site/contact/) are 301 redirecting to the proper URL (eg /contact/). Use wheregoes.com and type in the /site/contact/ version of the URL to test this - it should spit out "301 redirect", not "302" or anything else.
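For reference, the commonly documented CodeIgniter pattern looks like this (adjust paths for your install; the 'contact' route is just your example URL):

```apacheconf
# .htaccess - send everything that isn't a real file/folder through index.php
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php/$1 [L]
```

```php
// application/config/routes.php - map the short URL to the controller
$route['contact'] = 'site/contact';
```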
If you're not really comfortable messing around with htaccess, I'd highly recommend trying this out on a development server and making sure it all works correctly before you move it to a live server.
-
RE: Outranked for own content
So, they're outranking you for snippets of text. That actually is not a big deal. The bigger issues are these:
- Are they outranking you for any of your branded terms?
- Are they outranking you for any generic head terms that you used to rank highly for? (eg. "waterfront seattle condo rental"). This is far more important than whether they're outranking you for arbitrary text snippets that you wrote.
You're correct that trying to rewrite each description is a good tactic, but make sure that it's good copywriting, and not simply changing the words around for the sake of SEO. Users on these networks aren't finding your listings entirely through SEO, they're largely finding you through internal searches on tripadvisor.com, etc. Make sure you're ranking well in those searches and that you have copy that will convert prospective renters.
-
RE: What is the recommended way to save Image Files in WP?
Agreed with everything said here. The default Media uploader for Wordpress is the best to use unless there's something special about what you're trying to do with the images.
As Bradley said, optimizing images is important. http://wordpress.org/plugins/wp-smushit/ is a popular plugin that will let you do this as you upload images.