Apple has recently disabled all third-party cookies in Safari, whether on iPad, iPhone, or desktop
-
Hi All,
As you all know, Apple has recently disabled all third-party cookies in Safari, whether on iPad, iPhone, or desktop. Because of this, UK e-commerce transactions executed by credit card are failing, and the whole e-commerce industry is facing a big loss. Safari doesn't even show a message like "please enable cookies to complete a credit card transaction", so visitors and buyers have no way to know why their credit card payment is failing on every site. So please suggest a solution: what can an e-commerce site do to complete transactions successfully and make visitors aware that they must enable cookies in their browser before starting a credit card payment? Also, please give me 5 to 10 examples of e-commerce websites that complete transactions successfully using an OTP. Without an OTP, only amazon.com and aliexpress.com are processing transactions, and no other UK sites; in short, those sites don't even ask for an OTP during a credit card transaction, so skip such sites.
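For background, the usual culprit here is the 3-D Secure step (Verified by Visa / SecureCode), which payment gateways often load in an iframe: with third-party cookies blocked, the bank's page can't keep its session. One hedged workaround is a full-page redirect with a return URL on the merchant's own domain, so the session cookie stays first-party. A rough Python sketch of the idea; the URLs and field values are invented, and real 3-D Secure sends PaReq/TermUrl via form POST rather than a GET query string:

```python
from urllib.parse import urlencode, urlparse

def build_acs_redirect(acs_url: str, pareq: str, order_id: str) -> str:
    """Build a full-page redirect to the bank's 3-D Secure page.

    Instead of loading the ACS in an iframe (where its cookies are
    third-party and blocked by Safari), we send the whole page there and
    have it return the shopper to a URL on our own domain, where the
    checkout session cookie is first-party.
    """
    term_url = "https://shop.example.co.uk/3ds/return"  # first-party return URL
    query = urlencode({"PaReq": pareq, "TermUrl": term_url, "MD": order_id})
    return "{}?{}".format(acs_url, query)

url = build_acs_redirect("https://acs.bank.example/3ds", "eJxVUtt...", "order-42")
print(urlparse(url).netloc)  # acs.bank.example
```

The key point is not the helper itself but the flow: no iframe, so no third-party cookie is needed for the bank's session.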
Regards,
Mitesh
-
Hi
I still only have version 8 so I can't test it, but my GA is showing data for version 9 with sales.
I have also checked that these aren't Paypal orders.
Once I get version 9 I will test, but our visitors are currently able to checkout.
Thanks
Andy
-
Hi Andy,
I am talking about e-commerce transactions carried out only with a credit card, and only in the Safari browser, i.e. on an iPad, an iPhone, or a desktop device. So now tell me: what is the result for your site? Have you tried making a live transaction yourself to test?
Regards,
Mitesh
-
Hi
Sorry, can you explain a bit more? I'm from a UK-based e-commerce site, and I am still seeing data in GA from Safari for yesterday and previous days.
Customers are able to checkout as well.
Thanks
Andy
Related Questions
-
Browser Size = Not Set Filter in Google Analytics
I have been trying to filter this traffic out of my Google Analytics data, since it all seems to be related to spam traffic. I have had multiple instances where, using this filter: Custom Filter - Exclude - Browser Size - `^\(not set\)$` (the backslashes weren't displaying in the message preview, so I originally wrote "backslash" to indicate their placement), traffic seems to filter out appropriately, but then the filter stops working. Looking at a new site with Browser Size = (not set) traffic, the filter preview doesn't appear to work either. Am I implementing the filter incorrectly? How do I filter this traffic out of GA data successfully? If I use the exact same method with RegEx in Google Data Studio, the filter works perfectly.
Technical SEO | fuelmedical
-
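A quick way to sanity-check the escaping in the Browser Size filter described above, sketched in Python (GA uses RE2, but the escaping rule for literal parentheses is the same; the sample dimension values are made up):

```python
import re

# Hypothetical sample of GA "Browser Size" dimension values.
sizes = ["1920x1080", "(not set)", "360x640", "(not set)"]

# The parentheses must be escaped; otherwise they form a regex group
# and the pattern never matches the literal string "(not set)".
pattern = re.compile(r"^\(not set\)$")

kept = [s for s in sizes if not pattern.match(s)]
print(kept)  # ['1920x1080', '360x640']
```

If the filter works in Data Studio but not in GA, the pattern itself is fine and the problem is likely where the filter is applied, not the regex.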
I want to set a different price type for India and the USA through schema. For example, the website is abc.com and there is no cookie/country-level redirection.
I am currently working on a website (ed tech) that does business in India as well as the USA. The courses are the same, and the content being served is also the same. There is no cookie-level redirection. The only difference is in the price range and price type. In the schema we have set the price type as $. We want to set a different price type for India and the USA through schema. How can we do this? For example, the website given below ranks for India and the USA with the same domain name, but the price range can be set either in INR or in USD.
Technical SEO | DJ_James
-
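One hedged option for the pricing question above (assuming the pages stay on one domain): schema.org allows more than one `Offer` per item, each with its own `priceCurrency` (an ISO 4217 code), so INR and USD can sit side by side in the same markup. A minimal Python sketch that builds such JSON-LD; the course name and prices are invented placeholders:

```python
import json

# Hypothetical course with one Offer per market.
course_schema = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Course",
    "offers": [
        {"@type": "Offer", "priceCurrency": "INR", "price": "14999"},
        {"@type": "Offer", "priceCurrency": "USD", "price": "199"},
    ],
}

markup = json.dumps(course_schema, indent=2)
currencies = {o["priceCurrency"] for o in course_schema["offers"]}
print(sorted(currencies))  # ['INR', 'USD']
```

Which offer a given search engine chooses to display is up to the engine, so this is a way to declare both prices, not to force one per country.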
Googlebot is seeing the desktop version in the cache for the mobile website too.
Hi, I have an e-commerce website that is dynamic (not responsive) in nature. When I check the cached version of the mobile website, it shows the desktop version in the cache. Will it create any problem? How can I tell Googlebot to see my mobile cached version instead of the desktop one?
Technical SEO | dhananjay.kumar1
-
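For a dynamically served (non-responsive) site like the one above, the usual advice is to send a `Vary: User-Agent` header so crawlers and caches know the HTML differs by device. A minimal Python sketch of that response logic; the template names and the user-agent check are simplified assumptions, not a drop-in implementation:

```python
def build_response(user_agent: str) -> dict:
    """Pick a template by device and flag the variation for caches/crawlers."""
    is_mobile = "Mobile" in user_agent or "Android" in user_agent
    return {
        "template": "mobile.html" if is_mobile else "desktop.html",
        "headers": {
            # Tells Googlebot and caches that the markup varies by user
            # agent, so the mobile crawler fetches the mobile version.
            "Vary": "User-Agent",
            "Content-Type": "text/html; charset=utf-8",
        },
    }

print(build_response("Mozilla/5.0 (iPhone; ...) Mobile Safari")["template"])
# mobile.html
```

Real user-agent detection should use a maintained library rather than substring checks; the point here is only that the `Vary` header accompanies every dynamically served page.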
GWT Fetch & Render displays desktop version of site as mobile
Hi team, I noticed that when I request a desktop rendering in GWT using Fetch and Render, pages render as the mobile version. Screenshot attached. It's related to the vh units in our CSS (as far as I'm aware). Does anyone know what the implications of this may be? Does it mean Googlebot can only see the mobile version of our website? Any help is appreciated. Jake
Technical SEO | Jacobsheehan
-
Need third-party input: our web host blocked all bots, including Google and me, because they believe SEO is slowing down their server.
I would like some third-party input... partly for my sanity and also for my client. I have a client who runs a large online bookstore. The bookstore runs on Magento, and the developers are also apparently the web host. (They actually run the servers; I do not know if they are sitting under someone's desk or are actually in a data center.) Their server has been slowed down by local and foreign bots. They are under the impression that my SEO services are sending spam bots to crawl and slow down their site. To fix the problem they disallowed all bots. Everything: Google, Yahoo, Bing. They also banned my access to the site. My client's organic traffic instantly took a HUGE hit (almost 50% of their traffic is organic, over 50% is organic + AdWords, and most of it comes from Google). Their keyword rankings are taking a quick dive as well. Could someone please verify the following as true, to help me illustrate to my client that this is completely unacceptable behavior on the part of the host? I believe:
1.) You should never disallow ALL robots from your site as a solution for spam. As a matter of fact, most of the bad bots ignore robots.txt anyway; it is a way to limit where Google crawls (which is obviously a legitimate technique).
2.) On-site SEO work, as well as link building, etc., is not responsible for foreign bots and scrapers putting a heavy load on the server.
3.) Their behavior will ultimately lead to a massive loss of rankings (already happening) and a huge loss of traffic (already happening), and since almost half the traffic is organic, the client can expect to lose a large sum of revenue from purchases made by organic traffic once it disappears.
Please give your input and thoughts. I really appreciate it!
Technical SEO | JoshuaLindley
-
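On point 1 above, a more targeted robots.txt blocks only the abusive crawlers while leaving search engines alone. A small stdlib-only Python sketch that checks such a file before it goes live; the bad-bot name is a made-up example:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical replacement robots.txt: block one abusive crawler,
# leave everything else (including Googlebot) free to crawl.
robots_txt = """\
User-agent: BadScraperBot
Disallow: /

User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/books/"))       # True
print(parser.can_fetch("BadScraperBot", "https://example.com/books/"))   # False
```

Since bad bots often ignore robots.txt anyway, the real fix for server load is usually rate limiting or firewall rules by user agent or IP, with robots.txt reserved for crawlers that actually obey it.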
Updating old blog posts in WordPress to appear more recent?
I'm doing work for a law firm that has a lot of blog post content from 2010-2011 ranking for long-term keywords. These pages are displaying date snippets in SERPs, but because legal information can change year to year, I don't want the content to appear as though it's 2-3 years old. The date of the post is in the URL structure, so I can't change the publication date w/o changing the URL. So my question is twofold: is there a way to show an updated date snippet in search results, or block the date snippet from showing, even if the date is in the URL? Or are there other options - creating pages for each of these posts and 301ing them to the page that has a cleaner URL, etc.? Thanks in advance for your help!
Technical SEO | dchristensen3
-
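One common approach to the first question above (hedged: Google treats dates in markup as hints, not guarantees) is to mark each post up with `Article` schema carrying both `datePublished` and `dateModified`, so the updated date is available for the snippet without touching the URL. A small Python sketch that emits such markup; the headline and dates are placeholders:

```python
import json

# Hypothetical legal blog post, originally from 2011 but recently revised.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example: Statute of Limitations Overview",
    "datePublished": "2011-03-14",
    # The updated date is the one you want surfaced in search results.
    "dateModified": "2013-06-01",
}

tag = '<script type="application/ld+json">{}</script>'.format(json.dumps(article))
print("dateModified" in tag)  # True
```

For this to be credible, the visible date on the page should match the `dateModified` value, and the content should genuinely be updated.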
"Fourth-level" subdomains. Any negative impact compared with regular "third-level" subdomains?
Hey Moz, A new client has a site that uses: subdomains ("third-level" stuff like location.business.com) and "fourth-level" subdomains (location.parent.business.com). Are these fourth-level addresses at risk of being treated differently than the other subdomains? Screaming Frog, for example, doesn't return these fourth-level addresses when crawling business.com, except in the External tab. But maybe I'm just configuring the crawls incorrectly. These addresses rank, but I'm worried that we're losing some link juice along the way. Any thoughts would be appreciated!
Technical SEO | jamesm5i
-
Should I ask third-party sites to remove their links pointing at my site?
Good morning, SEOmoz fans. Let me explain what is going on: a surfing site has included a link to my site in their footer. Apparently, this could be good for my site, but as it has nothing to do with my site, I wonder if I should tell them to remove it. Site A (the surfing site) points at Site B (my marketing site) in its footer, so Site B is receiving backlinks from every single page on Site A. But Site B has nothing to do with Site A: different markets. Should I ask them to remove the link from their footer, since surfing visitors will not find my marketing site interesting? Thanks in advance.
Technical SEO | Tintanus