~~~ _yy
I'm not sure if that is a joke or serious. (a) Why would you want to block these people? This seems to defeat the purpose of the service in the first place. (b) What if you don't have a credit card?

~~~ tudorw
They want to block the bot-generated comments, not the webpages. These are the people who complain when they can't buy something from Google with a search page link of only a search engine site, because they can't follow a link to an unrelated website; but if you make them a search-engine-friendly URL they will buy? No, because they don't want to be tracked by bots. If you have the money and time to complain, why not set up your own tracker? They are, after all, just inhuman algorithms kept up for the sake of an illusion of human judgement, but they are algorithms in the end.

~~~ khedoros1
> but if you make them a search engine friendly URL they will buy

A "search engine friendly URL" is not a tracking link. They are in two different spaces.

> if you have money and time to complain, why not set up your own tracker?

That would make sense if they were trying to track you for selling a product on your own site. I'm not saying you're wrong, but it's just not what these links are doing.

------ slig
This is the Google webmaster dashboard. I don't think Google has ever blocked a link in their webmaster tools in the whole 8+ years I've been using it.

~~~ _yy
This seems more like a test that Google is doing on the side.

------ seanwilson
Can the people running Google Webmaster Tools (WMT) confirm? I haven't seen any announcement for this. I doubt any other search engines use a different algorithm to decide which links to allow or disallow.

~~~ _yy
They do not use their webmaster tools to determine what goes into the search index. And if they do, they only allow GoogleBot (Google's crawler), and not Yahoo or Bingbot (Bing's crawler), to follow links. Bing and Yahoo also use Google's algorithm when determining which links are approved. Google's crawler has a few more lines of code than the other search engines, but otherwise they are completely identical in their crawl and processing of links. They all follow the links in Google. So if they block any other search engine from accessing the links, they can only do that from their webmaster tools, not from the search.

------ jwilk
This is no longer a dupe. This article was submitted half an hour earlier: [https://news.ycombinator.com/item?id=15473310](https://news.ycombinator.com/item?id=15473310)

------ xg15
It's not a very good idea to make links that would appear legitimate to the search engine seem like spam and be automatically rejected. It's probably a good way to make other search engines suspicious of you as well and thus cause you real-world problems (as if the problems you cause now were not real enough). They should have just whitelisted the users of that service and handled all issues with messages that come after the user "proved" his email address.

~~~ _yy
They did do that, and were rewarded with this.

------ z3t4
My domain was banned from Google for 3 weeks due to spam.

~~~ _yy
I know the feeling, and I used a Gmail account for a long time, too. When I started my company and needed Google to send emails for my domain, it was a bit of a pain, but it got better. Sometimes I get emails with a "bad" Gmail address in the 'from' field (we are careful, but it still happens). I agree, some people who work on SEO have the resources to do things that might be considered spam, but it shouldn't result in bans.

------ teilo
There are several cases I know of where GWT is inaccurate and results in a false ban. The main issues are:

- They count "comment spam", which means any user-generated content like a Facebook post, blog comment, etc., as a direct violation.

- They are also aggressive about enforcing robots.txt and blocking the crawling of non-Google IPs when there is a mistake in a robots.txt file. They will often flag you as a spammer in this case even when there is no violation.

- Google ignores the sitemap.xml. They will ignore any site that is not submitted to Google's Search Console, even if the site is a valid site and robots.txt includes all the proper tags. For me, this is more annoying than the robots.txt issue, because I have no control over what Google sends to search engines, and it is a real pain to have a major index blocked because of somebody else's mistake.
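(An aside on teilo's robots.txt point: a mistake in a robots.txt file, such as a stray blanket Disallow, can be caught before any crawler sees it. The snippet below is a minimal sketch using Python's standard-library urllib.robotparser; the sample rules, paths, and user-agent strings are illustrative assumptions, not anything taken from this thread or from Google's own tooling.)

```python
# Minimal sketch: check how a few crawlers would interpret a robots.txt
# before deploying it. The rules, paths, and user agents below are
# illustrative examples only.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/

User-agent: Googlebot
Disallow:
"""

def can_fetch(robots_txt: str, user_agent: str, path: str) -> bool:
    """Return True if the given user agent may fetch the path under these rules."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, path)

for agent in ("Googlebot", "bingbot", "*"):
    for path in ("/", "/admin/settings"):
        print(f"{agent:10} {path:17} allowed={can_fetch(ROBOTS_TXT, agent, path)}")
```

Running a check like this for every user agent you care about is a cheap way to notice when an edit to robots.txt accidentally blocks far more than intended, which is the kind of mistake teilo describes as getting a site flagged.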
------ r2dnb
The problem, as usual, is that a lot of automated systems are unable to distinguish what is good and what is spam.

------ petercooper
The spam ban is a really bad idea. I had a (minor) website with an internal link to it for like 8 years, until I lost my personal email address and it accidentally got deleted from the "people" page. That got a Google ban and was difficult to lift, as I couldn't just get a new email. I eventually managed to contact support, however, and it took them like 4 emails to realize it wasn't a malicious link.

------ nicholassmith
The real problem is there's nothing you can do to fix it if you find out you've been banned from Google's index. They effectively made the rules up when they went public, and you can't make a deal with the devil once they've made it public.

------ yoz-y
I received a mail from Google on April 16th telling me that they had unpublished my blog. This means that there are potentially thousands of users who might have visited my page but who can no longer see it. Since the website is a few years old, I don't expect this will be a major issue, but it's still something I will need to handle properly.

------ gcb0
This isn't a first. The webmaster tools show me this all the time for all sorts of sites. No matter whether my pages link to another site or have a bad word, Google bans any site from the index if it finds that it has that bad word. One of the major issues with Google Webmaster Tools is that it's a pain to debug or contact Google about any ban you get in a day. You need to go to the webmaster tools and see that it's an error caused by bad words in a text that you don't have, which might have been on a completely different site whose backend you have no access to, and there is no way to email them for troubleshooting.

------ wodenokoto
This has happened to me before with other companies, and it is an absolute pain in the ass to fix. I used to work for a company that had ads on the blog that I write for a client. The ads included the name of the client. It was against the client's Terms of Service, and we had a bad name with Google. It was really frustrating to not have ad money, and to only notice it when the website wasn't working at all, or a small part of the web pages wasn't. They got a lot of questions and a lot of negative feedback because the "solicitation" was so obviously against the TOS. They really didn't know what to do about it - it's not like they could contact a client to get permission.

------ RachelF
This has been happening to me for the past year or so - I've had to create a new Google Account for each website as Google keeps banning them. I don't do black-hat SEO stuff, just a few keywords and relevant links. Most of the time I've had no idea I'm on Google's ban list, and when I do, it seems to be temporary, and I always hear rumours that Google are going to change their ways.

------ peterjancelis
This happened to me over a weekend. In fact, I've received multiple emails in the last few days stating that I had made too many bad links recently for my domain. It's hard to tell if I'm being spammed or if they really only found one link that was low quality.

------ michaelmcmillan
I have received 2 more than half