Stopping scripters from slamming your website

Scripting, E-Commerce, Bots, Detection

Scripting Problem Overview


> I've accepted an answer, but sadly, I believe we're stuck with our original worst-case scenario: CAPTCHA everyone on purchase attempts of the crap. Short explanation: caching / web farms make it impossible to track hits, and any workaround (sending a non-cached web beacon, writing to a unified table, etc.) slows the site down worse than the bots would. There is likely some pricey hardware from Cisco or the like that can help at a high level, but it's hard to justify the cost if CAPTCHA-ing everyone is an alternative. I'll attempt a fuller explanation later, as well as cleaning this up for future searchers (though others are welcome to try, as it's community wiki).

Situation

This is about the bag o' crap sales on woot.com. I'm the president of Woot Workshop, the subsidiary of Woot that does the design, writes the product descriptions, podcasts, blog posts, and moderates the forums. I work with CSS/HTML and am only barely familiar with other technologies. I work closely with the developers and have talked through all of the answers here (and many other ideas we've had).

Usability is a massive part of my job, and making the site exciting and fun is most of the rest of it. That's where the three goals below derive from. CAPTCHA harms usability, and bots take the fun and excitement out of our crap sales.

Bots are slamming our front page tens of times a second, screen-scraping (and/or scanning our RSS) for the Random Crap sale. The moment they see it, a second stage of the program triggers: it logs in, clicks 'I Want One', fills out the form, and buys the crap.

Evaluation

> lc (https://stackoverflow.com/questions/450835/how-would-you-stop-scripters-from-slamming-your-site-hundreds-of-times-a-second#answer-450899): On Stack Overflow and other sites that use this method, they're almost always dealing with authenticated (logged-in) users, because the task being attempted requires that.

On Woot, anonymous (non-logged-in) users can view our home page. In other words, the slamming bots can be unauthenticated (and essentially untrackable except by IP address).

So we're back to scanning for IPs, which a) is fairly useless in this age of cloud networking and spambot zombies and b) catches too many innocents given the number of businesses that come from one IP address (not to mention the issues with non-static IP ISPs and potential performance hits to trying to track this).

Oh, and having people call us would be the worst possible scenario. Can we have them call you?

> BradC (https://stackoverflow.com/questions/450835/how-would-you-stop-scripters-from-slamming-your-site-hundreds-of-times-a-second/450931#450931): Ned Batchelder's methods look pretty cool, but they're pretty firmly designed to defeat bots built for a network of sites. Our problem is that bots are built specifically to defeat our site. Some of these methods could likely work for a short time, until the scripters evolved their bots to ignore the honeypot, screen-scrape for nearby label names instead of form ids, and use a JavaScript-capable browser control.

  > lc again (https://stackoverflow.com/questions/450835/how-would-you-stop-scripters-from-slamming-your-site-hundreds-of-times-a-second/450946#450946): "Unless, of course, the hype is part of your marketing scheme." Yes, it definitely is. The surprise of when the item appears, as well as the excitement if you manage to get one, is probably as important as, or more important than, the crap you actually end up getting. Anything that eliminates first-come/first-served is detrimental to the thrill of 'winning' the crap.

  > novatrust (https://stackoverflow.com/questions/450835/how-would-you-stop-scripters-from-slamming-your-site-hundreds-of-times-a-second/450928#450928): And I, for one, welcome our new bot overlords. We actually do offer RSS feeds to allow 3rd-party apps to scan our site for product info, but not ahead of the main site HTML. If I'm interpreting it right, your solution helps goal 2 (performance issues) by completely sacrificing goal 1, and resigning yourself to the fact that bots will buy most of the crap. I up-voted your response, because the pessimism of your last paragraph feels accurate to me. There seems to be no silver bullet here.

The rest of the responses generally rely on IP tracking, which, again, seems to both be useless (with botnets/zombies/cloud networking) and detrimental (catching many innocents who come from same-IP destinations).

Any other approaches / ideas? My developers keep saying "let's just do CAPTCHA," but I'm hoping there are less intrusive methods for the actual humans wanting some of our crap.

Original question

Say you're selling something cheap that has a very high perceived value, and you have a very limited amount. No one knows exactly when you will sell this item. And over a million people regularly come by to see what you're selling.

You end up with scripters and bots attempting to programmatically [a] figure out when you're selling said item, and [b] make sure they're among the first to buy it. This sucks for two reasons:

  1. Your site is slammed by non-humans, slowing everything down for everyone.
  2. The scripters end up 'winning' the product, causing the regulars to feel cheated.

A seemingly obvious solution is to create some hoops for your users to jump through before placing their order, but there are at least three problems with this:

  • The user experience sucks for humans, as they have to decipher a CAPTCHA, pick out the cat, or solve a math problem.
  • If the perceived benefit is high enough, and the crowd large enough, some group will find their way around any tweak, leading to an arms race. (This is especially true the simpler the tweak is; a hidden 'comments' field, re-arranging the form elements, mis-labeling them, and hidden 'gotcha' text will all work once, and then need to be changed to fight the bots targeting this specific form.)
  • Even if the scripters can't 'solve' your tweak, it doesn't prevent them from slamming your front page and then sounding an alarm so the scripter can fill out the order manually. Given that they get the advantage from solving [a], they will likely still win [b], since they'll be the first humans reaching the order page. Additionally, 1. still happens, causing server errors and decreased performance for everyone.

Another solution is to watch for IPs hitting too often and block them at the firewall, or otherwise prevent them from ordering. This could solve 2. and prevent [b], but the performance hit from scanning for IPs is massive and would likely cause more problems like 1. than the scripters were causing on their own. Additionally, the possibility of cloud networking and spambot zombies makes IP checking fairly useless.

A third idea, forcing the order form to take some time (say, half a second) to load, would potentially slow the speediest orders, but again, the scripters would still be the first people in at any speed not detrimental to actual users.

Goals

  1. Sell the item to non-scripting humans.
  2. Keep the site running at a speed not slowed by bots.
  3. Don't hassle the 'normal' users with any tasks to complete to prove they're human.

Scripting Solutions


Solution 1 - Scripting

How about implementing something like SO does with the CAPTCHAs?

If you're using the site normally, you'll probably never see one. If you happen to reload the same page too often, post successive comments too quickly, or something else that triggers an alarm, make them prove they're human. In your case, this would probably be constant reloads of the same page, following every link on a page quickly, or filling in an order form too fast to be human.

If they fail the check x times in a row (say, 2 or 3), give that IP a timeout or other such measure. Then at the end of the timeout, dump them back to the check again.


Since you have unregistered users accessing the site, you do have only IPs to go on. You can issue sessions to each browser and track that way if you wish. And, of course, throw up a human-check if too many sessions are being (re-)created in succession (in case a bot keeps deleting the cookie).

As far as catching too many innocents, you can put up a disclaimer on the human-check page: "This page may also appear if too many anonymous users are viewing our site from the same location. We encourage you to register or login to avoid this." (Adjust the wording appropriately.)

Besides, what are the odds that X people are loading the same page(s) at the same time from one IP? If they're high, maybe you need a different trigger mechanism for your bot alarm.
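To make that concrete, here's a minimal sketch of the fail-count and escalating-timeout bookkeeping, assuming a single app server with in-memory state (a web farm would need shared storage); all names and thresholds here are invented:

```python
import time

FAIL_LIMIT = 3          # failed human-checks in a row before a timeout
BASE_TIMEOUT = 300      # 5 minutes; doubles with each repeat offense

failures = {}           # ip -> consecutive failed checks
timeouts = {}           # ip -> (blocked_until, times_blocked)

def record_check(ip, passed):
    """Update state after a human-check attempt; True means the IP is now blocked."""
    if passed:
        failures.pop(ip, None)
        return False
    failures[ip] = failures.get(ip, 0) + 1
    if failures[ip] >= FAIL_LIMIT:
        _, count = timeouts.get(ip, (0, 0))
        # escalate: each new timeout lasts twice as long as the previous one
        timeouts[ip] = (time.time() + BASE_TIMEOUT * (2 ** count), count + 1)
        failures[ip] = 0
        return True
    return False

def is_blocked(ip):
    blocked_until, _ = timeouts.get(ip, (0, 0))
    return time.time() < blocked_until
```

At the end of a timeout, `is_blocked` simply starts returning False and the visitor is dumped back to the human-check, as described above.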


Edit: Another option is, if they fail too many times and you're confident about the product's demand, to block them and make them personally CALL you to remove the block.

Having people call does seem like an asinine measure, but it makes sure there's a human somewhere behind the computer. The key is to have the block only be in place for a condition which should almost never happen unless it's a bot (e.g. fail the check multiple times in a row). Then it FORCES human interaction - to pick up the phone.

In response to the comment about having them call me: there's obviously a tradeoff here. Are you worried enough about ensuring your users are human to accept a couple of phone calls when they go on sale? If I were so concerned about a product getting to human users, I'd have to make this decision, perhaps sacrificing a (small) bit of my time in the process.

Since it seems like you're determined to not let bots get the upper hand/slam your site, I believe the phone may be a good option. Since I don't make a profit off your product, I have no interest in receiving these calls. Were you to share some of that profit, however, I may become interested. As this is your product, you have to decide how much you care and implement accordingly.


The other ways of releasing the block just aren't as effective: a timeout (but they'd get to slam your site again after, rinse-repeat), a long timeout (if it was really a human trying to buy your product, they'd be SOL and punished for failing the check), email (easily done by bots), fax (same), or snail mail (takes too long).

You could, of course, instead have the timeout period increase per IP for each time they get a timeout. Just make sure you're not punishing true humans inadvertently.

Solution 2 - Scripting

You need to figure out a way to make the bots buy stuff that is massively overpriced: 12mm wingnut: $20. See how many bots snap it up before the script-writers decide you're gaming them.

Use the profits to buy more servers and pay for bandwidth.

Solution 3 - Scripting

My solution would be to make screen-scraping worthless by putting in a roughly 10-minute delay for bots and scripts.

Here's how I'd do it:

  • Log and identify any repeat hitters.

You don't need to log every IP address on every hit. Only track one out of every 20 hits or so. A repeat offender will still show up in randomized, occasional tracking.

  • Keep a cache of your page from about 10 minutes earlier.

  • When a repeat-hitter/bot hits your site, give them the 10-minute-old cached page.

They won't immediately know they're getting an old page. They'll be able to scrape it and everything, but they won't win any races anymore, because "real people" will have a 10-minute head start.
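A rough sketch of that sampling-plus-stale-cache flow, assuming a single server with in-memory state; `render_live_page` and the job that refreshes `stale_page` every ten minutes are stand-ins for whatever you already have:

```python
import random
import time

SAMPLE_RATE = 20           # track roughly 1 in 20 hits
REPEAT_THRESHOLD = 5       # sampled hits inside the window that flag an IP
WINDOW = 600               # seconds

sampled_hits = {}          # ip -> timestamps of sampled hits
flagged = set()            # IPs identified as repeat hitters
stale_page = {"html": ""}  # refreshed every ~10 minutes by a separate job

def handle_front_page(ip, render_live_page):
    now = time.time()
    if random.randrange(SAMPLE_RATE) == 0:     # only log ~1 in 20 hits
        hits = [t for t in sampled_hits.get(ip, []) if now - t < WINDOW]
        hits.append(now)
        sampled_hits[ip] = hits
        if len(hits) >= REPEAT_THRESHOLD:
            flagged.add(ip)
    # repeat hitters get the 10-minute-old page; everyone else gets live content
    return stale_page["html"] if ip in flagged else render_live_page()
```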

Benefits:

  • No hassle or problems for users (like CAPTCHAs).
  • Implemented fully server-side (no reliance on JavaScript/Flash).
  • Serving up an older, cached page should be less performance intensive than a live page. You may actually decrease the load on your servers this way!

Drawbacks:

  • Requires tracking some IP addresses
  • Requires keeping and maintaining a cache of older pages.

What do you think?

Solution 4 - Scripting

Take a look at this article by Ned Batchelder: http://nedbatchelder.com/text/stopbots.html. His article is about stopping spambots, but the same techniques could easily apply to your site.

> Rather than stopping bots by having people identify themselves, we can stop the bots by making it difficult for them to make a successful post, or by having them inadvertently identify themselves as bots. This removes the burden from people, and leaves the comment form free of visible anti-spam measures.
>
> This technique is how I prevent spambots on this site. It works. The method described here doesn't look at the content at all.

Some other ideas:

  • Create an official auto-notify mechanism (RSS feed? Twitter?) that people can subscribe to when your product goes on sale. This reduces the need for people to make scripts.
  • Change your obfuscation technique right before a new item goes on sale. So even if the scripters can escalate the arms race, they are always a day behind.

EDIT: To be totally clear, Ned's article above describes methods to prevent the automated PURCHASE of items by preventing a BOT from going through the forms to submit an order. His techniques wouldn't be useful for preventing bots from screen-scraping the home page to determine when a Bandoleer of Carrots comes up for sale. I'm not sure preventing THAT is really possible.

With regard to your comments about the effectiveness of Ned's strategies: Yes, he discusses honeypots, but I don't think that's his strongest strategy. His discussion of the SPINNER is the original reason I mentioned his article. Sorry I didn't make that clearer in my original post:

> The spinner is a hidden field used for a few things: it hashes together a number of values that prevent tampering and replays, and is used to obscure field names. The spinner is an MD5 hash of:
>
> - The timestamp,
> - The client's IP address,
> - The entry id of the blog entry being commented on, and
> - A secret.

Here is how you could implement that at WOOT.com:

Change the "secret" value that is used as part of the hash each time a new item goes on sale. This means that if someone is going to design a BOT to auto-purchase items, it would only work until the next item comes on sale!!

Even if someone is able to quickly re-build their bot, all the other actual users will have already purchased a BOC, and your problem is solved!
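As a sketch of what that could look like (this is my paraphrase of Ned's recipe, not his actual code; the field names and secret store are made up):

```python
import hashlib

# Rotate this value every time a new item goes on sale; any bot built
# against the old form breaks the moment the secret changes.
CURRENT_SECRET = "change-me-each-sale"

def make_spinner(timestamp, client_ip, page_id):
    """MD5 of timestamp, client IP, page id, and the secret, per Ned's recipe."""
    payload = f"{timestamp}:{client_ip}:{page_id}:{CURRENT_SECRET}"
    return hashlib.md5(payload.encode()).hexdigest()

def verify_spinner(submitted, timestamp, client_ip, page_id):
    # Recompute server-side; a replayed or tampered form fails the check.
    return submitted == make_spinner(timestamp, client_ip, page_id)

def obscured_field_name(spinner, real_name):
    # The spinner also obscures field names, so they change on every request.
    return "f_" + hashlib.md5((spinner + real_name).encode()).hexdigest()[:10]
```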

The other strategy he discusses is to change the honeypot technique from time to time (again, change it when a new item goes on sale):

  • Use CSS classes (randomized of course) to set the fields or a containing element to display:none.
  • Color the fields the same (or very similar to) the background of the page.
  • Use positioning to move a field off of the visible area of the page.
  • Make an element too small to show the contained honeypot field.
  • Leave the fields visible, but use positioning to cover them with an obscuring element.
  • Use Javascript to effect any of these changes, requiring a bot to have a full Javascript engine.
  • Leave the honeypots displayed like the other fields, but tell people not to enter anything into them.

I guess my overall idea is to CHANGE THE FORM DESIGN when each new item goes on sale. Or at LEAST, change it when a new BOC goes on sale.

Which is what, a couple times/month?

If you accept this answer, will you give me a heads-up on when the next one is due? :)

Solution 5 - Scripting

Q: How would you stop scripters from slamming your site hundreds of times a second?
A: You don't. There is no way to prevent this behavior by external agents.

You could employ a vast array of technology to analyze incoming requests and heuristically attempt to determine who is and isn't human...but it would fail. Eventually, if not immediately.

The only viable long-term solution is to change the game so that the site is not bot-friendly, or is less attractive to scripters.

How do you do that? Well, that's a different question! ;-)

...

OK, some options have been given (and rejected) above. I am not intimately familiar with your site, having looked at it only once, but since people can read text in images and bots cannot easily do this, change the announcement to be an image. Not a CAPTCHA, just an image -

  • generate the image (cached of course) when the page is requested
  • keep the image source name the same, so that doesn't give the game away
  • most of the time the image will have ordinary text in it, and be aligned to appear to be part of the inline HTML page
  • when the game is 'on', the image changes to the announcement text
  • the announcement text reveals a url and/or code that must be manually entered to acquire the prize. CAPTCHA the code if you like, but that's probably not necessary.
  • for additional security, the code can be a one-time token generated specifically for the request/IP/agent, so that repeated requests generate different codes. Or you can pre-generate a bunch of random codes (a one-time pad) if on-demand generation is too taxing.
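A bare-bones sketch of the one-time-code bookkeeping from the last two bullets (pre-generated pad, each code issued once and redeemable once, tied to the requester; all names are illustrative):

```python
import secrets

# Pre-generated pool of random codes (the 'one-time pad' variant).
available = [secrets.token_hex(4) for _ in range(10000)]
issued = {}            # code -> (ip, user_agent) it was issued to

def issue_code(ip, user_agent):
    code = available.pop()          # each code leaves the pool exactly once
    issued[code] = (ip, user_agent)
    return code

def redeem_code(code, ip, user_agent):
    # Valid at most once, and only for the request/IP/agent it was issued to.
    return issued.pop(code, None) == (ip, user_agent)
```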

Run time-trials of real people responding to this, and ignore ('oops, an error occurred, sorry! please try again') responses faster than (say) half of this time. This event should also trigger an alert to the developers that at least one bot has figured out the code/game, so it's time to change the code/game.

Continue to change the game periodically anyway, even if no bots trigger it, just to waste the scripters' time. Eventually the scripters should tire of the game and go elsewhere...we hope ;-)

One final suggestion: when a request for your main page comes in, put it in a queue and respond to the requests in order in a separate process (you may have to hack/extend the web server to do this, but it will likely be worthwhile). If another request from the same IP/agent comes in while the first request is in the queue, ignore it. This should automatically shed the load from the bots.

EDIT: Another option, aside from the use of images, is to use JavaScript to fill in the buy/no-buy text; bots rarely interpret JavaScript, so they wouldn't see it.

Solution 6 - Scripting

I don't know how feasible this is: ... go on the offensive.

Figure out what data the bots are scanning for. Feed them the data they're looking for when you're NOT selling the crap. Do this in a way that won't bother or confuse human users. When the bots trigger phase two, they'll log in and fill out the form to buy $100 Roombas instead of the BOC. Of course, this assumes that the bots are not particularly robust.

Another idea is to implement random price drops over the course of the bag o' crap sale period. Who would buy a random bag o' crap for $150 when you CLEARLY STATE that it's only worth $20? Nobody but overzealous bots. But then 9 minutes later it's $35... then 17 minutes later it's $9. Or whatever.

Sure, the zombie kings would be able to react. The point is to make their mistakes become very costly for them (and to make them pay you to fight them).

All of this assumes you want to piss off some bot lords, which may not be 100% advisable.

Solution 7 - Scripting

So the problem really seems to be: the bots want their "bag 'o crap" because it has a high perceived value at a low perceived price. You sometimes offer this item and the bots lurk, waiting to see if it's available and then they buy the item.

Since it seems like the bot owners are making a profit (or potentially making a profit), the trick is to make this unprofitable for them by encouraging them to buy the crap.

First, always offer the "bag 'o crap".

Second, make sure that crap is usually crap.

Third, rotate the crap frequently.

Simple, no?

You'll need a permanent "why is our crap sometimes crap?" link next to the offer to explain to humans what's going on.

When the bot sees that there's crap and the crap is automatically purchased, the recipient is going to be awfully upset that they've paid $10 for a broken toothpick. And then an empty trash bag. And then some dirt from the bottom of your shoe.

If they buy enough of this crap in a relatively short period of time (and you have large disclaimers all over the place explaining why you're doing this), they're going to lose a fair "bag 'o cash" on your "bag 'o crap". Even human intervention on their part (checking to ensure that the crap isn't crap) can fail if you rotate the crap often enough. Heck, maybe the bots will notice and not buy anything that's been in the rotation for too short a time, but that means the humans will buy the non-crap.

Heck, your regular customers might be so amused that you can turn this into a huge marketing win. Start posting how much of the "crap" crap is being sold. People will come back just to see how hard the bots have been bitten.

Update: I expect that you might get a few calls up front with people complaining. I don't think you can stop that entirely. However, if this kills the bots, you can always stop it and restart it later.

Solution 8 - Scripting

> 1. Sell the item to non-scripting humans.
>
> 2. Keep the site running at a speed not slowed by bots.
>
> 3. Don't hassle the 'normal' users with any tasks to complete to prove they're human.

You probably don't want to hear this, but #1 and #3 are mutually exclusive.

On the Internet, nobody knows you're a dog

Well, nobody knows you're a bot either. There's no programmatic way to tell whether or not there's a human on the other end of the connection without requiring the person to do something. Preventing scripts/bots from doing stuff on the web is the whole reason CAPTCHAs were invented. It's not like this is some new problem that hasn't seen a lot of effort expended on it. If there were a better way to do it, one that didn't involve the hassle to real users that a CAPTCHA does, everyone would be using it already.

I think you need to face the fact that if you want to keep bots off your ordering page, a good CAPTCHA is the only way to do it. If demand for your random crap is high enough that people are willing to go to these lengths to get it, legitimate users aren't going to be put off by a CAPTCHA.

Solution 9 - Scripting

The method Woot uses to combat this issue is changing the game - literally. When they present an extraordinarily desirable item for sale, they make users play a video game in order to order it.

Not only does that successfully combat bots (they can easily make minor changes to the game to thwart automatic players, or even provide a new game for each sale), but it also gives users the impression of "winning" the desired item while slowing down the ordering process.

It still sells out very quickly, but I think that the solution is good - re-evaluating the problem and changing the parameters led to a successful strategy where strictly technical solutions simply didn't exist.


Your entire business model is based on "first come, first served." You can't do what the radio stations did (they no longer make the first caller the winner, they make the 5th or 20th or 13th caller the winner) - it doesn't match your primary feature.

No, there is no way to do this without changing the ordering experience for the real users.

Let's say you implement all these tactics. If I decide that this is important, I'll simply get 100 people to work with me, we'll build software to work on our 100 separate computers, and hit your site 20 times a second (5 seconds between accesses for each user/cookie/account/IP address).

You have two stages:

  1. Watching front page
  2. Ordering

You can't put a captcha blocking #1 - that's going to lose real customers ("What? I have to solve a captcha each time I want to see the latest woot?!?").

So my little group watches, timed together so we get about 20 checks per second, and whoever sees the change first alerts all the others (automatically), who will load the front page once again, follow the order link, and perform the transaction (which may also happen automatically, unless you implement captcha and change it for every wootoff/boc).

You can put a captcha in front of #2, and while you're loathe to do it, that may be the only way to make sure that even if bots watch the front page, real users are getting the products.

But even with captcha my little band of 100 would still have a significant first mover advantage - and there's no way you can tell that we aren't humans. If you start timing our accesses, we'd just add some jitter. We could randomly select which computer was to refresh so the order of accesses changes constantly - but still looks enough like a human.

First, get rid of the simple bots

You need to have an adaptive firewall that will watch requests, and if someone is doing the obvious stupid thing - refreshing more than once a second from the same IP - then employ tactics to slow them down (drop packets, send back refused or 500 errors, etc.).

This should significantly drop your traffic and alter the tactics the bot users employ.

Second, make the server blazingly fast.

You really don't want to hear this... but...

I think what you need is a fully custom solution from the bottom up.

You don't need to mess with the TCP/IP stack, but you may need to develop a very, very, very fast custom server that is purpose-built to correlate user connections and react appropriately to various attacks.

Apache, lighttpd, etc. are all great for being flexible, but you run a single-purpose website, and you really need to be able to do more than the current servers are capable of (both in handling traffic and in appropriately combating bots).

By serving a largely static webpage (updated every 30 seconds or so) on a custom server, you should not only be able to handle 10x the number of requests and traffic (because the server isn't doing anything other than getting the request and reading the page from memory into the TCP/IP buffer), but it will also give you access to metrics that might help you slow down bots.

For instance, by correlating IP addresses you can simply block more than one connection per second per IP. Humans can't go faster than that, and even people using the same NATed IP address will only infrequently be blocked. You'd want to do a slow block - leave the connection alone for a full second before officially terminating the session. This can feed into a firewall to give longer-term blocks to especially egregious offenders.

But the reality is that no matter what you do, there's no way to tell a human apart from a bot when the bot is custom built by a human for a single purpose. The bot is merely a proxy for the human.

Conclusion

At the end of the day, you can't tell a human and a computer apart for watching the front page. You can stop bots at the ordering step, but the bot users still have a first mover advantage, and you still have a huge load to manage.

You can add blocks for the simple bots, which will raise the bar so that fewer people will bother with it. That may be enough.

But without changing your basic model, you're out of luck. The best you can do is take care of the simple cases, make the server so fast regular users don't notice, and sell so many items that even if you have a few million bots, as many regular users as want them will get them.

You might consider setting up a honeypot and marking user accounts as bot users, but that will have a huge negative community backlash.

Every time I think of a "well, what about doing this..." I can always counter it with a suitable bot strategy.

Even if you make the front page a captcha to get to the ordering page ("This item's ordering button is blue with pink sparkles, somewhere on this page"), the bots will simply open all the links on the page and use whichever one comes back with an ordering page. There's just no way to win this.

Make the servers fast, put in a reCaptcha (the only one I've found that can't be easily fooled, but it's probably way too slow for your application) on the ordering page, and think about ways to change the model slightly so regular users have as good a chance as the bot users.

-Adam

Solution 10 - Scripting

I say expose the price information using an API. This is an unintuitive solution, but it does work to give you control over the situation. Add some limitations to the API to make it slightly less functional than the website.

You could do the same for ordering. You could experiment with small changes to the API functionality/performance until you get the desired effect.

There are proxies and botnets to defeat IP checks. There are captcha-reading scripts that are extremely good. There are even teams of workers in India who defeat captchas for a small price. Any solution you can come up with can be reasonably defeated. Even Ned Batchelder's solutions can be bypassed by using a WebBrowser control or other simulated browser combined with a botnet or proxy list.

Solution 11 - Scripting

Disclaimer: This answer is completely non-programming-related. It does, however, try to attack the reason for scripts in the first place.

Another idea is if you truly have a limited quantity to sell, why don't you change it from a first-come-first-served methodology? Unless, of course, the hype is part of your marketing scheme.

There are many other options, and I'm sure others can think of some different ones:

  • an ordering queue (pre-order system) - Some scripts might still end up at the front of the queue, but it's probably faster to just manually enter the info.

  • a raffle system (everyone who tries to order one is entered into the system) - This way the people with the scripts have just the same chances as those without.

  • a rush priority queue - If there is truly a high perceived value, people may be willing to pay more. Implement an ordering queue, but allow people to pay more to be placed higher in the queue.

  • auction (credit goes to David Schmitt for this one, comments are my own) - People can still use scripts to snipe in at the last minute, but not only does it change the pricing structure, people are expecting to be fighting it out with others. You can also do things to restrict the number of bids in a given time period, make people phone in ahead of time for an authorization code, etc.

Solution 12 - Scripting

No matter how secure the Nazis thought their communications were, the Allies would often break their messages. No matter how you try to stop bots from using your site, the bot owners will work out a way around it. I'm sorry if that makes you the Nazi :-)

I think a different mindset is required

  • Do not try to stop bots from using your site
  • Do not go for a fix that works immediately, play the long game

Get into the mindset that it doesn't matter whether the client of your site is a human or a bot, both are just paying customers; but one has an unfair advantage over the other. Some users without much of a social life (hermits) can be just as annoying for your site's other users as bots.

Record the time you publish an offer and the time an account opts to buy it.

> This gives you a record of how quickly the client is buying stuff.

Vary the time of day you publish offers.

> For example, have a 3-hour window starting at some obscure time of the day (midnight?). Only bots and hermits will constantly refresh a page for 3 hours just to get an order in within seconds. Never vary the base time, only the size of the window.

Over time a picture will emerge.

01: You can see which accounts are regularly buying products within seconds of them going live, suggesting they might be bots.

02: You can also look at the window of time used for the offers: if the window is 1 hour, then some early buyers will be humans, but a human will rarely refresh for 4 hours. If the elapsed time between publish and purchase is quite consistent regardless of the window duration, that's a bot. If the publish-to-purchase time is short for small windows and gets longer for large windows, that's a hermit!

Now instead of stopping bots from using your site you have enough information to tell you which accounts are certainly used by bots, and which accounts are likely to be used by hermits. What you do with that information is up to you, but you can certainly use it to make your site fairer to people who have a life.
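As a loose sketch, a classifier over per-account records of (window length, seconds from publish to purchase) might look like this; every threshold here is invented for illustration:

```python
from statistics import mean, pstdev

def classify(orders):
    """orders: list of (window_hours, seconds_from_publish_to_purchase)."""
    deltas = [seconds for _, seconds in orders]
    if len(deltas) < 3:
        return "unknown"               # not enough history to judge
    # Bots buy within seconds, consistently, no matter how long the window is.
    if mean(deltas) < 30 and pstdev(deltas) < 10:
        return "bot"
    # Hermits: time-to-purchase grows with the window size (they refresh all day).
    by_window = sorted(orders)
    if all(a[1] <= b[1] for a, b in zip(by_window, by_window[1:])):
        return "hermit"
    return "human"
```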

I think banning the bot accounts would be pointless; it would be akin to phoning Hitler and saying "Thanks for the positions of your U-boats!" Somehow you need to use the information in a way that the account owners won't realise. Let's see if I can dream anything up...

Process orders in a queue:

When the customer places an order, they immediately get a confirmation email telling them their order has been placed in a queue and that they will be notified when it has been processed. I experience this kind of thing with order/dispatch on Amazon and it doesn't bother me at all; I don't mind getting an email days later telling me my order has been dispatched, as long as I immediately get an email telling me that Amazon knows I want the book. In your case it would be an email for:

  1. Your order has been placed and is in a queue.
  2. Your order has been processed.
  3. Your order has been dispatched.

Users think they are in a fair queue. Process your queue every 1 hour so that normal users also experience a queue, so as not to arouse suspicion. Only process orders from bot and hermit accounts once they have been in the queue for the "average human ordering time + x hours". Effectively reducing bots to humans.

Solution 13 - Scripting

We are currently using the latest generation of BigIP load balancers from F5 to do this. The BigIP has advanced traffic management features that can identify scrapers and bots based on frequency and patterns of use, even from among a set of sources behind a single IP. It can then throttle these, serve them alternative content, or simply tag them with headers or cookies so you can identify them in your application code.

Solution 14 - Scripting

How about introducing a delay which requires human interaction, like a sort of "CAPTCHA game"? For example, it could be a little Flash game where for 30 seconds they have to burst checkered balls and avoid bursting solid balls (avoiding colour-blindness issues!). The game would be given a random number seed, and what the game transmits back to the server would be the coordinates and timestamps of the clicked points, along with the seed used.

On the server you simulate the game mechanics using that seed to see if the clicks would indeed have burst the balls. If they did, not only were they human, but they took 30 seconds to validate themselves. Give them a session id.

You let that session id do what it likes, but if it makes too many requests, it can't continue without playing again.
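A toy version of the server-side replay, ignoring ball movement and real game physics for brevity (the geometry constants and layout scheme are placeholders):

```python
import random

BALL_COUNT = 20
FIELD = 400            # game area in pixels
HIT_RADIUS = 15

def ball_layout(seed):
    """Re-derive the exact layout the client's game generated from this seed."""
    rng = random.Random(seed)
    return [(rng.randrange(FIELD), rng.randrange(FIELD),
             rng.random() < 0.5)        # True = checkered (safe to burst)
            for _ in range(BALL_COUNT)]

def verify_session(seed, clicks, min_duration=25.0):
    """clicks: list of (x, y, t_seconds). Fails if too fast or a solid ball burst."""
    balls = ball_layout(seed)
    if not clicks or clicks[-1][2] - clicks[0][2] < min_duration:
        return False                    # finished faster than a human could
    for x, y, _ in clicks:
        for bx, by, checkered in balls:
            hit = (x - bx) ** 2 + (y - by) ** 2 <= HIT_RADIUS ** 2
            if hit and not checkered:
                return False            # burst a solid ball: not a valid play
    # a fuller check would also require that every checkered ball was burst
    return True
```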

Solution 15 - Scripting

First, let me recap what we need to do here. I realize I'm just paraphrasing the original question, but it's important that we get this 100% straight, because there are a lot of great suggestions that get 2 or 3 out of 4 right, but as I will demonstrate, you will need a multifaceted approach to cover all of the requirements.

Requirement 1: Getting rid of the 'bot slamming':

The rapid-fire 'slamming' of your front page is hurting your site's performance and is at the core of the problem. The 'slamming' comes from both single-IP bots and - supposedly - from botnets as well. We want to get rid of both.

Requirement 2: Don't mess with the user experience:

We could fix the bot situation pretty effectively by implementing a nasty verification procedure like phoning a human operator, solving a bunch of CAPTCHAs, or similar, but that would be like forcing every innocent airplane passenger to jump through crazy security hoops just for the slim chance of catching the very stupidest of terrorists. Oh wait - we actually do that. But let's see if we can not do that on woot.com.

Requirement 3: Avoiding the 'arms race':

As you mention, you don't want to get caught up in the spambot arms race. So you can't use simple tweaks like hidden or jumbled form fields, math questions, etc., since they are essentially obscurity measures that can be trivially autodetected and circumvented.

Requirement 4: Thwarting 'alarm' bots:

This may be the most difficult of your requirements. Even if we can make an effective human-verification challenge, bots could still poll your front page and alert the scripter when there is a new offer. We want to make those bots infeasible as well. This is a stronger version of the first requirement, since not only can't the bots issue performance-damaging rapid-fire requests -- they can't even issue enough repeated requests to send an 'alarm' to the scripter in time to win the offer.


Okay, so let's see if we can meet all four requirements. First, as I mentioned, no one measure is going to do the trick. You will have to combine a couple of tricks to achieve it, and you will have to swallow two annoyances:

  1. A small number of users will be required to jump through hoops
  2. A small number of users will be unable to get the special offers

I realize these are annoying, but if we can make the 'small' number small enough, I hope you will agree the positives outweigh the negatives.

First measure: User-based throttling:

> This one is a no-brainer, and I'm sure you do it already. If a user is logged in, and keeps refreshing 600 times a second (or something), you stop responding and tell him to cool it. In fact, you probably throttle his requests significantly sooner than that, but you get the idea. This way, a logged-in bot will get banned/throttled as soon as it starts polling your site. This is the easy part. The unauthenticated bots are our real problem, so on to them:

Second measure: Some form of IP throttling, as suggested by nearly everyone:

> No matter what, you will have to do some IP based throttling to thwart the 'bot slamming'. Since it seems important to you to allow unauthenticated (non-logged-in) visitors to get the special offers, you only have IPs to go by initially, and although they're not perfect, they do work against single-IP bots. Botnets are a different beast, but I'll come back to those. For now, we will do some simple throttling to beat rapid-fire single-IP bots.

> The performance hit is negligible if you run the IP check before all other processing, use a proxy server for the throttling logic, and store the IPs in a lookup-optimized tree structure in memcached.
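As a sketch of that ordering (throttle check before everything else), with a plain dict standing in for the memcached structure the answer mentions:

```python
import time

LIMIT = 10             # requests per IP per window before throttling kicks in
WINDOW = 60            # seconds

hits = {}              # ip -> (window_start, count); shared cache in production

def allow(ip):
    """Run before any other request processing; O(1) per request."""
    now = time.time()
    start, count = hits.get(ip, (now, 0))
    if now - start > WINDOW:
        start, count = now, 0          # window expired, start fresh
    count += 1
    hits[ip] = (start, count)
    return count <= LIMIT
```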

Third measure: Cloaking the throttle with cached responses:

> With rapid-fire single-IP bots throttled, we still have to address slow single-IP bots, i.e. bots that are specifically tweaked to 'fly under the radar' by spacing requests slightly further apart than the throttle catches.

> To instantly render slow single-IP bots useless, simply use the strategy suggested by abelenky: serve 10-minute-old cached pages to all IPs that have been spotted in the last 24 hours (or so). That way, every IP gets one 'chance' per day/hour/week (depending on the period you choose), and there will be no visible annoyance to real users who are just hitting 'reload', except that they don't win the offer.

> The beauty of this measure is that it also thwarts 'alarm bots', as long as they don't originate from a botnet.

> (I know you would probably prefer it if real users were allowed to refresh over and over, but there is no way to tell a refresh-spamming human apart from a request-spamming bot without a CAPTCHA or similar.)

Fourth measure: reCAPTCHA:

> You are right that CAPTCHAs hurt the user experience and should be avoided. However, in one situation they can be your best friend: If you've designed a very restrictive system to thwart bots, that - because of its restrictiveness - also catches a number of false positives; then a CAPTCHA served as a last resort will allow those real users who get caught to slip by your throttling (thus avoiding annoying DoS situations).

> The sweet spot, of course, is when ALL the bots get caught in your net, while extremely few real users get bothered by the CAPTCHA.

> If you, when serving up the 10-minute-old cached pages, also offer an alternative, optional, CAPTCHA-verified 'front page refresher', then humans who really want to keep refreshing, can still do so without getting the old cached page, but at the cost of having to solve a CAPTCHA for each refresh. That is an annoyance, but an optional one just for the die-hard users, who tend to be more forgiving because they know they're gaming the system to improve their chances, and that improved chances don't come free.

Fifth measure: Decoy crap:

> Christopher Mahan had an idea that I rather liked, but I would put a different spin on it. Every time you are preparing a new offer, prepare two other 'offers' as well, that no human would pick, like a 12mm wingnut for $20. When the offer appears on the front page, put all three 'offers' in the same picture, with numbers corresponding to each offer. When the user/bot actually goes on to order the item, they will have to pick (a radio button) which offer they want, and since most bots would merely be guessing, in two out of three cases, the bots would be buying worthless junk.

> Naturally, this doesn't address 'alarm bots', and there is a (slim) chance that someone could build a bot that was able to pick the correct item. However, the risk of accidentally buying junk should turn scripters away from fully automated bots entirely.

Sixth measure: Botnet Throttling:

> [deleted]

Okay............ I've now spent most of my evening thinking about this, trying different approaches: global delays, cookie-based tokens, queued serving, 'stranger throttling'... And it just doesn't work. It doesn't. I realized the main reason you hadn't accepted any answer yet was that no one had proposed a way to thwart a distributed/zombie-net/botnet attack, so I really wanted to crack it. I believe I cracked the botnet problem for authentication in a different thread, so I had high hopes for your problem as well. But my approach doesn't translate to this. You only have IPs to go by, and a large enough botnet doesn't reveal itself in any analysis based on IP addresses.

So there you have it: my sixth measure is naught. Nothing. Zip. Unless the botnet is small and/or fast enough to get caught in the usual IP throttle, I don't see any effective measure against botnets that doesn't involve explicit human verification such as CAPTCHAs. I'm sorry, but I think combining the above five measures is your best bet. And you could probably do just fine with abelenky's 10-minute-caching trick alone.

Solution 16 - Scripting

There are a few other / better solutions already posted, but for completeness, I figured I'd mention this:

If your main concern is performance degradation, and you're looking at true hammering, then you're actually dealing with a DoS attack, and you should probably try to handle it accordingly. One common approach is to simply drop packets from an IP in the firewall after a number of connections per second/minute/etc. For example, the standard Linux firewall, iptables, has a match module, 'hashlimit', which can be used to limit connection requests per time unit per IP address.
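For illustration, one possible shape of such rules (option names vary between iptables versions, so treat this as a sketch rather than a drop-in config):

```
# Accept new HTTP connections while a source IP stays under 5/sec (burst 20);
# anything above that falls through to the DROP rule.
iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
         -m hashlimit --hashlimit 5/sec --hashlimit-burst 20 \
         --hashlimit-mode srcip --hashlimit-name http-limit -j ACCEPT
iptables -A INPUT -p tcp --dport 80 -m state --state NEW -j DROP
```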

Although this question would probably be more apt for the next SO derivative mentioned on the last SO podcast, that site hasn't launched yet, so I guess it's OK to answer here :)

EDIT:
As pointed out by novatrust, there are still ISPs that do not assign unique IPs to their customers, so a script-running customer of such an ISP could effectively get all of that ISP's customers blocked.

Solution 17 - Scripting

  1. Provide an RSS feed so they don't eat up your bandwidth.
  2. When buying, make everyone wait a random amount of time of up to 45 seconds or something, depending on what you're looking for exactly. Exactly what are your timing constraints?
  3. Give everyone 1 minute to put their name in for the drawing and then randomly select people. I think this is the fairest way.
  4. Monitor the accounts (include some timings in the session and store them?) and add delays to accounts that seem to act faster than is humanly possible. That will at least force the bots to be programmed to slow down and compete with humans.

Solution 18 - Scripting

First of all, by definition, it is impossible to support stateless, i.e. truly anonymous, transactions while also being able to separate the bots from legitimate users.

If we can accept a premise that we can impose some cost on a brand-spanking-new woot visitor on his first page hit(s), I think I have a possible solution. For lack of a better name, I'm going to loosely call this solution "A visit to the DMV."

Let's say that there's a car dealership that offers a different new car each day, and that on some days, you can buy an exotic sports car for $5 each (limit 3), plus a $5 destination charge.

The catch is, the dealership requires you to visit the dealership and show a valid driver's license before you're allowed in through the door to see what car is on sale. Moreover, you must have said valid driver's license in order to make the purchase.

So, the first-time visitor (let's call him Bob) to this car dealer is refused entry, and is referred to the DMV office (which is conveniently located right next door) to obtain a driver's license.

Other visitors with a valid driver's license are allowed in after showing it. A person who makes a nuisance of himself by loitering around all day, pestering the salesmen, grabbing brochures, and emptying the complimentary coffee and cookies will eventually be turned away.

Now, back to Bob without the license -- all he has to do is endure the visit to the DMV once. After that, he can visit the dealership and buy cars anytime he likes, unless he accidentally left his wallet at home, or his license is otherwise destroyed or revoked.

The driver's license in this world is nearly impossible to forge.

The visit to the DMV involves first getting the application form in the "Start Here" queue. Bob has to take the completed application to window #1, where the first of many surly civil servants will take his application, process it, and, if everything is in order, stamp the application for that window and send him to the next one. And so Bob goes from window to window, waiting for each step of his application to go through, until he finally gets to the end and receives his driver's license.

There's no point in trying to "short circuit" the DMV. If the forms are not filled out correctly in triplicate, or any wrong answers given at any window, the application is torn up, and the hapless customer is sent back to the start.

Interestingly, no matter how full or empty the office is, it takes about the same amount of time to get serviced at each successive window. Even when you're the only person in line, it seems that the personnel likes to make you wait a minute behind the yellow line before uttering, "Next!"

Things aren't quite so terrible at the DMV, however. While all the waiting and processing to get the license is going on, you can watch a very entertaining and informative infomercial for the car dealership in the DMV lobby. In fact, the infomercial runs just long enough to cover the amount of time you spend getting your license.

The slightly more technical explanation:

As I said at the very top, it becomes necessary to have some statefulness on the client-server relationship which allows you to separate humans from bots. You want to do it in a way that doesn't overly penalize the anonymous (non-authenticated) human visitor.

This approach probably requires an AJAX-y client-side processing. A brand-spanking-new visitor to woot is given the "Welcome New User!" page full of text and graphics which (by appropriate server-side throttling) takes a few seconds to load completely. While this is happening (and the visitor is presumably busy reading the welcome page(s)), his identifying token is slowly being assembled.

Let's say, for discussion, the token (aka "driver's license") consists of 20 chunks. In order to get each successive chunk, the client-side code must submit a valid request to the server. The server incorporates a deliberate delay (let's say 200 milliseconds) before sending the next chunk along with the 'stamp' needed to make the next chunk request (i.e., the stamps needed to go from one DMV window to the next). All told, about 4 seconds must elapse to finish the chunk-challenge-response-...-completion process.

At the end of this process, the visitor has a token which allows him to go to the product description page and, in turn, go to the purchasing page. The token is a unique ID to each visitor, and can be used to throttle his activities.

On the server side, you only accept page views from clients that have a valid token. Or, if it's important that everyone can ultimately see the page, put a time penalty on requests that are missing a valid token.

Now, for this to be relatively benign to the legitimate human visitor, make the token-issuing process happen relatively non-intrusively in the background. Hence the need for the welcome page with entertaining copy and graphics that is deliberately slowed down slightly.

This approach forces a throttle-down of bots to either use an existing token, or take the minimum setup time to get a new token. Of course, this doesn't help as much against sophisticated attacks using a distributed network of faux visitors.
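A skeletal sketch of the chunk/stamp handshake (20 chunks at ~200 ms each gives the ~4-second total above); the HMAC 'stamps' are my framing of the windows-and-stamps metaphor, not a prescribed design:

```python
import hashlib
import hmac
import secrets
import time

SERVER_KEY = b"server-side-secret"
CHUNKS = 20
DELAY = 0.2            # seconds per chunk -> roughly 4 s in total

def stamp(session_id, index):
    """The 'stamp' proving the client legitimately cleared window `index`."""
    msg = f"{session_id}:{index}".encode()
    return hmac.new(SERVER_KEY, msg, hashlib.sha256).hexdigest()

def next_chunk(session_id, index, presented_stamp):
    """Hand out chunk `index` only if the previous window's stamp checks out."""
    if index > 0 and not hmac.compare_digest(
            presented_stamp, stamp(session_id, index - 1)):
        return None                    # skipped a window: back to the start
    time.sleep(DELAY)                  # the deliberate DMV-style wait
    return secrets.token_hex(2), stamp(session_id, index)

# The client concatenates all CHUNKS chunks into its 'driver's license' token.
```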

Solution 19 - Scripting

Write a reverse proxy on an Apache server in front of your application which implements a tarpit (see the Wikipedia article) to punish bots. It would simply manage a list of IP addresses that connected in the last few seconds. You detect a burst of requests from a single IP address and then exponentially delay those requests before responding.

Of course, multiple humans can come from the same IP address if they're on a NAT'd network connection, but it's unlikely that a human would mind your response time going from 2 ms to 4 ms (or even 400 ms), whereas a bot will be hampered by the increasing delay pretty quickly.
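A toy version of that exponential backoff, keyed on hits within the last few seconds (in a real deployment this lives in the proxy, not the application):

```python
import time

WINDOW = 5.0           # only consider hits from the last 5 seconds
recent = {}            # ip -> timestamps of recent hits

def tarpit_delay(ip):
    """Seconds to sleep before answering: 2 ms, 4 ms, 8 ms... per burst hit."""
    now = time.time()
    hits = [t for t in recent.get(ip, []) if now - t < WINDOW]
    hits.append(now)
    recent[ip] = hits
    return 0.002 * (2 ** (len(hits) - 1))

# in the request handler:  time.sleep(tarpit_delay(client_ip))
```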

Solution 20 - Scripting

I'm not seeing the great burden that you claim from checking incoming IPs. On the contrary, I've done a project for one of my clients which analyzes the HTTP access logs every five minutes (it could have been real-time, but he didn't want that for some reason that I never fully understood) and creates firewall rules to block connections from any IP addresses that generate an excessive number of requests unless the address can be confirmed as belonging to a legitimate search engine (google, yahoo, etc.).

This client runs a web hosting service and is running this application on three servers which handle a total of 800-900 domains. Peak activity is in the thousand-hits-per-second range and there has never been a performance issue - firewalls are very efficient at dropping packets from blacklisted addresses.

And, yes, DDoS technology definitely does exist that would defeat this scheme, but he's not seeing it happen in the real world. On the contrary, he says it has vastly reduced the load on his servers.
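In miniature, each scan might look something like this; the log path, threshold, and whitelist are placeholders, and a real version would read only the last five minutes of the log rather than the whole file:

```python
import subprocess
from collections import Counter

THRESHOLD = 1000               # requests per scan interval before blocking
WHITELIST = {"66.249.66.1"}    # confirmed search-engine crawlers, etc.

def scan(logfile="/var/log/apache2/access.log"):
    counts = Counter()
    with open(logfile) as f:
        for line in f:
            fields = line.split()
            if fields:
                counts[fields[0]] += 1   # first field of a combined log is the IP
    for ip, n in counts.items():
        if n > THRESHOLD and ip not in WHITELIST:
            # firewalls drop packets from blacklisted addresses very cheaply
            subprocess.run(["iptables", "-A", "INPUT", "-s", ip, "-j", "DROP"])
```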

Solution 21 - Scripting

My approach would be to focus on non-technological solutions (otherwise you're entering an arms race you'll lose, or at least spend a great deal of time and money on). I'd focus on the billing/shipment parts - you can find bots by looking for multiple deliveries to the same address or multiple charges to a single payment method. You can even do this across items over several weeks, so if a user got a previous item (by responding really, really fast), he may be assigned some sort of "handicap" this time around.

This would also have a side effect (beneficial, I would think, but I could be wrong marketing-wise for your case) of perhaps widening the circle of people who get lucky and get to purchase woot.

Solution 22 - Scripting

You can't totally prevent bots, even with a captcha. However you can make it a pain to write and maintain a bot and therefore reduce the number. Particularly by forcing them to update their bots daily you'll be causing most to lose interest.

Here are some ideas to make it harder to write bots:

  • Require running a javascript function. Javascript makes it much more of a pain to write a bot. Maybe require a captcha if they aren't running javascript to still allow actual non-javascript users (minimal).

  • Time the keystrokes when typing into the form (again via javascript). If it's not human-like then reject it. It's a pain to mimic human typing in a bot.

  • Write your code to update your field IDs daily with a new random value. This will force them to update their bot daily, which is a pain (see the sketch after this list).

  • Write your code to re-order your fields on a daily basis (obviously in some way that's not random to your users). If they're relying on the field order, this will trip them up and again force daily maintenance to their bot code.

  • You could go even further and use Flash content. Flash is totally a pain to write a bot against.
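A small sketch of the daily rotation from the two field-related bullets: derive the field names (and, from them, the field order) from the date plus a secret, so both change every day but stay stable within one. The names and secret here are invented:

```python
import hashlib
import hmac
from datetime import date

SECRET = b"form-secret"
FIELDS = ["name", "email", "quantity", "address"]

def todays_name(field):
    """Stable within a day, different every day; bots keyed to IDs break daily."""
    msg = f"{date.today().isoformat()}:{field}".encode()
    return "f_" + hmac.new(SECRET, msg, hashlib.sha1).hexdigest()[:10]

def todays_order():
    # a deterministic daily shuffle: order fields by their hashed names
    return sorted(FIELDS, key=todays_name)

# The server decodes a submission by mapping todays_name(f) back to f.
```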

Generally if you start taking a mindset of not preventing them, but making it more work for them, you can probably achieve the goal you're looking for.

Solution 23 - Scripting

Stick a 5 minute delay on all product announcements for unregistered users. Casual users won't really notice this and noncasual users will be registered anyhow.

Solution 24 - Scripting

Time-block user agents that make so many requests per minute. E.g., if you've got somebody requesting a page exactly every 5 seconds for 10 minutes, they're probably not a user... but it could be tricky to get this right.

If they trigger an alert, redirect every request to a static page with as little DB-IO as possible with a message letting them know they'll be allowed back on in X minutes.

It's important to add that you should probably only apply this on requests for pages and ignore all the requests for media (js, images, etc).

Solution 25 - Scripting

Preventing DoS would achieve #2 of @davebug's goals outlined above, "Keep the site running at a speed not slowed by bots," but wouldn't necessarily solve #1, "Sell the item to non-scripting humans."

I'm sure a scripter could write something to skate just under the excessive limit that would still be faster than a human could go through the ordering forms.

Solution 26 - Scripting

All right, so the spammers are outcompeting regular people to win the "bag of crap" auction? Why not make the next auction a literal "bag of crap"? The spammers get to pay good money for a bag full of doggy doo, and we all laugh at them.

Solution 27 - Scripting

The important thing here is to change the system so as to remove load from your server and prevent bots from winning the bag of crap, WITHOUT letting the botlords know you are gaming them, or they will revise their strategy. I don't think there is any way to do this without some processing at your end.

So you record hits on your home page. Whenever someone hits the page, that connection is compared to its last hit, and if it was too quick, it is sent a version of the page without the offer. This can be done by some sort of load-balancing mechanism that sends bots (the hits that are too fast) to a server that simply serves cached versions of your home page; real people get sent to the good server. This takes the load off the main server and makes the bots think that they are still being served the pages correctly.

Even better if the offer can be declined in some way. Then you can still make the offers on the faux server, but when the bot fills out the form, say "Sorry, you weren't quick enough" :) Then they will definitely think they are still in the game.

Solution 28 - Scripting

Most purely technical solutions have already been offered. I'll therefore suggest another view of the problem.

As I understand it, the bots are set up by people genuinely trying to buy the bags you're selling. The problem is -

  1. Other people, who don't operate bots, deserve a chance to buy, and you're offering a limited amount of bags.
  2. You want to attract humans to your site and just sell the bags.

Instead of trying to avoid the bots, you can enable potential bag-buyers to subscribe to an email, or even SMS update, to get notified when a sale will take place. You can even give them a minute or two head start (a special URL where the sale starts, randomly generated, and sent with the mail/SMS).

When these buyers come to buy, they're on your site; you can show them whatever you want in side banners or whatever. Those running the bots will prefer to simply register for your notification service.

The bot runners might still run bots on your notifications to finish the buy faster. One solution to that could be offering a one-click buy.

By the way, you mentioned your users are not registered, but it sounds like those buying these bags are not random buyers, but people who look forward to these sales. As such, they might be willing to register to get an advantage in trying to "win" a bag.

In essence, what I'm suggesting is to try and look at the problem as a social one, rather than a technical one.

Asaf

Solution 29 - Scripting

You could try to make the price harder for scripts to read. This is most simply achieved by converting it to an image, though a text recognition algorithm could still get around this. If enough scripters get around it, you could try applying CAPTCHA-like distortions to this image, but obviously at the cost of user experience. Instead of an image, the price could go in a Flash app.
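For instance, a bare-bones sketch of the image route using PHP's GD extension (the price string is hard-coded here purely for illustration):

<?php
// Render the price as a PNG so it never appears in the markup.
header('Content-Type: image/png');

$img = imagecreatetruecolor(120, 30);
$bg  = imagecolorallocate($img, 255, 255, 255);
$ink = imagecolorallocate($img, 30, 30, 30);

imagefilledrectangle($img, 0, 0, 119, 29, $bg);
imagestring($img, 5, 10, 8, '$1.00', $ink);   // built-in GD font #5

imagepng($img);
imagedestroy($img);
?>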

Alternately, you could try to devise a way to "shuffle" the HTML of a page in some way that doesn't affect the rendering. I can't think of a good example off the top of my head, but I'm sure it's somehow doable.

Solution 30 - Scripting

How about this: create a form to receive an email if a new item is on sale, and add a caching system that will serve the same content to anyone refreshing in less than X seconds.

This way you win in all scenarios: you get rid of the scrapers (they can scrape their email account instead), and you give a chance to the people who won't code something just to buy on your site! I'm sure I would get the email on my mobile and log in to buy something if I really wanted to.

Solution 31 - Scripting

How do you know there are scripters placing orders?

The crux of your problem is that you can't separate the scripters from the legitimate users and therefore can't block them, so how is it that you know there are scripters at all?

If you have a way to answer this question, then you have a set of characteristics you can use to filter the scripters.

Solution 32 - Scripting

Let's turn the problem on its head - you have bots buying stuff that you want real people to buy, so how about making a real chance that the bots will buy stuff that you don't want real people to buy?

Have a random chance of emitting some non-displayed HTML that the scraping bots will take for the real sale, but that real people never see (and don't forget that real people includes the blind, so consider screen readers etc. as well). Following it through leads to a purchase of something exorbitantly expensive (or doesn't complete the purchase at all, but captures payment details for you to put on a ban list).
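A loose sketch of what that hidden bait could look like (the offer parameter, URL, and session flag are all invented for illustration; this version just flags the session rather than completing a fake purchase):

<?php
session_start();

// With some random probability, emit a fake sale that is hidden from
// sighted users and screen readers alike; only a scraper should see it.
if (mt_rand(1, 10) === 1)
{
  echo '<div style="display:none" aria-hidden="true">'
     . '<a href="/buy?offer=boc-trap">Random Crap - $1.00</a>'
     . '</div>';
}

// later, in the /buy handler:
if (($_GET['offer'] ?? '') === 'boc-trap')
{
  $_SESSION['probable_bot'] = true; // ban-list the payment details, etc.
}
?>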

Even if the bots switch to 'alert the user' rather than 'make the purchase', if you can generate enough false alarms you may be able to make it sufficiently worthless for people (maybe not everyone, but some reduction in the scamming is better than none at all) not to bother.

Solution 33 - Scripting

Just a side-remark: it seems to me that the problem is that your expected user behaviour is very similar to a bot's (come in big waves, unauthenticated, click every button :)), so a CAPTCHA might be the only Turing test able to tell them apart :).

Solution 34 - Scripting

Not a complete fix, but I didn't see it here yet.

Track the "slamming" addresses, and put up a disclaimer saying that BOC/ items will not be shipped to any address that is not following your TOS.

This will have a psychological impact on some, and others who want to take advantage of your site will have to switch up their methods, but you will have negated one avenue for them.

Solution 35 - Scripting

As suggested above, I did some work on non-CAPTCHA forms, using a pre-calculated hash of the expected value of a result stored in the form. The idea is used in two WordPress anti-spam plugins: WP-Morph and WP-HashCash. The only drawback is that the client browser has to be able to interpret JavaScript.

Solution 36 - Scripting

So your problem is too much business? People are sniping your sales? This is assuming that these scripters are generating qualified sales? And the issue is that they are snapping up all your product before everyone else can?

How about you make a full web service API for 'scripters' to interface with? Then offer a slight discount or some kind of perk to make them play by your rules. Double your business and have both your web sales and API sales.

Either that or just get WAY more inventory - you can't fight it - embrace and adapt to it.

Solution 37 - Scripting

Here's my take. Attack the ROI of the bot owners, so that they'll instead do the legitimate thing you want them to do instead of cheating. Let's look at it from their point of view. What are their assets? Apparently, an unlimited number of disposable machines, IP addresses, and perhaps even a large number of unskilled people willing to do inane tasks. What do they want? To always get the special deal you are offering before other legitimate people get it.

The good news is that they only have a limited window of time in which to win the race. And what I don't think they have is an unlimited number of smart people who are on call to reverse engineer your site at the moment you unleash a deal. So if you can make them jump through a specific hoop that is hard for them to figure out, but automatic for your legitimate customers (they won't even know it's there), you can delay their efforts just enough that they get beat by the massive number of real people who are just dying to get your hot deal.

The first step is to make your notion of authentication non-binary, by which I mean that, for any given user, you have a probability assigned to them that they are a real person or a bot. You can use a number of hints to build up this probability, many of which have been discussed already in this thread: suspicious rate activity, IP addresses, foreign country geolocation, cookies, etc. My favorite is to just pay attention to the exact version of Windows they are using. More importantly, you can give your long-term customers a clear way to authenticate with strong hints: by engaging with the site, making purchases, contributing to forums, etc. It's not required that you do those things, but if you do then you'll have a slight advantage when it comes time to see special deals.

Whenever you are called upon to make an authentication decision, use this probability to make the computer you're talking to do more or less work before you will give them what they want. For example, perhaps some JavaScript on your site requires the client to perform a computationally expensive task in the background, and only when that task completes will you let them know about the special deal. For a regular customer, this can be pretty quick and painless, but for a scammer it means they need a lot more computers to maintain constant coverage (since each computer has to do more work). Then you can use your probability score from above to increase the amount of work they have to do.

To make sure this delay doesn't cause any fairness problems, I'd recommend making it be some kind of encryption task that includes the current time of day from the person's computer. Since the scammer doesn't know what time the deal will start, he can't just make something up, he has to use something close to the real time of day (you can ignore any requests that claim to come in before the deal started). Then you can use these times to adjust the first-come-first-served rule, without the real people ever having to know anything about it.

The last idea is to change the algorithm required to generate the work whenever you post a new deal (and at random other times). Every time you do that, normal humans will be unaffected, but bots will stop working. They'll have to get a human to get to work on the reverse-engineering, which hopefully will take longer than your deal window. Even better is if you never tell them if they submitted the right result, so that they don't get any kind of alert that they are doing things wrong. To defeat this solution, they will have to actually automate a real browser (or at least a real javascript interpreter) and then you are really jacking up the cost of scamming. Plus, with a real browser, you can do tricks like those suggested elsewhere in this thread like timing the keystrokes of each entry and looking for other suspicious behaviors.

So for anyone who you know you've seen before (a common IP, session, cookie, etc) you have a way to make each request a little more expensive. That means the scammers will want to always present you with your hardest case - a brand-new computer/browser/IP combo that you've never seen before. But by putting some extra work into being able to even know if they have the bot working right, you force them to waste a lot of these precious resources. Although they may really have an infinite number, generating them is not without cost, and again you are driving up the cost part of their ROI equation. Eventually, it'll be more profitable for them to just do what you want :)

Hope that's helpful,

Eric

Solution 38 - Scripting

Use hashcash.

> Hashcash is a denial-of-service counter-measure tool. Its main current use is to help hashcash users avoid losing email due to content-based and blacklist-based anti-spam systems.
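A rough PHP sketch of the principle (using a simplified, hypothetical stamp format; real hashcash stamps also carry a version, date, and random salt):

<?php
// Count the leading zero bits of a hex digest.
function leadingZeroBits($hexDigest)
{
  $bits = 0;
  foreach (str_split($hexDigest) as $hex)
  {
    $nibble = hexdec($hex);
    if ($nibble === 0) { $bits += 4; continue; }
    $bits += 3 - (int) floor(log($nibble, 2)); // zeros inside this nibble
    break;
  }
  return $bits;
}

// Client side: burn CPU until the stamp's SHA-1 has 20 leading zero
// bits (roughly a million tries on average).
function mintStamp($resource, $targetBits)
{
  for ($counter = 0; ; $counter++)
  {
    $stamp = $resource . ':' . $counter;
    if (leadingZeroBits(sha1($stamp)) >= $targetBits)
    {
      return $stamp; // e.g. mintStamp('woot-order:12345', 20)
    }
  }
}

// Server side: verifying costs one hash, no matter how hard minting was.
$stamp = $_POST['stamp'] ?? '';
$ok = strpos($stamp, 'woot-order:') === 0
   && leadingZeroBits(sha1($stamp)) >= 20;
?>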

Solution 39 - Scripting

Why not make the content the CAPTCHA?

On the page where you display the prize, always have an image file in the same location with the same name. When a bag o' crap sale is on, dynamically generate and load an image with the text advertising the prize; when no sale is on, just have some default image that integrates well with the site. It's the same concept as CAPTCHA: if the bot cannot figure out the meaning of the image, it will not be able to "win" it; if it can, it would have been able to figure out your CAPTCHA images anyway.

Solution 40 - Scripting

I don't know if this has been suggested yet, but rather than keeping a list of the bots' IPs, which you would need to scan through on every single page request, why not set a cookie or a session var to keep track of the bots? Here's an example in PHP:

<?php
// bot check: track repeat visits in the session, rather than trusting
// anything the client sends us ($_REQUEST is client-controlled).
session_start();

$now = microtime(true);

// counter of suspiciously fast visits for this session
$botCounter = 0;
if (isset($_SESSION['botCheck_panicCounter']))
{
  $botCounter = $_SESSION['botCheck_panicCounter'];
}

// if this seems to be a bot, stop serving it real content
if ($botCounter > 5)
{
  die('Die()!!');
}

// if this user visited before, check how quickly they came back
if (isset($_SESSION['botCheck_lastVisit']))
{
  $lastVisit = $_SESSION['botCheck_lastVisit'];
  $diff = $now - $lastVisit;

  // if it's less than a second
  if ($diff < 1)
  {
    // increase the bot counter and save it for the next request
    $botCounter += 1;
    $_SESSION['botCheck_panicCounter'] = $botCounter;
  }
}

// remember this visit for future comparisons
$_SESSION['botCheck_lastVisit'] = $now;

// ---------------
// rest of the content goes here
?>

I didn't check for syntax errors, but you get the idea.

Solution 41 - Scripting

First of all don't try to use technology to defeat technology.

Your issues:

  1. Usability of the site
  2. Making the site exciting and fun
  3. Load on the server caused by scripters.

Your Goals:

  1. Keep the site running at a speed not slowed by bots.
  2. Sell the item to non-scripting humans.
  3. Don't hassle the 'normal' users with any tasks to complete to prove they're human.

Goal #1: Keep the site running at a speed not slowed by bots.

This is actually pretty simple. Have someone else host the page. Instead of the front page being hosted on your servers, have Amazon S3 / Akamai host the page. Most of the page is 'static' anyhow. Regenerate the page every 5 minutes or so, so that the more dynamic items get refreshed. (Hell, regenerate it every 1 minute if you want.) But now the bots are not hitting your server - they are hitting Akamai's CDN, which can certainly take the load.

Of course do this for RSS feeds as well. There is no reason why some other service can't take the bandwidth / load hit for you. On a related note, have all images served by Akamai, etc. Why take the hit?

Goal #2: Sell the item to non-scripting humans

I am in agreement with others that say make it so that scripting gives no real advantage. However, scripting is also a sign of a passionate woot customer, so you don't want to be an a*hole either.

So I would say let them buy, but make them pay an inflated amount, or (more preferably) just slow them down so that others have a chance.

So each time a user hits the site, offer the bag of crap at $29.99, and have a timer drop or raise the price at a random speed. Have an image or some other indicator that tells humans whether the price will go lower if they are patient.

The user has a "Buy now!" button that they click when they see the price and number of items being what they want.

Example:

User:

  • 0 sec $29.99 (1 item) Image says:"Wait for a lower price!"
  • 7 sec $31.99 (1 item) Image says:"Wait for a lower price!"
  • 13 sec $27.99 (1 item) Image says:"Bet you can do better!"
  • 16 sec $1.99 (0 item) Image says:"You would be nuts to pay us something for nothing!"
  • 21 sec $4.99 (two items) Image says:"That's getting better!"
  • 24 sec $4.99 (tres itemos) Image says:"It doesn't get any better than that!"
  • 26 sec $8.99 (2 items) Image says:"Bet you can do better!"

repeat....

on a gradually tightening cycle that lengthens the time the correct "$4.99 (tres itemos)" offer is displayed.

If the bot hits refresh, then the cycle restarts. If the user misses and selects the wrong number of items / price, decide if you want to let them buy at that price.

If they "overspend" for example, they pay $24.99 for 3 items and woot was only going to charge them $4.99 for 3 items then include a coupon for $20 off their next woot purchase.

Goal #3: Don't hassle the 'normal' users with any tasks to complete to prove they're human.

You are making a logical fallacy here. You are assuming that any Turing test (http://en.wikipedia.org/wiki/Turing_test) has to be irritating. This is not true!

Here are some ideas:

  1. Create a game. The reward for playing the game is a $5 off coupon on the next order.
  2. Pair up 2 random users and have them chat with each other. Each user is told to ask the other user two questions: "What color is your hair?" and "What are you going to do next weekend?" Some users get paired with a woot random sentence generator. Each user is then asked if the other user is a human. If a user says the woot random sentence generator is human, reply "No I am not, and maybe you are from Mars as well. Do you want to try again?"
  3. Simple flash game that requires the user to maneuver through an obstacle course to get a discount coupon.
  4. Ask what city they are in. Then reverse geocode the IP address to see if they are close to being correct.
  5. Ask silly questions - "Do you think John McCain is a great president?" "Whose picture is on your driver's license?"

Only ask 3 times, since all you really want to do is slow down the script kiddies.

Solution 42 - Scripting

I agree with the poster above who suggested sometimes selling really 'crap' bags of crap.

You appear to have come up with a business model which is severely limited by the technology through which you are trying to deliver it. Yet like most tech-minded individuals (not a criticism; after all, that is what this site is for), you are trying to come up with a technical solution. BUT THIS IS A BUSINESS PROBLEM. It is being caused by a failure in the technology, but that does not mean that technology is the answer. And almost all the solutions that anyone comes up with (and there will be many options) will in the end be bypassed by those determined to 'auto-buy' (for want of a better short description) your 'bags of crap'.

IMHO you are asking the wrong people the wrong question and you are going to waste a lot of time and resource on the wrong solution.

Solution 43 - Scripting

I'm in agreement with the OP here - no CAPTCHAs please - it's not a very woot way of doing things.

Firstly, set a few bot traps. I'd mention BOC more often on the home page to trap the bots into looking; bots aren't intelligent, so word it differently each time, e.g. "BOC complaints up!" - bots just scanning for keywords will get trapped.

However, I think the real issue here is twofold. Firstly, the performance issues need to be addressed: today it's bots causing the problem, but it indicates to me that there is an underlying performance issue to fix.

Secondly, it's a business opportunity to shift some real crap at a profit. So I'd keep with the overall woot style and state: "we check for bots. If we think you are a bot, you will get a box of botcrap."

The bot checking would be done offline, sometime after the sale has been made, using bot traps, IP numbers, cookies, sessions, browser strings, etc. Do some serious analysis of the purchaser data you've got to decide who gets botcrap. If you decide to ship botcrap, then you free up some normal crap to sell to someone else.

Solution 44 - Scripting

Some ideas:

  1. Simple: don't name it "Random Crap." Change the name of the item every time so that the bots will have a harder time identifying it. They may still look for the $1.00 items, in which case I suggest occasionally selling $1 sticks of gum for a few minutes. The $5 shipping should make it worth your while.

  2. Harder: don't make the users do anything extra - make the users' computers do something extra. Write a JavaScript function that performs an intensive calculation taking a good amount of processing power - say, the ten-millionth prime number - and have the user's computer calculate that value and pass it back before you accept the order (perhaps even to create the "place order" URL). Change the function for every BoC so that bots can't pre-calculate and cache results (but so that you can). The calculation overhead might just slow down the bots enough to keep them off your backs - if nothing else, it would slow the hits on your servers so that they could breathe. You could also vary the depth of the calculation - ten-millionth prime versus hundred-millionth - at random so that the ordering process is no longer strictly first-come, first served, and to avoid penalizing customers with slower computers.

  • E

Solution 45 - Scripting

Upfront caveats:

I'm not script-literate; I haven't read many of the other comments here.

I stumbled on this from the Woot description this morning. I thought a few comments from a moderate user of the woot sites (and two-time manual purchaser of BOCs) might be helpful.

Woot is in a unique position where it is both a commerce site and a destination with loyal users, and I understand the perceived delicacy of that balance. But personally I feel your concern about the "negative user impact" of a Crap-CAPTCHA ("CRAPCHA" - somehow I doubt I'm the first to make that gag) on users is way overstated. As a user I'd be happy to prove I'm human. And I trust Woot to make the process fun and interesting, integrating it into the overall experience.

Will this lead to the "arms race" posited? I dunno, but it can only help. If, say, key information to purchase is included in the product image or implied in the product description (in a different way each time), about the best a script could do would be to open a purchase page on detection of the C-word. Actually, I think this is fine: you are still required to be on-line and first-come-first-served still applies -- Wootalyzer and similar tools just increase awareness rather than automating purchase while I sleep or work.

Good luck figuring this out, and keep up the good work.

JGM

Solution 46 - Scripting

How about selling RSA keys to each user :) Hey, if they can do it for WoW, you guys should be able to do it.

I expect a BoC for my answer ;)

Solution 47 - Scripting

Two solutions, one high-tech, one low-tech.

First the high-tech: The BOC offerings sell out in seconds because bots get many of them in the first few milliseconds. So instead of trying to defeat the bots, sell them what they are scanning for: a bag of crap. Worthless crap, of course: bent paper clips and defiled photos of Rosie O'Donnell. Then have built-in random delays on the server of a few seconds at a time. As the sale continues, the actual value of the product sold will increase while the sell price does not. That way the first buyers (bots in the first few milliseconds) will get something worth much less than what they paid (brown onion cakes?), the next buyers (slower bots or faster humans) will get something unspectacular but worth the purchase price (bought on consignment?), and the last buyers (almost all humans) will get something worth more than the purchase price (break out champagne?). That flat-screen TV might be in the very last BOC purchased.

Anyone that waits too long will miss out, but at the same time anyone who buys too quickly will get hosed. The trick is to wait for some amount of time...but not too much. There's some luck involved, which is as it should be.

The low-tech solution would be to change up the name of the BOC to something humans can interpret but bots can't. Wineskin of excrement? Sack containing smelliness? Topologically flat surface adjacent to assorted goods? Never use the same name twice, use marginally different pictures, and explain in the product description what is actually being sold.

Solution 48 - Scripting

I probably don't understand the problem fully, but this idea occurred to me. Use AJAX to draw and update the dynamic content at a fixed interval while making the full page deliberately slow to load using refresh.

For example, make the whole page take a full 15 seconds to draw the first time it is visited, after which dynamic content is automatically refreshed using AJAX after a set time of, say, 5 seconds. It would be a major disadvantage to do a full page reload. The page may regularly display new information (including ads), but a full page redraw using reload would be considerably slower.

It will be possible for script kiddies to figure out the AJAX query and automate it but, then, it would also be very easy to rate-limit those requests from the same IP. Since there is no typical method for a standard human user to initiate those requests from the browser, it would be obvious that high-rate requests to the AJAX URL from the same IP would be initiated by some form of automated system.

Solution 49 - Scripting

Instead of blocking suspected IPs, it may be effective to reduce the amount of data you give to an address as its hits/min goes up. So if the bot hits you more often than a secret, randomly changing threshold, it will not see the data. Logged-in users would always see the data. Logged-in users that hit the server too often would be forced to re-authenticate, or be given a CAPTCHA.

Solution 50 - Scripting

The solution to this may be to attach a little bit of client side processing to actions of logging in and buying. The processing can be a negligible amount so that individuals are not affected but bots attempting to do the tasks many times will be hampered by the extra work load.

The processing could be a simple equation solved in JavaScript, unless you don't want to require JavaScript on your site.

Solution 51 - Scripting

Hm, I remember having read "Linux Firewalls" Attack Detection and Response with ... The situations there seem very comparable, and someone else has suggested this too: just block a client temporarily, or in progressive steps, to throttle them down. If it's really coming from a few sites, this should be quite efficient.

Regards

Solution 52 - Scripting

Use JavaScript to dynamically write the info into the page. Without a JS rendering engine, surely the screen-scrapers & bots won't be able to read the information.

Solution 53 - Scripting

The method I will describe has two requirements: 1) JavaScript is enforced, and 2) the client is a web browser with a valid browser session (http://msdn.microsoft.com/en-us/library/bb894287.aspx).

Without either of these you are "by design" out of luck. The internet is built, by design, to allow anonymous clients to view content; there is no way around this with simple HTML. Oh, and I just wanted to say that simple, image-based CAPTCHAs can be defeated easily - even their authors admit to this.

Moving along to the problem and the solution. The problem is in two parts. The first is that you cannot block out an individual for "doing bad things". To fix this, you set up a method that takes in the browser's valid session, generates an md5sum + salt + hash (of your own private devising), and sends it back to the browser. The browser is then REQUIRED to return that hashed key during every POST/GET. If you never get a valid browser session, then you reply with "Please use a valid web browser, blah blah blah". All popular browsers have valid browser session IDs.
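A loose sketch of that first piece in PHP (the secret salt, the parameter name, and the error text are all placeholders):

<?php
session_start();

// A private salt of your own devising; never sent to the client raw.
const SERVER_SECRET = 'replace-with-private-salt';

// Every request after the first must echo the issued key back.
if (isset($_SESSION['proof'])
    && ($_REQUEST['proof'] ?? '') !== $_SESSION['proof'])
{
  http_response_code(403);
  exit('Please use a valid web browser.');
}

// First contact: issue the hashed key for this browser session.
if (!isset($_SESSION['proof']))
{
  $_SESSION['proof'] = md5(session_id() . SERVER_SECRET . mt_rand());
}

// ...then embed $_SESSION['proof'] in every form and link you render.
?>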

Now that we have an identity, at least for that browser session (I know it does not lock anyone out permanently, but it is quite difficult to "renew" a browser session through simple scripting), we can effectively lock out a session (i.e., make it annoyingly hard for scripters to visit your site, with no penalty to valid users).

Now this next part is why it requires JavaScript. On the client you build a simple hash over each character that comes from the keyboard versus the value of the text in the textarea. That key comes over to the server as a simple hash and has to be validated. While this method could easily be reverse engineered, it is one extra hoop that individuals have to go through before they can submit data. Mind you, this only prevents auto-posting of data, not DoS via constant visits to the web site. If you have access to Ajax, there is a way to send a salt and hash key across the wire and use JavaScript with it to build the per-keypress "valid token" that gets sent across the wire. Yes, like I said, it could easily be reverse engineered, but you see where I am going with this, hopefully.

Now, to prevent constant abuse via traffic: there are ways to establish patterns once you have a valid session ID. These patterns (even if randomness is used to offset request times) have a lower variance than if, say, a human were attempting to reproduce the same margin of error. Since you have a session ID, and you have a pattern that "appears to be a bot", you can block out that session with a simple lightweight response that is 20 bytes instead of 200,000 bytes.

You see, the goal here is to 1) make the anonymous non-anonymous (even if it's only per session) and 2) develop a method to identify bots vs. normal people by establishing patterns in the way they use your system. You can't say that the latter is impossible, because I have done it before. While my implementations were for tracking video game bots, I would think those algorithms for identifying a bot vs. a user can be generalized to website visits. If you reduce the traffic that the bots consume, you reduce the load on your system. Mind you, this still does not prevent DoS attacks, but it does reduce the amount of strain a bot puts on the system.

Solution 54 - Scripting

I think that sandboxing certain IPs is worth looking into. Once an IP has gone over a threshold, when it hits your site, redirect it to a web server that has a multi-second delay before serving out a file. I've written Linux servers that can hold 50K connections open with hardly any CPU, so it wouldn't be too hard to slow down a very large number of bots. All the server would need to do is hold the connection open for N seconds before acting as a proxy to your regular site. This would still let regular users use the site even if they were really aggressive, just at a slightly degraded experience.

You can use memcached as described here to cheaply track the number of hits per IP.
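Something like this, for instance (assumes a Memcached server on localhost via the pecl/memcached extension; the threshold is arbitrary):

<?php
$m = new Memcached();
$m->addServer('127.0.0.1', 11211);

// One counter per IP, expiring after a 60-second window.
$key  = 'hits:' . $_SERVER['REMOTE_ADDR'];
$hits = $m->add($key, 1, 60) ? 1 : $m->increment($key);

if ($hits > 100)
{
  // hand this connection over to the slow proxy tier described above
}
?>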

Solution 55 - Scripting

To solve the first problem of the bots slamming your front page, try making the honeypot exactly the same as a real bag of crap. Make the HTML markup for the front page include the same markup as if it were for a bag of crap, but keep it hidden. This would force the bots to include CSS engines to determine whether the bag-of-crap code is displayed or hidden. Alternatively, you could output this 'fake' bag-of-crap HTML only a random amount of time (hours?) before a real bag of crap goes up. This would cause the bots to sound the alarm too soon (but without knowing how soon).

To cover the second step of actually purchasing the bag of crap, add simple questions. I prefer common-sense questions to the math questions suggested above. Things like, "Is ice hot or cold?" or "Are ants big or small?" Of course, these would need to be randomized and pulled from a never-ending supply of questions, or else the bots could be programmed to answer them. These questions, though, are still much less of an annoyance than CAPTCHAs.

Solution 56 - Scripting

What about using Flash?

Yes, I know the overhead of using Flash, plus the fact that some users will be locked out of buying the bag-o-crap (e.g., iPhone users), might make this detrimental, but it seems to me that Flash would prevent screen scraping, or at least make it difficult.

Am I wrong?

Edited to add

What about including a couple of "hidden" fields on your submission form, like what I found below:

> Actually, best practice seems to be to use two hidden fields, one with an initial value, and one without. It's the rare bot which can ignore both fields. Check for one field to be blank, and the other to have the initial value. And hide them using CSS, not by making them "hidden" fields:
>
>     <div style="display: none">
>       <p>Please don't change the next two fields.</p>
>       <input type="text" name="address2" id="address2" value="xyzzy">
>       <input type="text" name="address3" id="address3" value="">
>     </div>
>
> Bots tend to like fields with names like 'address'. The text in the paragraph is for those few rare human beings who have a non-CSS capable browser. If you're not worried about them, you can leave it out.
>
> In the logic for processing the form, you'd do something like:
>
>     if (address2 == "xyzzy" and address3 == "") { /* OK to send */ }
>     else { /* probably have a bot */ }

Solution 57 - Scripting

  • Go after the money stream. It is much easier than tracking the IP side. Making bots pay too much a few times (an announcement with white text on a white background, and all variants of it) kills their business case quickly. You should prepare this carefully, and make good use of the strong point of bots: their speed. Did you try a few thousand fake announcements a few seconds apart? If they are hitting ten times/second you can go even faster. You want to keep this up as long as they keep buying, so think carefully about the moment of the day/week you want to start this. Ideally, they will stop paying, so you can hand over your case to a bank.
  • Make sure your site is fully generated, and each page access returns different page content (HTML, JavaScript and CSS). Parsing is more difficult than generating, and it is easy to build in more variation than bot developers can handle. Keep on changing the content and how you generate it.
  • You need to know how fast bots can adapt to changes you make, and preferably the timezone they are in. Is it one botnet or more? Are they in the same timezone, a different one, or is it a worldwide developer network? You want your counterattack to be timed right.
  • Current state-of-the-art bots have humans enter CAPTCHAs (offered in exchange for porn/games).
  • Make it unattractive to react very fast.
  • Use hashes and honeypots, as Ned Batchelder explains.

[edit] It is simply not true that you cannot defend against botnets. My second suggestion in particular provides an adequate defense against automated buyers. It requires a complete rethinking of the technology you're using, though. You might want to do some experiments with Seaside, or alternatively directly in C.

Solution 58 - Scripting

Assumed non-negotiables:

The first screen needs to be dead-simple, low-overhead HTML, with a single, easily identifiable (bot-wise or people-wise) button to click, or equivalent, to indicate unambiguously "I want my Crap". Because we assume the worst case - you have the equivalent of a DoS attack from a combination of bots and non-bots, indistinguishable from one another on their first click. So let's hand these out as quickly as we can from caches, benign echo-bots, etc.

(Note: As far as wooters are concerned, this is what happens anyway; it's just as painful for users as for Woot, so anything that helps absorb or mitigate the first screen acquisition is in the interests of all of the 3 parties involved.)

Then, the process needs to be no more aggravating for non-bots than it currently is, with no additional steps (or pain) for legits. (Background note on current design: current wooters usually will already be signed on, or can sign on during the purchase process. New buyers need to register during purchase. So it's practically quicker to be already registered, and quicker yet to be already logged on.)

To complete the crap sale, a progression of transaction screens needs to be navigated (say 5, plus or minus, depending on circumstances). The winners are the first to complete the full navigation. The current process rewards the bots (or anyone else) who complete the entire sequence of 5 screens most quickly; the entire progression is biased toward fast responses (i.e. bots).

No question the bots will have the advantage for the first screen; and whatever edge they have achieved from that point, they keep through the rest of the screens, plus whatever advantage botness provides at other stages as well.


What if Woot were to intentionally decouple the queuing process after the first screen, and feed every session from that point into a sequence of fixed-minimum-time steps? The second screen wouldn't even be presented until 30 seconds had passed; after it was submitted, same for the following screens. I bet wooters would have no problem if they were told that, after the first screen, they would wait in a queue (which is already true) that would spread the load over time in a way that should take no longer than before, be more robust, and help weed out the bots. At this point you can throw in some of the bot speedbumps listed above (subtle variations in DOM objects, etc.) Just the benefit from the perception that Woot is a little more in control of things would help.

If a much higher proportion of the BOC initial hits could segue into a bot-unfriendlier non-time-critical process on their first hit (or close to it), rather than retrying, then real people who get past that point would have more confidence. For sure it would be less hostile than the current situation. It might cut down on the background-noise-ambient-bot-rate that's going on all the time even under normal Woot-Off circumstances. And the bots would lay off the main page and sit in the queue with each other (and everyone else) where they have no advantage.


Hmmm... The concept "apartment-threaded" comes to mind. I wonder if the pattern is approximately useful?
A useful core concept here is being able, after the first screen, to track accumulated total time in the queue and adjust it to a standard. As a bot-mitigation strategy, you would have a little bit of flexibility to maybe fudge the very earliest sessions by 5-10 seconds; doing so would probably be undetectable, but would result in a richer non-bot purchase mix. I'm sure you have statistics to help evaluate stuff like this after the fact.
Just for fun, you could (at least for one wootoff) put together your own bot that combines the best features you've seen, and then hand it out to everyone the day before. Then at least everyone would be equally armed. (Then duck ... incoming ...)

Solution 59 - Scripting

I like BradC's answer (using the suggestions in Ned Batchelder's article), but I want to add another level to it. You may be able to randomize not only the field names, but also the field positions and the code that makes them invisible.

Now, this last bit is the hard part, and I don't know exactly how to do it, but someone with more JavaScript and CSS experience might be able to figure it out. Of course, you can't just keep the same positions all the time, because the scripters will just figure out that the element at position (x,y) is the real one. You would have to have some code that changes the positioning of form elements relative to other elements in order to move them off the page, overlay them on each other, etc. Then obfuscate the code that does this, with some randomness introduced into it. Automatically change the obfuscation daily, before a new item is made available. The idea is that without a proper CSS and JavaScript implementation (and code to read the layout of the page as a human would), a bot won't be able to figure out which elements are being shown to the user. Your server-side code, of course, knows which fields are real and which are fake.

In summary:

  • The field names are random
  • The field order is random
  • The field hiding code is complex
  • The field hiding code is obfuscated - randomly
  • The random factors are automatically changed every day by server-side code

With the constraints you've given I don't think there is a way to avoid an "arms race" of some kind, but that doesn't mean all is lost. If you can automate your side of the arms race and the scripters cannot then you would win it every time.

Solution 60 - Scripting

Make it unprofitable for the bot users and they'll go away pretty quickly - that is, occasionally sell something that no human being could possibly ever want (a bag of literal crap maybe).

Solution 61 - Scripting

How about a delay page where the user must wait for a delay that is shown in an image?

You only allow ordering from the page they get to if they click within a short enough time period of that specified in the image; the image could be a countdown in an animated GIF, or a very small JavaScript or Flash timer.

If they jump to the details page outside the time limit, they see an expensive item as discussed in previous answers.

Solution 62 - Scripting

I am not 100% sure this would work, at least not without trying.

But it seems as if it should be possible, although technically challenging, to write a server-side HTML/CSS scrambler that takes as its input a normal html page + associated files, and outputs a more or less blank html page, along with an obfuscated javascript file that is capable of reconstructing the page. The javascript couldn't just print out straightforward DOM nodes, of course... but it could spit out a complex set of overlapping, absolute-positioned divs and paragraphs, each containing one letter, so it comes out perfectly readable.
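As a toy illustration of the letters-as-positioned-spans output (not a real obfuscator, just the core trick):

<?php
// Emit a phrase as absolutely positioned one-letter spans in shuffled
// DOM order: the source order reveals nothing, but a browser renders
// it perfectly readably.
function scramble($text)
{
  $spans = array();
  foreach (str_split($text) as $i => $ch)
  {
    $spans[] = sprintf(
      '<span style="position:absolute;left:%dpx;top:0">%s</span>',
      $i * 12, // 12px per character cell
      htmlspecialchars($ch)
    );
  }
  shuffle($spans); // DOM order no longer matches reading order
  return '<div style="position:relative;height:1em">'
       . implode('', $spans) . '</div>';
}

echo scramble('Random Crap: $1.00');
?>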

Bots won't be able to read it unless they employ a complete rendering engine and enough AI to reconstruct what a human would be seeing.

Then, because it's an automated process, you can re-scramble the site as often as you have the computational power for - every minute, or every ten minutes, or every hour, or even every page load.

Granted, writing such an obfuscator would be difficult, and probably not worth it. But it's a thought.

Solution 63 - Scripting

There's a lot of suggestions here so pardon me if this has already been posted.

The first thing I would do is make the ordering a two-step process. The first step would pass back a GUID while logging the IP address. The second step would receive the GUID and compare it against the IP addresses that have been logged. In conjunction with blocking IP addresses which are spamming the site (i.e., faster than a human can click refresh), this technique could stop spammers from successfully making purchases, thereby solving goals 1 and 3.

The second goal is problematic, but I would keep a running list of your regular users' IP addresses and throttle traffic for any newcomers. This could leave first-time visitors and dial-up users (due to changing IP addresses) out in the cold, but I think it's just making the best of a bad situation by giving preference to repeat business... and dial-up users, well, it's questionable whether they'd "win" even if there weren't any spammers anyway.

Solution 64 - Scripting

Why don't you block the credit cards of users you identify as bots?

  1. Publish that using bots is illegal on your website
  2. Find certain heuristics that identify bots (this can be done, for example, by short-term IP tracking or by the time it takes them to fill out the form)
  3. If someone you tagged as a bot purchased the item, block his credit card for future use
  4. Next time he tries to make a purchase, disallow it and return the item to stock

I guess even the professionals will run out of credit cards eventually.

Your server load should decrease with time once the botters give up on you. Another idea is to separate your pages between servers - e.g., RSS feed on one server, homepage on another, checkout on another one.

Good luck.

Solution 65 - Scripting

I'm pretty sure your server already logs all the IPs of incoming requests (most do) - so the data is already there.

Maybe you could:

Just validate the "winner" by verifying that its IP shows up fewer than a certain threshold number of times in the logs (I use "grep | wc -l" to get the count). If it's over your threshold, temporarily block that IP (an hour or so?).

Disqualify any "winner" with the same shipping address or payment info as the "last" winner, or that has won within a certain time frame to spread the "winning" around.

The bots won't get 'em all that way.

To annoy the crap out of the scrapers: when the "random crap" item goes up, run the HTML output for that page through a "code obfuscator"... which doesn't change the "display" of the page... just scrambles the code with randomly generated IDs etc.

More insidious:

Increase the price charged for the "won" item based on how many times the winning IP shows up in the logs. Then even if the bots win, so do you. :-)

Solution 66 - Scripting

Trying to target the bots themselves will never solve the problem - whoever is writing them will figure out a new way around whatever you've put in place. However, forcing the user to think before buying would be a much more effective solution. The best way of doing this that I can think of is to run a Dutch auction: start the price high (2x what you'd buy it for in a shop) and decrease it over time. The first person to hit Buy gets it. I don't think any bot is intelligent enough to work out the best price for the item.

Solution 67 - Scripting

Restrict the times at which you release offers: For example: only from 7 minutes to 8 minutes past the start of an hour. Do not deviate from this, and give penalties on the order of a couple seconds to IPs which check a lot in the half hour before the release time. It then becomes advantageous for bot owners to only screen scrape for a couple minutes every hour instead of all. the. time. Also, because a normal person can check a site once every hour but not every second, you put normal people on a much more even footing with the bots.

Cookies: Use a tracking cookie composed of only a unique ID (a key for a database table). Give "release delays" to clients with no cookie, invalid cookies, clients which use the same cookie from a new IP, or cookies used with high frequency.

Identify likely bots: Cookies will cause the bots to request multiple cookies for each IP they control, which is behavior which can be tracked. IPs with only a single issued cookie are most likely normal clients. IPs with many issued cookies are either large NAT-ed networks, or a bot. I'm not sure how you would distinguish those, but companies are probably more likely to have things like DNS servers, a web page, and things of that nature.

Solution 68 - Scripting

Perhaps you need a solution that makes it totally impossible for a bot to distinguish between the bag-o-crap sales and all other content.

This is sort of a variation on the captcha theme, but instead of the user authenticating themselves by solving the captcha, the captcha is instead the description of the sale, rendered in a visually pleasing (but perhaps somewhat obscured by the background) manner.

Solution 69 - Scripting

I think your best bet is to watch IPs coming in, but to mitigate the issues you mention in a couple of ways. First, use a probabilistic data structure (e.g., a Bloom filter) to mark IPs which have been seen before. This class of algorithm is very fast, and scales well to absolutely massive set sizes. Second, use a graduated response, whereby a server delay is added to each request, predicated on how often you've seen the IP 'recently'.
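A back-of-the-envelope Bloom filter in PHP, to make the first half concrete (in practice the bit array would live in shared memory or memcached rather than being rebuilt on every request; sizes and hash count here are arbitrary):

<?php
// k seeded hashes over a fixed bit array; false positives are
// possible, false negatives are not.
class BloomFilter
{
  private $bits;
  private $size;
  private $hashes;

  public function __construct($size = 1048576, $hashes = 4)
  {
    $this->size   = $size;
    $this->hashes = $hashes;
    $this->bits   = str_repeat("\0", (int) ($size / 8));
  }

  private function positions($item)
  {
    $p = array();
    for ($i = 0; $i < $this->hashes; $i++)
    {
      $p[] = (crc32($i . ':' . $item) & 0x7fffffff) % $this->size;
    }
    return $p;
  }

  public function add($item)
  {
    foreach ($this->positions($item) as $pos)
    {
      $byte = (int) ($pos / 8);
      $this->bits[$byte] = chr(ord($this->bits[$byte]) | (1 << ($pos % 8)));
    }
  }

  public function probablyContains($item)
  {
    foreach ($this->positions($item) as $pos)
    {
      if (!(ord($this->bits[(int) ($pos / 8)]) & (1 << ($pos % 8))))
      {
        return false;
      }
    }
    return true;
  }
}

$seen    = new BloomFilter();
$ip      = $_SERVER['REMOTE_ADDR'];
$delayMs = $seen->probablyContains($ip) ? 250 : 0; // graduated response
$seen->add($ip);
?>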

Solution 70 - Scripting

At the expense of usability for those with screen readers, you could, on 90% of the pages, use unlabelled, indistinguishable picture buttons. Rotate the pictures regularly, and use a random generator and random sorting to lay out two buttons that say "I want this" and "I am a bot". Place them side by side, in a different order each time. At each stage a user can make progress towards their target, but a bot is more likely to make a mistake (a 50% chance of error at each step, compounding over the number of steps). It's like a CAPTCHA at every stage, only easier for the user and slower for bots, who need to prompt their master at EVERY single step. Put the price, the confirm button, and the item description in pictures. It sucks, but it's likely more successful.

Solution 71 - Scripting

Just make the bots compete on even ground. Encrypt a timestamp and stick it in a hidden form field. When you get a submission, decrypt it and see how much time has passed; if the form came back faster than a human could possibly type, reject it. Now bots and humans can only try to buy the bag of crap at the same speed.
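A quick sketch, using an HMAC rather than encryption (which is enough here, since the timestamp isn't secret; the field name and the 5-second floor are arbitrary choices):

<?php
const FORM_KEY = 'replace-with-private-key';

// When rendering the form: stamp it with the issue time plus a MAC.
$issued = time();
$token  = $issued . ':' . hash_hmac('sha256', (string) $issued, FORM_KEY);
echo '<input type="hidden" name="t" value="' . $token . '">';

// When handling the submission: verify the MAC, then the elapsed time.
list($then, $mac) = array_pad(explode(':', $_POST['t'] ?? ''), 2, '');
$genuine = hash_equals(hash_hmac('sha256', $then, FORM_KEY), $mac);
$tooFast = (time() - (int) $then) < 5; // faster than a human can type

if (!$genuine || $tooFast)
{
  exit('Rejected.');
}
?>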

Solution 72 - Scripting

If you can't beat them... Change the rules!

Why not provide a better system than the scripters have made for themselves?
Modify your site to be fairer for people not using bot scripts. People register (CAPTCHA or email verification) and effectively enter a lottery competition to win!

'Winning' makes it more fun, and each person pays a small entry fee, so the winner gets the product for EVEN less.

Solution 73 - Scripting

I'm not a web developer, so take this with a pinch of salt, but here's my suggestion -

Each user has a cookie (containing a random string of data) that determines whether they see the current crap sale.

(If you don't have a cookie, you don't see them. So users who don't enable cookies never see crap sales; and a new user will never see them the first time they view the page, but will thereafter).

Each time the user refreshes the website, he passes his current cookie to the server, and the server uses that to decide whether to give him a new cookie or leave the current one unchanged; and based on that, decides whether to show the page with or without the crap sale.

To keep things simple on the server side, you could say that, at any given time, there's only ever one cookie that will let you see crap sales; and there are a couple of other cookies labelled "generated in the last 2 seconds", which will always be kept unchanged. So if you refresh the page faster than that, you can't get a new one.

(...ah, well, I guess that doesn't stop a bot from restoring an older cookie and passing it back to you. Still, maybe there's a solution here somewhere.)

Solution 74 - Scripting

Stopping all bots would be quite difficult, especially without using a CAPTCHA. I think you should approach this from the standpoint of implementing a wide variety of measures to make life harder for the scripters.

I believe this is one measure that would weed out some of them:

You could try randomizing the IDs and class names of your tags with each response. This would force bots to rely on the position and context of important tags, which requires a more sophisticated bot. Furthermore, you could randomize the position of the tags if you want to use relative or absolute positioning in your CSS.

The biggest drawback with this approach is that you would have to take steps to ensure your CSS file is not cached client-side, because it would of course need to contain the randomized IDs & class names. One way to overcome this is to not use external CSS files and instead put the CSS with the randomized selectors in the <head></head> section of the page. That way, the randomized CSS is delivered with the rest of the page instead of being cached separately.
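A rough sketch of the name-randomizing part (the field list here is hypothetical; it would be whatever your order form actually uses):

<?php
session_start();

// Map real field names to throwaway random ones, once per response.
$fields = array('email', 'quantity', 'address');
$map = array();
foreach ($fields as $real)
{
  $map[$real] = 'f' . bin2hex(random_bytes(6)); // e.g. "f3a9c0d412ab"
}
$_SESSION['field_map'] = $map;

// Rendering uses only the random names:
echo '<input type="text" name="' . $map['email'] . '">';

// On submit, translate the random names back to the real ones:
$submitted = array();
foreach ($_SESSION['field_map'] as $real => $random)
{
  $submitted[$real] = $_POST[$random] ?? null;
}
?>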

Solution 75 - Scripting

Steps:

(combining ideas from another poster and gif spammers)

  • Display the entire offer page as an image, ad-copy and all.

  • Encrypt the price in the URL.

Attacks:

  1. Bots going to the URL to view the price on the checkout page
  • turn the checkout price tag into an image, or
  • apply a captcha before users can go to the order page.

  2. Chewing up bandwidth
  • Serve special offers using images, normal offers using HTML.

  3. Reckless bot ordering
  • Some of the special "image" offers are actually at normal prices.

  4. RSS scraping
  • RSS feeds must be paid for by hashcash or captchas.
  • This has to be on a per-request basis.
  • It can be pre-paid; for instance, a user can enter 20 captchas for 200 RSS look-ups.
  • Once the threat of DDoS has been mitigated, you can implement e-mail notification of offers.

Solution 76 - Scripting

How about coming up with a way to identify bots, probably IP-based, but not blocking them from accessing the site - just don't allow them to actually buy anything. That is, if they buy, they don't actually get it, since bots are against the terms of use.

Solution 77 - Scripting

The problem with CAPTCHA is that when you see a crap sale on Woot, you have to act VERY fast as a consumer if you hope to receive your bag of crap. So, if you are going to use a form of CAPTCHA, it must be very quick for the customer.

What if you had a large image, say 600 x 600, that was just a white background with dots of different colors or patterns randomly placed on it. The image would have an image map on it. This map would have a link mapped to small chunks of the image - say, 10 x 10 blocks. The user would simply have to click on the specific type of dot. It would be quick for the end user, and it would be somewhat difficult for a bot developer to code. But this alone may not be that difficult for a good bot creator to get past, so I would add ciphered URLs.

I was developing a system some time back that would cipher URLs. If every URL on these pages is ciphered with a random IV, then they all appear unique to the bot. I was designing this to confuse probing bots. I have not completed the technique yet, but I did have a small site coded that functioned in this manner.

While these suggestions are not a full solution, they would make it way harder to build a working bot while still being easy for a human to use.

Solution 78 - Scripting

There's probably no good solution as long as the surprise distribution of the bag o' crap is tied only to a point in time - since bots have plenty of time, and the resources to keep slamming the site at short time intervals.

I think you'd have to add an extra criterion that bots can't screen-scrape or manipulate from their end. For instance, say at any given time there are 5000 humans hitting the page a few times a minute looking for the bag of crap, and 50 bots slamming it every second. In the first few seconds after it appears, the 50 bots are going to snap it all up.

So, you could add a condition that the crap appears first to any users where the modulus 30 of their integer IP is a random number, say 17. Maybe another random number is added every second, so the crap is revealed incrementally to all clients over 30 seconds.

Now imagine what happens in the first several seconds: currently, all 50 bots are able to snap up all the crap immediately, and the humans get 0. Under this scheme, after 6 seconds only 10 bots have made it through, while 1000 humans have gotten through, and most of the crap goes to the humans. You could play with the timings and the random modulus to try and optimize that interval, depending on user counts and units available.
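A sketch of the reveal condition, assuming a known $saleStart timestamp and IPv4 addresses (and PHP 7.1+, where shuffle() follows mt_srand(), so every request computes the same reveal order):

<?php
// One more residue class of (IP mod 30) is revealed each second, in a
// shuffled order that is identical for every request.
$elapsed = time() - $saleStart;
$residue = (ip2long($_SERVER['REMOTE_ADDR']) & 0x7fffffff) % 30;

$order = range(0, 29);
mt_srand($saleStart);   // deterministic shuffle, same for all clients
shuffle($order);

$revealed = array_slice($order, 0, max(0, min(30, $elapsed + 1)));
$showSale = in_array($residue, $revealed, true);
?>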

Not a perfect solution, but an improvement. The upside is many more humans than bots will benefit. There are several downsides, mainly that not every human gets an equal shot at the crap on any particular day - though they don't have much of a shot now, and I'd guess even without bots, most of them get shut out at random unless they happen to refresh at just the right second. And, it wouldn't work on a botnet with lots of distributed IPs. Dunno if anyone's really using a botnet just for woot crap though.

Solution 79 - Scripting

Your end goal is to spread out to a larger user base who gets to buy stuff.

What if you did something like releasing your bags of w00t over a period of an hour or two, and over a range of IP addresses, instead of releasing them all at the same time and to any IP address?

Let's say you have 255 bags of w00t. Addresses in 1.0.0.0/8 can buy in the first minute, 2.0.0.0/8 in the second minute (potentially 2 bags of w00t available), etc., etc.

Then, after 255 minutes, you have made bags of w00t available to everybody, although it is highly likely that not all 255 bags of w00t are left.

This limits a true attack to users who have computers spread across many IP ranges, although a bot user might still be able to "own" the bag of w00t assigned to their own range.

There is no requirement that you match up bags to IPs fairly (and you definitely should use some type of MD5 / random seed thing)... if you distribute 10 bags of w00t incrementally, you just have to make sure they get distributed evenly across your population.

If IP's are bad then you can use cookies and exclude the use case where a non-cookied user gets offered a bag of w00t.

If you notice that a particular IP, cookie, or address range has an extreme amount of traffic, make the bag of w00t available to them proportionally later / last, so that occasional / steady / slow visitors are given opportunities before heavy / rapid / probable bot users.

--Robert

Solution 80 - Scripting

I would recommend a firewall-based solution. Netfilter/iptables, as most firewalls, allows you to set a limit to the maximum number of new page requests per unit time.

For example, to limit the number of page views dispensed to something human - say, 6 requests every 30 seconds - you could issue the following rules:

# the BADGUY chain tags the source in the "badguys" list, then drops it
iptables -N BADGUY
iptables -A BADGUY -m recent --set --name badguys -j DROP

# drop anyone already tagged as a bad guy for the next 5 minutes
# (the 5-minute window is an arbitrary choice)
iptables -A INPUT -m recent --name badguys --rcheck --seconds 300 -j DROP

# track each new HTTP connection; tag clients exceeding 6 hits per 30s,
# and drop bursts of more than 2 hits in 3s outright
iptables -A INPUT -p tcp --dport http -m state --state NEW -m recent --name http --set
iptables -A INPUT -p tcp --dport http -m state --state NEW -m recent --name http --rcheck --seconds 30 --hitcount 6 -j BADGUY
iptables -A INPUT -p tcp --dport http -m state --state NEW -m recent --name http --rcheck --seconds  3 --hitcount 2 -j DROP

Note that this limit would apply to each visitor (strictly, each source IP) independently, so one user's misuse of the site wouldn't affect any other visitor.

Hope this helps!

Solution 81 - Scripting

You could reduce the load on your server by having the RSS and HTML update at the same time, so there's no incentive for the bots to screen-scrape your site. Of course, this gives the bots an advantage in buying your gear.

If you only accept payments via credit card (might be the case, might not be, but it shows my line of thinking) only allow a user to buy a BOC once every 10 sales with the same account and/or credit card. It's easy for a script kiddie to get a swarm of IPs, less easy for them to get a whole heap of credit cards together. And as you've said IPs are really hard to ban, while temporary bans on credit cards should be a walk in the park.

You could let everyone know what the limit is, or you could just tell them that because of the high demand and/or bot interest there's throttling implemented on the purchasing while being unspecific about the mechanism.

Each attempt to purchase during the throttling period could trigger an exponential backoff - you buy a BOC, you have to wait for 10 sales to pass before you try again. You try again anyway on the next sale, and now you have to wait 20 sales, then 40, then 80...

This is only really useful if it's really unlikely that a human user would manage to get a BOC twice in less than 10 sales. Tune the number as appropriate.

Solution 82 - Scripting

There are a few solutions you could take, based on the level of complexity you want to get into.

These are all based on IP tracking, which falls apart somewhat under botnets and cloud computing, but should thwart the vast majority of botters. The chance that Joe Random has a cloud of bots at his disposal is far lower than the chance that he's just running a Woot bot he downloaded somewhere so he can get his bag of crap.

Plain Old Throttling

At a very basic, crude level, you could throttle requests per IP per time period. Do some analysis and determine that a legitimate user will access the site no more than X times per hour. Cap requests per IP per hour at that number, and bots will have to drastically reduce their polling frequency, or they'll lock themselves out for the next 58 minutes and be completely blind. That doesn't address the bot problem by itself, but it does reduce load, and increases the chance that legitimate users will have a shot at the item.

Adaptive Throttling

A variant on that solution might be to implement a load balancing queue, where the number of requests that one has made recently counts against your position in the queue. That is, if you keep slamming the site, your requests become lower priority. In a high-traffic situation like the bag of crap sales, this would give legitimate users an advantage over the bots in that they would have a higher connection priority, and would be getting pages back more quickly, while the bots continue to wait and wait until traffic dies down enough that their number comes up.
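A hedged sketch of such a queue: requests are heap-ordered by how much traffic the source IP has generated recently, so light users are served first. The reset policy for the counters is left illustrative:

import heapq, itertools
from collections import defaultdict

recent = defaultdict(int)   # ip -> requests this period (reset periodically)
queue, tiebreak = [], itertools.count()

def enqueue(ip, request):
    recent[ip] += 1
    # Heavier recent usage -> higher number -> served later.
    heapq.heappush(queue, (recent[ip], next(tiebreak), request))

def next_request():
    # Workers always pop the lightest user's request first.
    return heapq.heappop(queue)[2] if queue else None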

End-of-the-line captcha

Third, while you don't want to bother with captchas, a captcha at the very end of the process, right before the transaction is completed, may not be a bad idea. At that point, people have committed to the sale, and are likely to go through with it even with the mild added annoyance. It prevents bots from completing the sale, which means that at a minimum all they can do is hammer your site to try to alert a human about the sale as quickly as possible. That doesn't solve the problem, but it does mean that the humans have a far, far better chance of obtaining sales than the bots do currently. It's not a solution, but it's an improvement.

A combination of the above

Implement basic, generous throttling to stop the most abusive of bots, while taking into account the potential for multiple legitimate users behind a single corporate IP. The cutoff number would be very high - you cited bots hitting your site 10x/sec, which is 36,000 requests/hour - obviously far above any legitimate usage, even for the largest corporate networks or shared IPs.

Implement the load balancing queue so that you're penalized for taking up more than your share of server connections and bandwidth. This penalizes people in the shared corporate pools, but it doesn't prevent them from using the site, and their violation should be far less terrible than your botters', so their penalization should be less severe.

Finally, if you have exceeded some threshold for requests-per-hour (which may be far, far, far lower than the "automatically drop the connection" cutoff), then require that the user validate with a captcha.

That way, the users who are legitimately using the site and only make 84 requests per hour, even when they're mega-excited, don't notice any change in the site at all. However, Joe Botter finds himself stuck with a dilemma. He can either:

  • Blow out his request quota with his current behavior and not be able to access the site at all, or
  • Request just enough to not blow the request quota, which gives him realtime information at lower traffic levels, but causes him to have massive delays between requests during high-traffic times, which severely compromises his ability to complete a sale before inventory is exhausted, or
  • Request more than the average user and end up getting stuck behind a captcha, or
  • Request no more than the average user, and thus have no advantage over the average user.

Only the abusive users suffer degradation of service, or an increase in complexity. Legitimate users won't notice a single change, except that they have an easier time buying their bags of crap.

Addendum

Throttle requests for unregistered users at rates far below registered users. That way, a bot owner would have to be running a bot via an authenticated account to get past what should be a relatively restrictive throttling rate.

The inventive botters will then register multiple user IDs and use those to achieve their desired query rate; you can combat that by treating any IDs that show up from the same IP in a given period as the same ID, subject to shared throttling.

That leaves the botter with no recourse but to run a network of bots, with one bot per IP, and a registered Woot account per bot. This is, unfortunately, effectively indistinguishable from a large number of unassociated legitimate users.

You could use this strategy in conjunction with one or more of the above, with the goal of providing the best service to registered users who don't engage in abusive usage patterns, while progressively penalizing other users, both registered and unregistered, according to their status (anon or registered) and the level of abuse as determined by your traffic metrics.

Solution 83 - Scripting

My first thought: you say the bots are scraping your web page, which suggests they are only picking up the HTML content. So your order screen could verify (from the HTTP logs) that an offer-related graphic was actually loaded by the client before the order was placed; a client that never fetched the image is almost certainly a bot.
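A sketch of that check, assuming you can serve the offer graphic uncached and feed its hits back per session (illustrative names throughout):

beacon_hits = set()   # session ids that actually loaded the offer graphic

def on_offer_image_request(session_id):
    beacon_hits.add(session_id)

def looks_like_bot(session_id):
    # An HTML-only scraper never requests the image.
    return session_id not in beacon_hits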

Solution 84 - Scripting

Develop a front page component and shopping cart that do not run natively in the browser. If you use something like Flex/Flash or Silverlight, it is much more difficult to scrape, and you have full control over the server communication, so you can shield the content completely from scripters.

Solution 85 - Scripting

This only needs to be a problem if the bot users are paying with invalid credit cards or something. So how about a non-technical solution?

Treat the bot users as normal users as long as their payments are valid and make sure you have enough in stock to satisfy the total demand.

Result: more sales. You're in business to make money, right?

Solution 86 - Scripting

To guarantee selling items only to non-scripted humans, could you detect inhumanly quick responses between the item being displayed on the front page and an order being made? This turns the delay tactic on its head: instead of handicapping everyone artificially with a 0.5-second delay, allow requests as fast as possible and smack the bots that are clearly superhuman. :)

There is some physical limit to how fast a user can click and make a decision, and by detecting it after all the requests have gone through (as opposed to purposely slowing down all interactions), you don't affect the performance of non-scripted humans.

If only using CAPTCHAs some of the time is acceptable, you could raise the threshold to fast-human (as opposed to superhuman) speeds and require a confirmation CAPTCHA if someone clicks really fast, akin to how some sites require CAPTCHA confirmation if someone posts multiple posts quickly.
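A sketch of that two-threshold idea, with illustrative cutoffs; both numbers would need tuning against real click data:

import time

sale_started_at = None
SUPERHUMAN = 0.5    # seconds; faster than any human could react
FAST_HUMAN = 2.0    # fast but plausible; gets a CAPTCHA instead

def on_sale_start():
    global sale_started_at
    sale_started_at = time.time()

def screen_order():
    elapsed = time.time() - sale_started_at
    if elapsed < SUPERHUMAN:
        return "reject"    # clearly a script
    if elapsed < FAST_HUMAN:
        return "captcha"   # confirm a very quick human
    return "ok"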

Sadly I don't know of any good ways to stop screen scrapers of your product listings :(

Solution 87 - Scripting

I'm just wondering if there might be a simple solution to this.

I assume that the message indicating the crap sale is posted in text and this is the bit of information the scrapers look for.

What if you made the announcement using an image instead? Doing so might pose some design problems but they could be overcome and possibly serve as the impetus for some ingenious creativity.

Issue #1
There would have to be some design space dedicated to an image. (Want to be really tricky? Rotate a local ad through this slot. Of course the image's name would need to be static to avoid giving scrapers a scent. That's one slot that would never have to worry about ad-blindness...)

Issue #2
RSS. I'm not sure if everyone can view images in their feed readers. If enough of your users can, then you could start sending a daily feed update consisting of an image. You could send whatever miscellaneous stuff you wanted on most days and then switch it for your crap sale alert as desired.

I don't know... would they just program their bots to hit your site every time a feed item went out?

Other issues? Probably a lot. Maybe this will help with some brainstorming, though.

Take care,
Brian

Solution 88 - Scripting

Here are some valid assumptions for you to make:

  • Any automated solution can and will be broken.
  • Making the site completely require human input (e.g. CAPTCHA) will greatly increase the difficulty of logging in, checking out, etc.
  • You have a limited number of Bandoliers of Cabbage to sell.
  • You can track users by session via a client-side cookie.
  • You aren't dealing with extremely hardcore criminals here; these are simply technical people who are bending, but not breaking, the law. Successful orders via bots will go to the person's home, and likely not some third-party mail drop.

The solution isn't a technical one. It's a policy one.

  • Log all client session ids on your webserver.
  • Enact a "limited bots" policy; say, one screen scrape every X seconds, to give people with regular browsers the ability to hit refresh. Any user found to be going over this limit doesn't win the woot.
  • Follow this up by sending known bot owners a bunch of Leakfrogs.

Solution 89 - Scripting

Here is what I'd do:

  1. Require all bidders for bag of crap sales to register with the site.
  2. When you want to start a sale, post "BOC sale starting soon, check your email to see if you are eligible" on your main page.
  3. When the sale starts, send out invitations to a random selection of the registered players, with a URL unique to that particular sale (sketched below).
  4. Ensure the URL used is different for each sales event.
  5. Tweak the random selection algorithm to reduce eligibility for frequent winners, based on the credit card used for purchase, PayPal account, or shipping address.

This thwarts the bots, as your main page only shows the pending BOC event. The bots will not have access to the URL without receiving it in email, and they have no guarantee of receiving it at all.

If you are concerned about sales impact, you could also incentivize participation by giving away one or two BOCs with each sale. If you don't see enough uptake on an offer in a given time interval, you automatically mail additional registered users, increasing the participant pool for each offer.

Voilà. Level playing field, without tons of heuristics and web traffic analysis. The system can still be gamed by people setting up huge numbers of email accounts, but tweaking the participant selection criteria by credit card number, PayPal account, and shipping address mitigates this.
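A sketch of steps 3 and 4 above, using unguessable per-user tokens; send_email and the URL are stand-ins, not a real endpoint:

import random, secrets

def send_email(user, url):
    print(f"mail {user}: {url}")   # stand-in for a real mailer

def invite(registered_users, sale_id, n_invites):
    invites = {}
    for user in random.sample(registered_users, n_invites):
        token = secrets.token_urlsafe(16)   # unique, unguessable per user
        invites[token] = (user, sale_id)
        send_email(user, f"https://example.com/boc/{sale_id}/{token}")
    return invites   # store server-side; validate the token on purchase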

Solution 90 - Scripting

What about the NoBot Control from the ASP.net AJAX control toolkit?

It does some automated JavaScript request and timing tricks to prevent bots from accessing the site, with no user interaction required.

Sorry if this doesn't meet some requirement; I'll just have to call tl;dr. >D

Solution 91 - Scripting

Turn certain parts of the page into images so the bots can't understand them.

For example create small images of the integers 0-9, the dollar sign, and the decimal point. Cache the images on the client's computer when the page loads... then display the price using images chosen via code running server-side. Most human users won't notice the difference and the bots won't know the prices of any items.
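A sketch of the server-side piece; the filenames are illustrative:

# Map each price character to a small cached image.
IMG = {"0": "d0", "1": "d1", "2": "d2", "3": "d3", "4": "d4",
       "5": "d5", "6": "d6", "7": "d7", "8": "d8", "9": "d9",
       "$": "dollar", ".": "point"}

def price_html(price):
    # "$12.99" -> six <img> tags; no machine-readable digits in the HTML
    return "".join(f'<img src="/img/digits/{IMG[c]}.png" alt="">'
                   for c in price)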

Solution 92 - Scripting

My Opinion as a longtime WOOTer

I would be happy to have a CAPTCHA on ordering, turned on only for the BOC. I think most wooters would agree. Plus, 99.9% of the time you don't even get to the order screen because it sells out so fast, so hardly anybody would even know!!

If you make the CAPTCHA a really hard math problem, I'll be able to finally explain to my mom the practical benefit of so many years of studying math.

Solution 93 - Scripting

I don't see why IP address filtering HAS to be prohibitively expensive. With IIS you can build an ISAPI filter to do this in native code; I am sure Apache has similar interfaces. Using the client's IP address, you can write a simple rate limiter for HTTP requests that doesn't depend on a banned list or other such nonsense.

Solution 94 - Scripting

  1. Tarpit. Limiting page views to one per second won't bother human users.
  2. Links via JavaScript. Simple bots don't dig that. As for usability, statistics show that less than 1% of users have JS disabled.
  2a. Hard-core version of the above: links in Flash.
  3. Parameters stored in the session rather than in the query string; most bots are stateless (see the sketch below).
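A minimal sketch of point 3 using Flask (illustrative, not Woot's stack); the token lives in the signed session cookie, so a stateless bot that replays bare URLs never carries it:

from flask import Flask, session, abort

app = Flask(__name__)
app.secret_key = "change-me"   # required for session signing

@app.route("/")
def front_page():
    session["sale_token"] = "abc123"   # set while browsing normally
    return "today's woot..."

@app.route("/buy")
def buy():
    if "sale_token" not in session:
        abort(403)   # no session state: likely a stateless bot
    return "order form..."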

Solution 95 - Scripting

Never thought I'd recommend Flash for anything, but what about Flash? Let your server send down asynchronous, encrypted content to the Flash file signaling whether it's deal time or not. As long as the response is the same size, deal or no deal, the bot can't tell which it is.
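A sketch of the equal-size trick: pad the plaintext to a fixed length before encrypting, so both responses are byte-identical in length (uses the Python cryptography package; key handling is glossed over):

import json
from cryptography.fernet import Fernet

KEY = Fernet.generate_key()   # in practice, shared with the client
PADDED_LEN = 256              # fixed plaintext size, illustrative

def encode(deal_live):
    payload = json.dumps({"deal": deal_live}).encode()
    payload += b" " * (PADDED_LEN - len(payload))   # pad to fixed size
    return Fernet(KEY).encrypt(payload)

# len(encode(True)) == len(encode(False)), so response size leaks nothing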

At a more general level, you need to focus on the resources a human plus a browser have that a scripted bot doesn't and take advantage of things that are easy for humans/browsers and hard for bots. Captcha is obviously a simplistic attempt at doing this, but doesn't suit your site as you say. Flash would weed out a ton of bots, leaving only the (slower) ones that drive a real browser. The solution could be much simpler than captcha if it just requires the user to click in the right spot.

Take advantage of humans' massively parallel image processing power!

Solution 96 - Scripting

Make scanning the site expensive.

There is no way I know of that can keep a bot out of your site entirely. I even know of a service where humans scan sites for you. How would you handle that?

The worst thing for bots is when a site changes. After a while it gets too expensive or too boring to keep the bot running. There could be updates on your site that look like a new product but actually are not. If you update irregularly and unpredictably, things get really hard for the bot.

Banning IPs can be a countermeasure, as long as it is a known IP. It forces the offender onto a proxy, and the proxies I know of work well but slow you down a lot.

Solution 97 - Scripting

My thoughts (I haven't checked all the others, so I don't know if it's novel)

Dealing with swarming:

  1. Convert the front-page matter for each day's item into a Flash/Flex object.
  • Yes, some people will complain, but we're looking for the common case here, not the ideal.
  • You should also randomize the names of your Flash objects, so they don't follow any predictable pattern.
  2. Using Akamai or another CDN, deploy this Flash object to the outside world in advance. Akamai produces what appear to be random URLs, which makes them hard to predict.
  3. When it is time for a new sale, you just have to change your URL locally to refer to the appropriate object at Akamai, and people will fetch the Flash object from them to discover whether the deal is a BoC or not.

End of the day: you now have Akamai handling your swarms of midnight traffic.

Dealing with auto-buy

  1. Each of the Flash objects you create can have lots and lots of content hidden inside - images, links, arbitrary IDs, including 'bag of crap' in a thousand places. You should be able to obfuscate the Flash as well.
  2. When the Flash object "goes live", people will start to attack it. But there are so many false positives that a simple string scan is useless - they'll have to simulate running the Flash locally.
  3. But the Flash doesn't write text. It draws lines and shapes: shapes in different colors, all connected to timers that make them appear and disappear at different times.
  • If you've seen The Colbert Report, you know how the intro has hundreds of words describing Colbert. Imagine something like that for your intro, which will always include "Bag O Crap".
  • Now, imagine that the intro takes an arbitrary amount of time - sometimes a few seconds, sometimes as long as a minute or more (make it funny).
  • Meanwhile, "Bag O Crap" is constantly showing up, but again, clearly as part of the intro.
  • Finally, the actual deal of the day is revealed, with an active 'shimmer' effect that makes it difficult for any single snapshot of the canvas to reveal the actual product name. This floats above an animated background that still says "Bag O Crap" and is constantly in motion.
  • Again, all of this is handled with lines and shapes, not with text strings.

End result - your hacker is forced to take lots of image snapshots of the deal, figure out how to separate all the false positives and identify the actual deal. Meanwhile, humans just look at it, and between eye fatigue and our ability to fill in gaps in the text, we can read the deal as is.

This won't work forever, but it would work for a while.

Another idea is to simply restrict people from buying BoCs unless they've bought something before with that account, and to never let them buy a BoC again.

Solution 98 - Scripting

  1. Identify bots via IP or a suite of other mechanisms.

  2. Always serve those identified as bots the normal front page.

Real people falsely identified as bots will not get the specials, but they won't notice anyway.

Bot owners won't realize you've identified them, so they will stop adapting their scripts.

Solution 99 - Scripting

My solution is a combination of marketing changes and technology changes.

Currently the technical side of selling bag of crap promotions is handled as a normal Woot sale. The sale starts, people race to buy, all items are sold. The same statistical charts used for daily sales are used for bag of crap sales.

There are several marketing goals involved:

  • Get customers to visit the site once every day (impulse purchasing). The possibility of seeing a bag of crap sale is the reason/reward.
  • A network/viral/gossipy effect: when a customer sees a bag of crap sale is on, they will IM/email/telephone their friends.
  • There is also what I'd call general "good will". Woot is a really cool place because it occasionally rewards its customers with amazing sales (a bag of crap that included a flat panel TV)... AND it's done in a fair first-come, first-served manner.

The first two seem to be the most important. The sheer number of visitors has an effect on how fast normal deals sell (or sell out). New customers have traditionally been attracted pretty much by word of mouth, and having customers send their friends to woot.com is a win.

So... my solution is to change the promotion delivery into more of a lottery.

Occasionally users can do something fun to see if they are eligible for a bag of crap. The something fun could be a silly Flash game along the lines of "punch the monkey" or the Orbitz mini-putt, baseball, or hockey games. The goal here is a game that a bot can't script, so some considerable care will be needed. The goal is also not to award bags of crap only to game winners... but to all game players.

The technical core of the game is that at the end of the game a request is made to a server that runs an "instant lottery" to determine if the user has won a bag of crap sale opportunity. The server request needs to include something calculated by the game itself (roughly speaking, "hash cash": a complex, CPU-cycle-consuming calculation, and hopefully one that is difficult to reproduce). This prevents a bot from repeatedly entering the lottery just by querying the lottery server/service.
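A hashcash-style sketch of that CPU-consuming calculation: the client grinds for a nonce, the server verifies in one hash. The difficulty value is illustrative:

import hashlib, itertools

DIFFICULTY = 20   # leading zero bits; tune for a few seconds of client CPU

def solve(challenge):
    # Client side: expensive search for a qualifying nonce.
    for nonce in itertools.count():
        h = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
        if int.from_bytes(h, "big") >> (256 - DIFFICULTY) == 0:
            return nonce

def verify(challenge, nonce):
    # Server side: one cheap hash.
    h = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
    return int.from_bytes(h, "big") >> (256 - DIFFICULTY) == 0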

The game itself can change over time. You can do special event games for Halloween, Christmas, Valentine's Day, Easter, etc. There's lots of room for fun marketing ideas that can match Woot's "wootiness".

If the user wins, they can purchase N bags of crap (in a time-limited window)... but they can also send N friends a time-limited invitation to purchase a bag of crap (good for 24 hours). This provides a super strong network effect... customers will definitely tell their friends. Or you could do it as "buy 1, give 1": let customers buy up to a total of N but force every second one to be shipped to a friend. The key here is to make the network/gossip effect a full-fledged part... help the customer tell the world about the wonderfulness of Woot.

The promotional material around the bag of crap sales concept will also need to be revamped. The graphs of how quickly a bag of crap sold out are no longer relevant; show something along the lines of how frequently through the month people had the opportunity to purchase, or how many people told their friends. The materials should subtly emphasize the point that a daily Woot visit is a good idea.

You can also promote the heck out of why bag of crap sales are changing. Especially that you hired the best bag of crap consultants available for free.

Solution 100 - Scripting

Honestly, I think your best solution is to make items during a Woot-Off only be visible to logged in users, and limit each logged-in user to one home page refresh every 500ms or so. (Or possibly make only a picture of the item be visible to unauthenticated users during a Woot-Off, and make sure you don't always use the same picture for Random Crap.) I think Woot users would be willing to accept this if you sell it as a measure to help them get their Bowls of Creaminess, and you can also point out that it'll help them check out quicker. Anything else--even using captchas--is subject to your typical arms race.

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

| Content Type | Original Author | Original Content on Stackoverflow |
|---|---|---|
| Question | Dave Rutledge | View Question on Stackoverflow |
| Solution 1 - Scripting | lc. | View Answer on Stackoverflow |
| Solution 2 - Scripting | Christopher Mahan | View Answer on Stackoverflow |
| Solution 3 - Scripting | abelenky | View Answer on Stackoverflow |
| Solution 4 - Scripting | BradC | View Answer on Stackoverflow |
| Solution 5 - Scripting | Steven A. Lowe | View Answer on Stackoverflow |
| Solution 6 - Scripting | Zac Thompson | View Answer on Stackoverflow |
| Solution 7 - Scripting | Ovid | View Answer on Stackoverflow |
| Solution 8 - Scripting | Chris Upchurch | View Answer on Stackoverflow |
| Solution 9 - Scripting | Adam Davis | View Answer on Stackoverflow |
| Solution 10 - Scripting | Robert Venables | View Answer on Stackoverflow |
| Solution 11 - Scripting | lc. | View Answer on Stackoverflow |
| Solution 12 - Scripting | Peter Morris | View Answer on Stackoverflow |
| Solution 13 - Scripting | ozy | View Answer on Stackoverflow |
| Solution 14 - Scripting | Paul Dixon | View Answer on Stackoverflow |
| Solution 15 - Scripting | Jens Roland | View Answer on Stackoverflow |
| Solution 16 - Scripting | falstro | View Answer on Stackoverflow |
| Solution 17 - Scripting | Joe Phillips | View Answer on Stackoverflow |
| Solution 18 - Scripting | Toybuilder | View Answer on Stackoverflow |
| Solution 19 - Scripting | Denis Hennessy | View Answer on Stackoverflow |
| Solution 20 - Scripting | Dave Sherohman | View Answer on Stackoverflow |
| Solution 21 - Scripting | Shachar | View Answer on Stackoverflow |
| Solution 22 - Scripting | jwanagel | View Answer on Stackoverflow |
| Solution 23 - Scripting | Brian | View Answer on Stackoverflow |
| Solution 24 - Scripting | Oli | View Answer on Stackoverflow |
| Solution 25 - Scripting | Shawn Miller | View Answer on Stackoverflow |
| Solution 26 - Scripting | 1800 INFORMATION | View Answer on Stackoverflow |
| Solution 27 - Scripting | Chris Latta | View Answer on Stackoverflow |
| Solution 28 - Scripting | Asaf R | View Answer on Stackoverflow |
| Solution 29 - Scripting | Matt Boehm | View Answer on Stackoverflow |
| Solution 30 - Scripting | DFectuoso | View Answer on Stackoverflow |
| Solution 31 - Scripting | Bret Walker | View Answer on Stackoverflow |
| Solution 32 - Scripting | Cebjyre | View Answer on Stackoverflow |
| Solution 33 - Scripting | Tomáš Kafka | View Answer on Stackoverflow |
| Solution 34 - Scripting | StingyJack | View Answer on Stackoverflow |
| Solution 35 - Scripting | Diego Sevilla | View Answer on Stackoverflow |
| Solution 36 - Scripting | dfasdljkhfaskldjhfasklhf | View Answer on Stackoverflow |
| Solution 37 - Scripting | Eric Ries | View Answer on Stackoverflow |
| Solution 38 - Scripting | Morendil | View Answer on Stackoverflow |
| Solution 39 - Scripting | user6288 | View Answer on Stackoverflow |
| Solution 40 - Scripting | Aistina | View Answer on Stackoverflow |
| Solution 41 - Scripting | Pat | View Answer on Stackoverflow |
| Solution 42 - Scripting | Charlie | View Answer on Stackoverflow |
| Solution 43 - Scripting | Richard Harrison | View Answer on Stackoverflow |
| Solution 44 - Scripting | EJC | View Answer on Stackoverflow |
| Solution 45 - Scripting | JGM | View Answer on Stackoverflow |
| Solution 46 - Scripting | Aelver | View Answer on Stackoverflow |
| Solution 47 - Scripting | Mitsurati | View Answer on Stackoverflow |
| Solution 48 - Scripting | bhsimon | View Answer on Stackoverflow |
| Solution 49 - Scripting | Phil | View Answer on Stackoverflow |
| Solution 50 - Scripting | DaveC | View Answer on Stackoverflow |
| Solution 51 - Scripting | Friedrich | View Answer on Stackoverflow |
| Solution 52 - Scripting | Meff | View Answer on Stackoverflow |
| Solution 53 - Scripting | jwendl | View Answer on Stackoverflow |
| Solution 54 - Scripting | twk | View Answer on Stackoverflow |
| Solution 55 - Scripting | jasonkarns | View Answer on Stackoverflow |
| Solution 56 - Scripting | GregD | View Answer on Stackoverflow |
| Solution 57 - Scripting | Stephan Eggermont | View Answer on Stackoverflow |
| Solution 58 - Scripting | dkretz | View Answer on Stackoverflow |
| Solution 59 - Scripting | EMP | View Answer on Stackoverflow |
| Solution 60 - Scripting | user42092 | View Answer on Stackoverflow |
| Solution 61 - Scripting | Andy Dent | View Answer on Stackoverflow |
| Solution 62 - Scripting | levand | View Answer on Stackoverflow |
| Solution 63 - Scripting | Spencer Ruport | View Answer on Stackoverflow |
| Solution 64 - Scripting | idophir | View Answer on Stackoverflow |
| Solution 65 - Scripting | Ron Savage | View Answer on Stackoverflow |
| Solution 66 - Scripting | Wairapeti | View Answer on Stackoverflow |
| Solution 67 - Scripting | Craig Gidney | View Answer on Stackoverflow |
| Solution 68 - Scripting | SingleNegationElimination | View Answer on Stackoverflow |
| Solution 69 - Scripting | Johnny Graettinger | View Answer on Stackoverflow |
| Solution 70 - Scripting | Philluminati | View Answer on Stackoverflow |
| Solution 71 - Scripting | Jason Christa | View Answer on Stackoverflow |
| Solution 72 - Scripting | Andrew Harry | View Answer on Stackoverflow |
| Solution 73 - Scripting | Laurie Cheers | View Answer on Stackoverflow |
| Solution 74 - Scripting | Seibar | View Answer on Stackoverflow |
| Solution 75 - Scripting | Chui Tey | View Answer on Stackoverflow |
| Solution 76 - Scripting | David Durham | View Answer on Stackoverflow |
| Solution 77 - Scripting | Konrad | View Answer on Stackoverflow |
| Solution 78 - Scripting | Dogwelder | View Answer on Stackoverflow |
| Solution 79 - Scripting | Robert | View Answer on Stackoverflow |
| Solution 80 - Scripting | v4lkyrius | View Answer on Stackoverflow |
| Solution 81 - Scripting | Dave | View Answer on Stackoverflow |
| Solution 82 - Scripting | cheald | View Answer on Stackoverflow |
| Solution 83 - Scripting | bot | View Answer on Stackoverflow |
| Solution 84 - Scripting | cdonner | View Answer on Stackoverflow |
| Solution 85 - Scripting | Seun Osewa | View Answer on Stackoverflow |
| Solution 86 - Scripting | Ryan | View Answer on Stackoverflow |
| Solution 87 - Scripting | | View Answer on Stackoverflow |
| Solution 88 - Scripting | Mark | View Answer on Stackoverflow |
| Solution 89 - Scripting | Simon | View Answer on Stackoverflow |
| Solution 90 - Scripting | TJB | View Answer on Stackoverflow |
| Solution 91 - Scripting | MrDatabase | View Answer on Stackoverflow |
| Solution 92 - Scripting | Mark Harrison | View Answer on Stackoverflow |
| Solution 93 - Scripting | a_mole | View Answer on Stackoverflow |
| Solution 94 - Scripting | vartec | View Answer on Stackoverflow |
| Solution 95 - Scripting | scottynomad | View Answer on Stackoverflow |
| Solution 96 - Scripting | Mathias F | View Answer on Stackoverflow |
| Solution 97 - Scripting | johnbr | View Answer on Stackoverflow |
| Solution 98 - Scripting | Ben Noland | View Answer on Stackoverflow |
| Solution 99 - Scripting | user53794 | View Answer on Stackoverflow |
| Solution 100 - Scripting | Becca Royal-Gordon | View Answer on Stackoverflow |