Think That Employee Harassment Complaint Is Too Stupid To Take Seriously? Just Write Your Check To Me Now.

Last week some writers at Jezebel made a public complaint about its parent, Gawker Media:

For months, an individual or individuals has been using anonymous, untraceable burner accounts to post gifs of violent pornography in the discussion section of stories on Jezebel. The images arrive in a barrage, and the only way to get rid of them from the website is if a staffer individually dismisses the comments and manually bans the commenter. But because IP addresses aren't recorded on burner accounts, literally nothing is stopping this individual or individuals from immediately signing up for another, and posting another wave of violent images (and then bragging about it on 4chan in conversations staffers here have followed, which we're not linking to here because fuck that garbage). This weekend, the user or users have escalated to gory images of bloody injuries emblazoned with the Jezebel logo. It's like playing whack-a-mole with a sociopathic Hydra.

The writers further complained that they had repeatedly informed Gawker Media of the problem, but higher-ups failed or refused to do anything about it. A couple of days later, the writers announced that Gawker Media had responded and was taking steps to deal with trolls barraging them with rape porn.

This complaint was ridiculed in some circles. No, I won't link them. The ridicule seemed to be based on the propositions that (1) it's silly to think that Gawker should be responsible for what some third-party troll is doing to its employees, and (2) it's silly to be upset by that sort of thing.

This is a good example of the phenomenon I like to call "bless your heart for thinking that, but it's not the law, dipshit."

American employers are, in fact, responsible for taking reasonable steps to protect their employees from racial or sexual harassment by third parties. This is the example I use when I train companies on sexual harassment prevention: if the UPS guy is constantly and creepily hitting on your receptionist, you need to do something about it. You may think that it is outrageous that this is the rule. Cool story, bro. That's what the law is, and if you employ people or advise anyone who employs people, you're a fool to ignore it. Here's how the United States Court of Appeals for the Fourth Circuit — hardly a bastion of liberalism — recently summarized it:

Similar to the reasoning we set forth for employer liability for co-worker harassment, “an employer cannot avoid Title VII liability for [third-party] harassment by adopting a ‘see no evil, hear no evil’ strategy.” Ocheltree v. Scollon Prods., Inc., 335 F.3d 325, 334 (4th Cir. 2003) (en banc). Therefore, an employer is liable under Title VII for third parties creating a hostile work environment if the employer knew or should have known of the harassment and failed “to take prompt remedial action reasonably calculated to end the harassment.” Amirmokri v. Baltimore Gas & Elec. Co., 60 F.3d 1126, 1131 (4th Cir. 1995) (quoting Katz v. Dole, 709 F.2d 251, 256 (4th Cir. 1983)) (internal quotation marks omitted) (applying this standard to co-worker harassment).

In that case, the Circuit overturned a trial court judgment for the employer, finding that there was sufficient evidence to go to trial on the employee's complaints that an asshole customer had created a hostile environment and the employer didn't do anything about it:

Applying this standard here, we conclude that a reasonable jury could find that Dal–Tile knew or should have known of the harassment. Here, Freeman presented evidence that Wrenn, her supervisor, knew of all three of the most major incidents: the two “black b* * * * ” comments, and the “f* * *ed up as a n* * * *r's checkbook” comment. Wrenn was present for the first “black b* * * * ” comment, which Freeman complained about to Wrenn afterward. Freeman also complained to Wrenn specifically about the other two comments from Koester almost immediately after they occurred.5 When Freeman complained to Wrenn about the “f* * *ed up as a n* * * *r's checkbook” comment, Wrenn “scoffed and shook her head and put her head back down and continued on with trying to pick the nail polish off of her nails.” J.A. 102. When Freeman complained about the second “black b* * * * ” comment, Wrenn simply rolled her eyes and went on talking to a co-worker. J.A. 112. In addition to these most severe incidents, Wrenn was also present the time Koester passed gas on Freeman's phone and Freeman began crying and had to leave the room.

That supervisor, Wrenn, reacted rather like the critics of the Jezebel writers: "why, exactly, is this an issue we should care about?" That attitude proved expensive for the defendant company in this case.

Or maybe you think that trolls constantly posting rape porn isn't severe or pervasive enough to create a hostile working environment. No, thanks, I don't think I'll borrow your laptop. Everyone is entitled to their own opinion, but nobody is entitled to the law being what they think it is. Minimal exposure to pornography isn't severe or pervasive. If someone puts up a centerfold and you complain and it's gone the next day, courts won't find that to be sufficient to create liability. But being constantly exposed to pornography calculated to upset you — meant to troll you? That's probably over the line. "Although most cases involving pornography in the workplace include other elements such as threatening or offensive remarks, see, e.g., Waltman, 875 F.2d at 471, there is no necessary reason why the presence of pornography alone could not create a hostile work environment so long as the pornography was sufficiently severe or pervasive." Adams v. City of Gretna, 2009 WL 1668374 (E.D. La. June 12, 2009).

Let's put it this way: Gawker Media made the wrong choice when they ignored complaints, and the right choice when they started taking steps reasonably calculated to address the complaints. I'm not certain that the writers would win a lawsuit if Gawker had continued to put its head in the sand, but if I had to choose the stronger case, I'd choose the writers.

Preventing harassment is, for whatever reason, a subject that upsets people. Go ahead, be upset. Say it's ridiculous! But part of my job is training companies to minimize liability risks, and I'm here to tell you: if you don't take it very seriously as an employer, you might as well start writing checks to litigators right now.

Last 5 posts by Ken White


  1. Barfy says

    Let me see if I understand this: Someone looked at the comments on a Gawker site, found them offensive, and was surprised?

  2. Mike says

    I think it's certainly an interesting iteration of the issue that provides opportunity to really think about what the various factors entail. Would Gawker be right if they argued that giving the employees the power to delete offending posts is a sufficient remedy, and that overhauling the system entirely to try to address such an issue is not required because it is not reasonable? The issue of a guy disguising his IP address to get around bans is a pervasive issue for many websites. But the ability for visitors to come to your site, get through a non-cumbersome registration process, and participate is critically important to a lot of websites. It costs a lot to run a system where posts by new visitors require moderation or are otherwise limited (both in manpower to review pending posts and lost consumer interest when new visitors can't jump immediately into the conversation).

    If I own a restaurant and some guy runs in at random times to flash the hostess, is it enough that I install an alarm button on the hostess stand that will alert security to come running? Or do I have to add a further layer of security so that nobody can even get into the restaurant without proving that they're wearing underwear?

  3. Chris says

    To me it doesn't seem like the case you cite is completely on point. The plaintiff in question was a receptionist for a tile company. Dealing with that sort of harassment is clearly outside her job description. On the other hand, dealing with online commentators is clearly part of Jezebel employees' job description. The internet being what it is, a certain percentage of these commentators are going to be creepy assholes. So this seems rather like taking a job at a septic servicing company and complaining about having to deal with shit all day.

  4. Kat says

    In before gigantic blowup.

    (Oooh, I love the edit feature. Been too long since I used this site.)

  5. Z says

    American corporations- and Americans generally- often get upset upon hearing that they have obligations to others and have to do something even without deriving an immediate tangible benefit from their actions.

  6. Kathryn says

    Mike – usually I'd agree with you, but you're missing a piece of the equation (mentioned in the original post on Jezebel, but not by Ken here): the anonymous jerk or jerks weren't disguising their IP addresses to get around security because there was no such security, and — even more importantly — Gawker Media refused to put such security in place. The writers and mods at Jezebel were only asking to allow IP logging and blocking, which Nick Denton and his cronies refused to enable out of respect for anonymous tipsters. You know, the same anonymous tipsters who could easily create an anonymous email address and email the tipline. In your metaphor, this would be like the restaurant owner saying, "I know this jerk is coming in and flashing you every day, but I refuse to put up security cameras that might aid in identifying and reporting him to the police because I want my other customers to keep their anonymity."

    Chris – as with Mike, I somewhat agree with you, but at some point, isn't there an argument to be made for frequency and severity of creepiness? And as with my reply to Mike, the fact that Gawker Media refused to put in the most basic of security measures, measures that would have stemmed the tide of rape GIFs for at least a little while, is worthy of consideration. In your metaphor, this would be like the septic servicing company's CEO saying, "I know you keep having to shovel the same shit from the same broken system every single day, but I'm not going to fix it or allow anyone else to do so, and you'll just have to shovel the exact same pile of shit tomorrow."

  7. says

    Perhaps there are decent arguments to be made that (1) it's not reasonably technologically feasible to stop this, or (2) dealing with rape porn is part of the job of moderating internet comments.

    The best time to explore those arguments would be in a dialogue with the complaining employees, as opposed to ignoring them and waiting to make the arguments after they sue you.

  8. Tristan says

    "the anonymous jerk or jerks weren't disguising their IP addresses to get around security because there was no such security, and — even more importantly — Gawker Media refused to put such security in place."

    Ah, now we've got a case. As described, I assumed the complaint was that Gawker refused to institute a system requiring that they moderate new users, which would at best have foisted the responsibility of seeing all the gore and porn onto a different employee and solved nothing.

  9. Jacob Schmidt says

    On the other hand, dealing with online commentators is clearly part of Jezebel employees' job description. The internet being what it is, a certain percentage of these commentators are going to be creepy assholes. So this seems rather like taking a job at a septic servicing company and complaining about having to deal with shit all day.

    A certain percentage of any large group are going to be creepy assholes, UPS employees included. Dealing with deliveries is clearly part of the receptionist's job description.

    Turns out, just because an employee's job is to deal with people in some capacity doesn't mean an employer's responsibilities toward their employees suddenly vanish.

    (And really, it's just silly to try and sneak in "dealing with rape porn" under "dealing with commentators.")


    Perhaps there are decent arguments to be made that (1) it's not reasonably technologically feasible to stop this, or (2) dealing with rape porn is part of the job of moderating internet comments.

    I do, however, accept that "dealing with rape porn" could fall under "moderation" in some capacity.

  10. Tristan says

    Perhaps there are decent arguments to be made that (1) it's not reasonably technologically feasible to stop this

    The challenge with defending it as technologically infeasible is that, while it may well be technologically infeasible, they can't actually know that an IP block won't help until they try it.

  11. Mike says

    Kathryn – Thanks, I misread that part. Although I still think my point stands, as (1) I don't believe that the analysis changes significantly if Gawker adds IP logging/blocking and then the troll just starts masking his/her IP address; and (2) if the availability of anonymity is a feature that Gawker believes adds value for customers, then taking that away is a cost just like adding pending moderation.

    Ken – rape porn:employee::dialogue with employee:higher-up

  12. Z says

    Anonymity is a key journalistic tool but there are ways to safeguard that while also keeping out rape comments; perhaps one could have a comment box, email address or, to get analog, a phone line for tipsters. However I have a hard time with the argument that rape comments are the price you pay for getting juicy stories. That's a bit like saying that if you want to be a public defender you gotta expect that some clients will try to shoot you in the courthouse.

  13. Kathryn says

    Mike – I'm no lawyer, and every case is different, but I'd be surprised if taking any steps at all (like IP logging, even if it's ultimately ineffective) wouldn't make Gawker a better hypothetical defendant than taking no steps at all. Just my two cents. And for the record, I think IP logging would ultimately be ineffective, but I can't for the life of me understand why they weren't willing to try it for even a few days.

    Z – Oddly enough, they do have an email address AND a phone line already; apparently their desire to better understand Lindsay Lohan's coke habits requires anonymous comments as well. Go figure.

  14. Z says

    I'm confused- they can't have an anonymous tip line and close monitoring of comments? Seems that with the tip line as an outlet they'd have less of an argument as to why they can't block IP addresses.

  15. Matthew Cline says

    You could also log IPs for comments, then create a separate page for submitting tips, with no IP logging, where only the staff see the tips, and where images in tips aren't automatically loaded when staff view them. This would be better than having tips be emailed, since with email the tipster would have to create a temporary email account.

    Of course, with either that or using email, someone could spam the system with abusive "tips", but it seems likely that whoever is doing this is doing so because the comments are visible to everyone before they get taken down, and that motivation would be taken away if what they did was only visible to the staff.
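
    In rough Python, that separation reduces to something like this (a toy sketch under the design described above; the names are invented and this is obviously not Kinja's actual code):

```python
class TipBox:
    """Staff-only tip box: no IP or other request metadata is kept,
    and tip images are stored as links staff must click to load."""

    def __init__(self):
        self._tips = []   # visible to staff only, never published

    def submit(self, body, image_urls=()):
        # deliberately keep nothing except the tip itself -- no IP,
        # no cookies, no account: full anonymity for tipsters
        self._tips.append((body, list(image_urls)))

    def staff_view(self):
        # text plus click-to-load links; images are never inlined,
        # so a gory "tip" can't ambush the staffer reviewing it
        return [(body, ["[click to load] " + url for url in urls])
                for body, urls in self._tips]
```

    The public comment system can then log IPs freely, since anyone with a legitimate need for anonymity has the tip box instead.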

  16. naught_for_naught says

    "…put her head back down and continued on with trying to pick the nail polish off of her nails."

    Wow. Nothing says, "I'm not taking you seriously," like losing your listener's attention to remnants of three-week-old nail polish.

  17. eddie says

    I once worked for a web hosting company. A substantial number of our customers were pornographers. During orientation, we were advised of that fact and informed that we could expect, during the normal course of business, as a necessary part of our job activities supporting our customers, to be exposed to pornography.

    Was that company simply one giant lawsuit-in-waiting?

    Consider a staffer at a web filtering company whose job is to investigate new websites reported by customers or discovered by searching and then classify them. Their job necessarily requires them to be regularly exposed to pornography. And not merely a glimpse of it and then it's gone; their job actually requires them to investigate each site in sufficient depth so as to render an accurate distinction between sites which might be racy-but-artistic versus those which are patently offensive versus those which might in fact be outright criminal.

    How can that job even exist given what Ken's written above regarding an employer's responsibility to avoid creating a hostile workplace?

    Heck, what about photographers and airbrush artists and layout staff at porn mags? How do their employers dodge the hostile workplace bullet?

  18. ysth says

    I don't buy it. IP address logging/blocking is as effective as most DRM – it only penalizes the good guys.

  19. Aelfric says

    I'm no Ken White, but let me take a couple stabs at questions. eddie–one of the key phrases in hostile workplace jurisprudence is that the harassment must be so pervasive as to "alter the conditions of employment." (e.g., Harris v. Forklift Systems, Inc., 510 U.S. 17 (1993)). Working for a web filter, or a porn mag, etc., would mean that dealing with such would be one of the conditions of your employment, and therefore not actionable. I would argue it might, theoretically, alter the conditions of employment for a Jezebel writer, at least enough to shift the burden to the employer to do something, whether effective or not.

    NickM–Section 230 essentially says that you can't treat a website as the author of third party comments ("No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."). But that's not the issue here; the issue is whether Gawker (or any other employer) is allowing third parties to create a hostile workplace for the employees. Since authorship is effectively irrelevant for a hypothetical hostile workplace claim, the safe harbor would not help Gawker here. Of course, I could be wrong on one or both points. Let me know.

  20. Grifter says

    I'm inclined to be supportive of people who are having a problem with harassment–even more so given some recent problems Mrs. Grifter has been having, which I have been prohibited by her from solving in any of the ways I'd prefer.

    That said, in this particular case, I am *very* curious where the bar would be set. Don't get me wrong–ignoring is definitely *not* the proper course of action, and dialogue (as you already said, Ken) is. But I'm curious–if they said that the anonymity of not logging IPs was important, could they then sort of shift the burden onto the editors to come up with a solution? I mean, if they say "Look, a part of journalism, particularly ours, is encouraging anonymous information and, while yes, there are other ways of keeping things anonymous, we do not want there to be *any* barriers that we can avoid placing", could they say that they're happy to help find a solution (possibly even hiring someone whose specific job is dealing with this sewage), but that they will not do the IP logging, no matter how "simple" a solution it is? Particularly given that, though it changes the working conditions, it only does so because of how the douchenozzles are behaving, not because of anything Gawker itself is doing?

    Again, just because of how these things tend to go: I don't support ignoring a problem. I don't support harassment. I *do* support *not* having to deal with graphic violence rapiness. I'm just curious about how the clash of values would work out here, where there may (conceivably) not *be* a solution, because people are terrible.

    I mean, in the case of the UPS guy, there were LOTS of options available–calling UPS to complain, having someone else deal with him, etc. etc. and so on. In this case, isn't the problem that there is a method of dealing with (expected levels) of abuse, but that things have exceeded that expected level?

    **Edited because my first sentence originally could have been read as though I supported people being harassed, when I was trying to say I supported the PEOPLE being harassed– if that makes sense.

  21. lelnet says

    IP logging (and bans based on those logs) isn't as useful as the Jezebel folks seem to imagine it is, but a credible argument could be made that it was "remedial action reasonably calculated to end the harassment" (whether it was "prompt" is of course a different question). It would, at least, _limit_ the ability of persistent trolls to do their thing. It pains me more than most could imagine, to side with Jezebel writers on any question whatever, but intellectual honesty compels the admission that they do have a colorable point, if the Gawker admins indeed did nothing at all.

    No, what I'd like to see is what happens when the attempt to limit harassment by customers and the UPS guy slams up against contemporary non-discrimination law, considering how "we reserve the right to refuse service" is, in modern America, another one of those "bless your heart" ideas.

  22. Midnight Rambler says

    It sounds like most people commenting here didn't read the article to see what the situation was. This wasn't commenters "disagreeing" with Jezebel writers' views or even posting the standard sexist/misogynist crap we've come to expect on comment boards. It was basically one person who was obsessively spamming hundreds of rape, decapitation, and mutilation gifs all over every comment page. They were able to do it because 1) the comment system allowed photo embedding, and 2) it allowed for burner accounts with no verification, supposedly so people could send in anonymous tips but which allowed this guy to simply set up a new account immediately after one was blocked. Yes, you can get around IP blocks, but it takes some effort; with this system, it didn't take any effort at all. The staffers were basically having to spend all their time removing these pictures (which means having to look at them themselves), cancelling accounts one by one, and the higher-ups at Gawker refused to change the system until they went public with it.

  23. Smiling Dog says

    Why even allow GIF images in comments?

    They're pretty much universally juvenile and/or stupid, if not necessarily offensive.

    How about this Perl-ish pseudo-code to eliminate them:

    if ($postText =~ m/\.gif/i) {
        print "GIFs not allowed – deleting this post, try again.\n";
    }

  24. says

    Bears adding that Gawker didn't do anything about it until it started overflowing onto other Gawker sites' comment sections. If we're to give management the benefit of the doubt, they may not have realized how bad it had gotten until more people started complaining; if we're not to give management the benefit of the doubt, they may not have cared until staff at sites other than Jezebel started complaining.

    Regarding the technical limitations of IP bans: it's largely a question of scale. If you've got everyone on /b/ trolling your comments section, then it's hydra/whack-a-mole/choose your favorite metaphor here and yeah, no number of IP blocks is going to be enough to stop them. But if it's just one guy, well, even a dedicated troll will get tired of trying to find a new proxy every five minutes.

    That and sooner or later he's going to forget to use a proxy in the first place and you're going to get a real IP address. And, while (1) an IP address is not enough to positively identify the person who was using the router and (2) there have been quite a lot of instances of IP addresses being abused in legal actions, those options are at least open to a site with the resources of Gawker.

  25. Dion Starfire says

    The best time to explore those arguments would be in a dialogue with the complaining employees,

    And the second best place is on a similar* yet unaffiliated internet message board. Close enough to understand both sides of the issue yet far enough to remain rational.

    *in the same way that the SCOTUS and a municipal court are similar. (I'll let you decide which is which)

  26. Ryan says

    For those somewhat nitpicking at the practical reality, Ken didn't cover the details in his post (because that's not what his post was about), but you would all benefit from Googling the case for those details.

    Essentially, Jezebel staff were being repeatedly subjected to rape GIFs and other abuse, complained repeatedly to Gawker about the lack of *any* filtering, and Gawker proceeded to do absolutely nothing. They only stopped ignoring the situation when people started contacting advertisers to pull their listings. They took zero measures to protect their staff, despite the flood of legitimate complaints. As a temporary measure, they've now done the simple thing and disabled image posts while they come up with a permanent solution, something they should have done to begin with.

  27. Nicholas Weaver says

    IANAL, but I've tolerated the mandatory every-two-years sexual harassment training (and no, it's not harassment to have to go through the state-mandated harassment training). I am, however, an expert in computer security with important-sounding letters after my name.

    I agree with Ken, 100%: There are very reasonable and modest steps that can be taken: logging and banning IP addresses at signup (which actually puts a fair dent in the low-level trolls: there are only so many proxy servers and Starbucks available), making new accounts unable to post images for X time, and even then at no more than Y rate, etc. Such changes would not impose a substantial burden on Gawker's business or web-site, but would act to limit the ability of these creatures who inhabit 4chan.
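
    A minimal Python sketch of the new-account throttle described above (the 24-hour and 3-per-hour numbers stand in for X and Y and are made up for illustration):

```python
class NewAccountGate:
    """Throttle image posts from young accounts: no images during a
    probation window (X), then at most a few per rolling hour (Y)."""
    PROBATION = 24 * 3600   # X: no image posts in the first 24 hours
    MAX_PER_HOUR = 3        # Y: image-post cap per rolling hour after that

    def __init__(self):
        self.created = {}       # account -> signup timestamp (seconds)
        self.image_posts = {}   # account -> timestamps of image posts

    def register(self, account, now):
        self.created[account] = now
        self.image_posts[account] = []

    def may_post_image(self, account, now):
        if now - self.created[account] < self.PROBATION:
            return False        # account too new to post images at all
        recent = [t for t in self.image_posts[account] if now - t < 3600]
        if len(recent) >= self.MAX_PER_HOUR:
            return False        # over the rolling hourly cap
        self.image_posts[account].append(now)
        return True
```

    A troll who signs up a fresh burner after every ban never escapes the probation window, which is the point.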

    And if asked by a plaintiff's attorney, someone like me would testify as such.

    This really is the heart of the matter: An employer does have a responsibility to protect their employees from pervasive bad behavior on the part of customers and others they interact with as part of their ordinary job duties. If there is a creepy UPS guy who is always harassing the receptionist, the employer has not just a legal but a human obligation to do something about it, especially when something is as simple as "call up UPS, tell them that sleazy driver shouldn't deliver here". The same applies here.

    When, like Gawker, they don't for the longest time (changing policy at all only when phrases like "customer boycott" start being bandied about), and, as a company which probably does nearly everything in email, have a gobsmackingly huge amount of material showing this neglect waiting to come out in discovery, IMO they deserve to get sued.

  28. Nicholas Weaver says

    For those who are going "but, but…": context matters. If you are, say, the webmaster for the local artisanal bondage pornographer (San Francisco has one, you mean your town doesn't?), you're expected as part of your normal duties to deal with this sort of stuff. The employee knows going in what they are going to have to deal with right from the start.

    But the defense of "dealing with rape porn is part of the job of moderating internet comments" is going to be a pretty hard case to make here. Yet it really is the only avenue that would be helpful to Gawker Media, since there are technical controls that would work at shutting down the 4chan-ers.

    Remember, the goal was not to get them seen by just the moderator, but to get them seen by everyone. So typical anti-spam policies (e.g. automatic moderation of images, or just images from new posters) would work.

  29. Fasolt says


    Would that be all American corporations and most or all Americans in general? If so, I disagree with your statement. The facts don't support your statement. I think your statement can fall under other ones like calling the French people "cheese eating surrender monkeys". It appears the French do like their cheese, but I don't think they're any less courageous than other peoples.

    I would assume you're an American who disagrees with capitalism as an economic system, or a citizen of another country with an unfair perception of Americans.

  30. Joel says

    I'm sorry, I don't understand the other side of this issue at all. Even if dealing with offensive comments is the employee's job, isn't the role of an employer to provide the employee with the means to properly do their job? From what I can tell, they weren't saying "we don't want to deal with this" they were saying "we're having trouble getting our job done–please give us better tools."

  31. alexmegami says

    It seems like this could also be solved by hiring someone specifically to deal with comment moderation, where it's explicitly laid out in the hiring contract that this will frequently involve dealing with graphically sexual & violent images and language. No?

  32. Nicholas Weaver says

    Alexmegami: Even easier than hiring someone is to do what Gawker eventually did once public pressure (and a steady troll epidemic) forced their hand: just go back to the community moderation system of the type they used to run, which is very effective at keeping down the trolls.

    Posters with good karma are seen instantly. Posters with no karma are under a "click to reveal", and can gain good karma when those with good karma click enough to promote from the swamp.

    Positive reputation community moderation systems have a long and proven track record of keeping 4chan style trolls out, since it takes a bit of work to get to the point where your trolling would be seen.
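
    The promotion mechanics described above reduce to something like this (a toy Python sketch; the threshold and names are invented, not Kinja's actual system):

```python
class KarmaModeration:
    """'Click to reveal' community moderation: low-karma posters are
    hidden behind a click; approvals from trusted readers promote them."""
    PROMOTE_AT = 5   # invented threshold for promotion out of the swamp

    def __init__(self):
        self.karma = {}   # poster -> approvals received from trusted readers

    def is_visible(self, poster):
        # posters with good karma are seen instantly
        return self.karma.get(poster, 0) >= self.PROMOTE_AT

    def approve(self, poster, approver):
        # only posters who are themselves trusted can vouch for newcomers,
        # so a troll can't promote his own sockpuppets
        if self.is_visible(approver):
            self.karma[poster] = self.karma.get(poster, 0) + 1
```

    Fresh burner accounts start in the swamp by construction, so the shock value of a drive-by gore GIF never reaches the general audience.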

    The big problem is the months of neglect before the Jezebel staff went public. Gawker may have (finally) decided to do the right thing, but that doesn't erase the months of "being notified and doing nothing".

    I'm pretty sure Ken will use this in the next revision of his sexual harassment training as an example of a possible liability bomb…

  33. Nicholas Weaver says

    An even bigger problem on the liability bombshell: they already HAVE click-to-reveal karma-based moderation now working, and had it working within 72 hours of a decision being made to reinstitute it and within 4 days of Jezebel going public!

    So discovery would find months of email documentation of Jezebel staff complaining and management doing nothing, and only when the affected staff went public and the problem spread to other Gawker sites did management do anything about it, and they were able to respond very quickly.

    Doing the right thing now really does help, but daaamn, was not doing the right thing something that could possibly come back to bite em in the ass.

    Even if the Jezebel staff doesn't sue, any other sexual harassment plaintiff in the Gawker empire in the future is going to be able to point to this as part of a pattern of willful neglect on the issue.

  34. Stu says

    “f* * *ed up as a n* * * *r's checkbook”

    Who are these people? You actually say these things to another person in your daily course of business?
    I am still in awe that someone would say something like this to another person, and not expect to be punched in the mouth.

  35. Vince Clortho says

    I've always enjoyed explaining this concept to client employers with the Folkerson v. Circus Circus case (107 F.3d 754, if anyone's interested): you must prevent third party customers from molesting your mimes.

  36. manybellsdown says

    Minor detail: the troll spammers were actually not using burner accounts; they were making throwaway Twitter accounts and registering that way. So just disabling the creation of anonymous burner accounts wasn't sufficient.

  37. mKr says

    @Vince Clortho: As opposed to 68 F.3d 480, which apparently involves the same parties but in a sex discrimination suit two years prior to the 107 case you refer to. Apparently, the circuit court reversed the original summary judgment for Circus Circus:

    Folkerson's conduct appears proportionate to the degree of threat this man posed. Therefore, we are unable to affirm the award of summary judgment to Circus Circus on the ground that Folkerson did not engage in protected oppositional conduct.

    The later (107) judgment affirmed summary judgment for Circus Circus on the retaliation claim on a technicality: the patron assaulting her was not an action supported by the employer, so "she wasn't engaged in protected activity/opposition" that the employer was responsible for. The fact that there was a causal link between her activity and her termination ("adverse employment decision") was deemed irrelevant.

    I'm not sure, Vince, that the 107 decision really follows your conclusion of "you must prevent third party customers from molesting your mimes" since the plaintiff lost this one.

    Folkerson failed to establish that she was opposing an unlawful employment practice of Circus Circus, summary judgment was properly granted

  38. princessartemis says

    I wonder why they allowed GIF embedding for so long. Turning that off would have helped some.

    @Stu, the people who say such things are likely rather old. At least, any person I've come across who would was pretty old. I had an uncle who used to say crap like that, though to be very honest, I don't think he thought of it as insulting. He was in his late 70's and had held onto a much older mindset than many do now. He was also quite proudly a Korean War vet who wore his medals, was rather rickety on his cane and always looked like he was in tremendous pain, which he was, and looked much older than his years. I would have been shocked if anyone ever actually punched him in that state. He was an asshole, but in some ways, a judgement-proof asshole.

  39. Dan Weber says

    I don't like Jezebel. But when I saw the first sentence, I still took their side.

    Jezebel represents the demographic I call "highly annoyed white woman." For various reasons I find their worldview silly and counterproductive to a well functioning society. But Gawker is the one trying to tap into that for economic gain. Even if I were to accept that all of Jezebel's arguments were based in bad faith, Gawker is the one who knew that going in.

    I've posted a comment on another topic here a few years ago about radio companies that love their shock jocks until the shock jock says something that gets Too Many People offended, at which point they clutch their pearls and say "this is not in line with our corporate values." Nonsense. It was perfectly in line with your values when it was making you money. Don't raise a scorpion farm and act put out when one of them bites you.

    I think the argument of "how far do we need to go to stop harassment by third parties" is a reasonable theoretical question. Saying "IP bans might not work" a priori is like philosophers debating how many teeth a horse has. Give it a shot and see what happens. Besides, I've always been in the camp (even in the Prenda topics) that IP is highly correlated with identity. Maybe you will ban some IP that was a corporate gateway. Perfect: when the corporation asks to get off the blacklist, tell them at time X on day Y someone posted Z from their IP. I'm hardly a social justice warrior, but if I found out that someone was using my company's network to send rape porn, that would be the easiest instant firing in the world.

  40. Ken in NJ says

    … must not be a pedantic ass….

    … must not be a pedantic ass….


    Don't raise a scorpion farm and act put out when one of them bites you.

    Stings. When one of them STINGS you.

  41. Sinij says

    Let's be clear here: this isn't some knitting website. Jezebel is in the business of manufacturing gender-related outrage; it is not unreasonable to expect that dealing with relevant outrageous material is part of the job responsibilities.

    With that said, how could one limit such actions? TOR and public proxies exist. There is absolutely no way to connect IP to identity no matter what self-proclaimed 'experts' would try to tell you.

  42. BC says

    I pretty much agree with Dan Weber, and beyond that will merely opine that Gawker and Jezebel truly deserve one another.

  43. Nicholas Weaver says

    TOR and public proxies exist

    Any comment system worth its salt just blanket-blocks posts/account signups from Tor and public proxies when faced with a spam problem.

  44. Dan Weber says

    To follow on that, even if they didn't blanket-ban Tor from the start, the trolls would quickly burn through all the Tor exit nodes and they'd get back on the black list.

    This isn't like trying to stop a plague where you have to cure every single carrier or else it mutates and re-infects everyone. Stopping 90% of the problem is 90% of the problem solved. Even in computer security, many times Good Enough really is Good Enough. (Crypto is a big exception.)
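    [Editor's note: Dan Weber's "burn through the exit nodes" point is easy to sketch. The following is a hypothetical illustration in Python, not any system Gawker actually ran; the class name and threshold are invented.]

    ```python
    # Hypothetical sketch of a blacklist that grows as abuse is flagged:
    # trolls rotating through Tor exit nodes or proxies burn each address
    # onto the list, so the pool of usable addresses shrinks over time.

    class IPBlacklist:
        def __init__(self, ban_threshold=3):
            self.ban_threshold = ban_threshold  # flagged posts before a ban
            self.flag_counts = {}               # ip -> number of flagged posts
            self.banned = set()

        def flag_post(self, ip):
            """Record one flagged (abusive) post from this IP."""
            self.flag_counts[ip] = self.flag_counts.get(ip, 0) + 1
            if self.flag_counts[ip] >= self.ban_threshold:
                self.banned.add(ip)

        def is_banned(self, ip):
            return ip in self.banned

    bl = IPBlacklist(ban_threshold=2)
    bl.flag_post("203.0.113.7")
    bl.flag_post("203.0.113.7")          # second flag crosses the threshold
    print(bl.is_banned("203.0.113.7"))   # True
    print(bl.is_banned("198.51.100.1"))  # False
    ```

    As the comment says, this doesn't have to be perfect: each banned address is one less the troll can reuse, which is the "90% of the problem solved" argument in miniature.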

  45. rmd says

    … the Folkerson v. Circus Circus case (107 F.3d 754, if anyone's interested): you must prevent third party customers from molesting your mimes.

    "If that is the law then…"

    How about whacking them with an olive loaf?

    P.S. Is it weird that I can post that with full confidence that everyone here will get both references?

  46. Municipal D1 says

    There are a ton of cases where a restaurant or club is liable for known sexual harassment of an employee by patrons.

    Here's a relevant quote from the Ninth Circuit – an employer is "liable for sexual harassment on the part of a private individual, … where the employer either ratifies or acquiesces in the harassment by not taking affirmative and/or corrective actions when it knew or should have known of the conduct." Folkerson v Circus Circus Enterprises, 107 F3d 754 (CA 9, 1997). That's just what I found in 2 minutes on Google. It's well established. (Sorry, didn't see it was cited earlier. Yes, it's the mime-groping case.)

    I think any practicing attorney (or true Scotsman, for that matter) would recognize this as open & shut.

  47. sinij says

    I am surprised I have to explain technology here. The only way to 100% block offensive images is to disable the ability to post images. The only other alternative is to moderate them. Either community moderation or admin moderation, and those moderators will be exposed to these images.

    IP bans are useless, because IP does not link to identity. Period. And that's just trying to get rid of a single person, never mind a group. You have proxies, you have TOR, you have wardriving, you have public WiFi everywhere, you have DSL with dynamic IPs, you have bots.

    In a case like Jezebel's, your ratio of IP bans to moderated posts will be nearly 1:1. This is what the Gawker people know, because they run a technology/news site. This is what the Jezebel people don't know and think can be solved by a petition and/or a sexual harassment lawsuit.

    As to what actually happened to Jezebel – it is firmly in the SOCIAL CONSEQUENCES camp. They spent a significant part of their journalistic career rallying up various internet lynch mobs. Well, there is now one that came after them.

  48. sinij says

    Effectively, Jezebel is trying to force Gawker Media to moderate for them under threat of litigation.

  49. says

    But these are the comments. Are Jezebel writers obliged to look at them? What if someone prints these images out and physically mails them as letters to the ombudsman?

  50. George William Herbert says

    Sinij, you have it exactly wrong.

    You're correct, in that all those workarounds allow a full-time dedicated pro troll to get around just about any block mechanism other than an image ban. But you're wrong.

    You don't have to be 100% effective with technical mechanisms. You just have to reduce the flow of misbehavior enough that social mechanisms or manual moderation can (without forming an oppressive working environment) handle what remains.

    If this guy dedicates his life to the harassing activity and works around it anyways, you establish in the terms of service that that's prohibited activity, and force him to create at least a tort and hopefully a criminal action to continue harassing, and then you sue and/or criminally charge him.

    Even if it takes months or years, that's good enough.

  51. Carl says

    Off the subject of law and human decency, it's pretty pathetic that Gawker/Jezebel didn't already have the ability to handle trolls.

  52. NotaLawyer says

    Let's say that I'm a minority female who goes to work at a site that is already notorious for comboxes filled with "rape porn". Do I still have a case?
    What if I only took the job to force the site to clean up?

  53. Nicholas Weaver says

    Sinij, and you are doubly wrong: the positive-reputation-based community moderation (where those with good reputation have posts that are seen, and can promote others), which they have ALREADY put in place in response (and which they used to have but disabled earlier!), has been shown in practice to nearly eliminate these bottom-feeding trolls. It's not 100%, but it is close enough to count as a "reasonable effort" to eliminate the problem, because it works.

    Basically, in order to troll in a reputation system like this, you have to build up a good reputation first. Which takes considerable effort: it's not just a matter of creating an account, but of creating an account and positively contributing to the community first. Yes, you can in theory play games with sybils, but that is a lot of work, and the bottom feeders, both trolls and spammers, quickly go away, since a single troll posting requires far more than just creating a Twitter account, and once you do a troll post, all the good reputation evaporates.
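    [Editor's note: the "moderate low karma" scheme described above can be sketched in a few lines. This is a hypothetical illustration; the threshold, class, and account names are invented, not Kinja's actual implementation.]

    ```python
    # Sketch of karma-gated moderation: comments from accounts below a
    # reputation threshold are held as click-to-reveal "pending" instead of
    # appearing publicly, so a fresh throwaway account can't splash images
    # in front of readers or staff.

    KARMA_THRESHOLD = 10  # invented value for illustration

    class Account:
        def __init__(self, name, karma=0):
            self.name = name
            self.karma = karma

    def comment_visibility(account):
        """Trusted accounts post publicly; new accounts are click-to-reveal."""
        return "visible" if account.karma >= KARMA_THRESHOLD else "pending"

    regular = Account("longtime_reader", karma=42)
    throwaway = Account("fresh_burner", karma=0)
    print(comment_visibility(regular))    # visible
    print(comment_visibility(throwaway))  # pending
    ```

    The design point is the asymmetry: reputation is slow to build and instantly lost, so the cost of one troll post is weeks of good behavior, which prices out drive-by abuse.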

  54. AlphaCentauri says

    I should think that a company as big as Gawker could come up with a custom-designed solution. It's not so much that the employees were exposed to offensive material, because they knew that would happen sometimes. But when it became a barrage, when it became difficult for them to fulfill their commitment to their readers, risking losing their web traffic, the Gawker management didn't have their backs. I can put up with a lot if I feel like my employers understand there is a problem and are sincerely working on a solution. It becomes a hostile work environment when they don't give a shit and treat me as if I'm the one that has a problem.

    @Nicholas Weaver: We've heard in recent news that the digital fingerprint of known child porn photos was used to identify the people sending them. It's likely that if the Jezebel trollers were posting as many highly offensive photos as in this case, they were posting the same photos over and over. How practical is it for the moderators to be able to identify a particular photo as offensive and have the site autoban the user/IP if the same photo is posted in the future? Or even post their submissions in an invisible part of the site to make them available for law enforcement or for generating complaints to the owner of the IP address? Or maybe automatically have that user see a duplicate website where their comments do show up but which no one else can see, so they don't realize their posts are being blocked and don't make any attempt to change their IPs, alter their photos, or create new accounts? It seems like a site like Gawker could justify the expense of coding something like that.

  55. Nicholas Weaver says

    Fingerprinting images would probably work due to reuse, but it's not as effective as "moderate low karma," which is what they put in place already.
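    [Editor's note: the fingerprinting idea AlphaCentauri raises can be sketched with an exact-hash blocklist. This is an illustration only: a plain SHA-256 match catches byte-identical re-uploads, while production systems (e.g. the child-porn matching mentioned above) use perceptual hashes that survive re-encoding and cropping.]

    ```python
    import hashlib

    # Sketch of a known-bad-image blocklist: a moderator flags an image once,
    # and any byte-identical re-upload is rejected automatically thereafter.

    def fingerprint(image_bytes):
        return hashlib.sha256(image_bytes).hexdigest()

    blocked_fingerprints = set()

    def moderator_blocks(image_bytes):
        """Moderator marks this image as offensive; future uploads match it."""
        blocked_fingerprints.add(fingerprint(image_bytes))

    def is_blocked(image_bytes):
        return fingerprint(image_bytes) in blocked_fingerprints

    bad_gif = b"GIF89a...offensive payload..."   # stand-in image bytes
    moderator_blocks(bad_gif)
    print(is_blocked(bad_gif))                   # True: same bytes re-uploaded
    print(is_blocked(b"GIF89a...something else"))  # False: different image
    ```

    Its weakness is also why karma moderation worked better here: a troll who flips one byte of the file defeats an exact hash, whereas building reputation from scratch defeats the troll.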

  56. A Person (Probably?) says

    Isn't this slightly more complex than is being portrayed? According the Jezebel article linked, when this was raised in a staff meeting the answer was "We have better moderating tools in development, and this is an issue with our publication platform. We don't believe the advantages of the changes we could make outweigh their costs."

    That doesn't… really sound like ignoring to me. Unsatisfying, sure, but I guess I'm curious about how far Gawker would need to go to satisfy the complaint. Should they have to change to a different platform that doesn't have this issue?

  57. A Person (Probably?) says

    Wait, hold on now. Did people actually read the article? The argument Gawker made, according to Jezebel isn't "ip blocking won't work" but "Gawker's leadership is prioritizing theoretical anonymous tipsters[…]"

    Even if that's just a pretext, that's different from "we are lazy and/or don't care."

  58. Doctor X says

    … the Folkerson v. Circus Circus case (107 F.3d 754, if anyone's interested): you must prevent third party customers from molesting your mimes.

    True or not that is still an awesome sentence.

  59. says

    And why do anonymous accounts have to be allowed to post comments anyway? If Gawker wants them so people can submit anonymous tips, allow only just that and have the tips go straight to the editors (with no in-line images allowed, so editors have an idea of the quality of the tip before opening any attachments). Gawker allowing accounts based on accounts at other sites complicates things a bit; the amount of verification those other sites do has to be taken into account when deciding whether the new account is considered anonymous or not.

    A Person: That'd make it worse. Gawker not caring is bad enough. Gawker knowing about the problem and making a deliberate decision to allow it to continue because doing something about it would interfere with Gawker's business priorities is the kind of thing that makes judges and juries decide to throw the book at the offender.

  60. Robert What? says

    "everyone isn't entitled to the law being what they think it is"

    How quaint. Unfortunately that memo hasn't reached the Obama administration. Why then don't we all get to pick and choose?

  61. AlphaCentauri says

    @Todd Knarr: Jezebel is aimed at women, and women with two brain cells to rub together are cautious about posting any personally identifiable information on the internet .. lest all those violent rape porn gifs start showing up in their email inbox at work. 4channers have been known to look up phone numbers and troll that way, too.

    @Nicholas Weaver: The karma system filters out trolls effectively, but it also filters out thoughtful comments from all but the usual vociferous contributors. It reduces the diversity of ideas in the discussion. I find it frustrating to read comments where half of them are hidden and half the rest are commenting on hidden comments.

  62. The Invisible Man says

    "[Jezebel] spent a significant part of their journalistic career rallying up various internet lynch mobs"

    Jezebel is widely known to write articles for the sole purpose of being click-bait, to 'outrage the squares', and to stir up controversy to drive clicks, which drives ad revenue. That is their business model.

    There is a reason Gawker wanted it to go on… and it's exactly the reason Jezebel exists in the first place – it makes them money. The outrage Jezebel is expressing is exactly the outrage they try to drum up outside of this controversy. In fact, I wouldn't be a bit surprised if this was 'one of their own' doing it.

    Now stick with me here, because reading this story fresh is like walking into a bar fight at the end, and picking sides;

    Previously, the 'mods' over at Jezebel had more power to moderate posts. I use quotes because these were VOLUNTEER people who wanted to be given the duties of moderating the site, unpaid. Then Gawker took away their 'power', and the mods got upset (because they are entitled to these powers of moderating on sites someone else owns and runs, after all). Shortly afterwards, these 'attacks' started showing up, taking advantage of EXACTLY the abilities that were removed from the volunteer moderators, and their 'demand' was to be given EXACTLY that power again that they previously held and were vocally upset about having taken away (search around; you will find these discussions). Here's just one of them:
    "Yeah it's the unpaid, volunteering our free time aspect that really pisses me off about the lack of mod powers nowadays. We invested a LOT of time and emotions on our respective forums."

    When this return of powers was denied (because it was being abused to the point of turning the site into an echo chamber), the complaints were escalated publicly, and the 'demands' for it to be taken care of began to include the person who took away their 'powers' of moderation.

    The big picture is the removal of anonymity in comment sections of sites. While it may be repulsive what is being done, I would rather put up with repulsive comments than take away anonymity. Especially if it is being done in the way I think it is.

  63. Cog says

    Just curious, but does this mean they will have to disable comments? And how will they filter their email without exposing someone else? I honestly can't think of lesser measures likely to be effective against a determined troll. Fwiw, I have in fact had a troll mail bomb me with gross porn before. I also got said troll's Internet access taken away for that, but only because he screwed up on exactly one of the thousands of emails and I caught it.

  64. Cog says

    One more thing. If they moderate comments and create an anonymous tip line, won't the "tips" be mostly porn? Can you get people to agree to take care of the tips for them, sans lawsuit?

  65. Mercury says

    Perhaps Gawker Media will conclude that having Jezebel in their stable/portfolio of media assets simply isn't worth the hassle.

  66. markm says

    I am under the impression that in most cases IP blocking does not block an individual, but blocks everyone whose internet connection goes through the same server, unless they evade blocking by spoofing the IP, or redirecting through a different server. IOW, it's a collective punishment that the guilty are more likely to evade (because they know what's going on) than the innocent.

    Blocking images in anonymous comments would prevent anonymous tipsters from attaching the evidence for their tips when that evidence is a photograph. That often reduces "tips" to "rumors", although that may not be much of a drop in quality…

    Requiring tips with images to be sent to an editor for review before posting may or may not discourage the porn spammers. It depends on their motivation. If they're out to splash their crap across everyone's screen, it would stop them. If they are out to shock and harass an editor, it's still working. However, it could make a difference legally. The entire staff is no longer exposed, but just the smaller group that took the job of filtering the crap. In a sane legal system, that would be sufficient to protect the company; in our legal system, I suspect it only makes it possible for the defense to win the case after 5 to 10 years and enormous legal fees, with a good chance that some government agency is paying everything for the plaintiffs.

  67. Dan Weber says

    Markm, "server" is a bit broad. In general[1] each internet connection has its own IP, so each home would have one, and each business would have one. And you can have a help line where people who have been banned can find out why. (If someone in my house was sending out rape porn, I would REALLY REALLY appreciate knowing about it, so I can have a chat with the teenager in my house. I've already commented about business addresses.)

    [1] There are exceptions. Since the goal is to significantly reduce the problem, that's okay. It could certainly in theory turn out that it doesn't work or has side effects that are unacceptable. Each problem domain is slightly different. IANAL, but I expect one would say there is a legal difference between "we tried to stop employee harassment but it didn't work" and "trying to stop employee harassment might not have worked so we didn't bother."

  68. Robert says

    Jezebel screams "RAPE" whenever someone posts a racy photo of a woman, but they have no problem with this filthy porn that objectifies men:

    [Link Removed. Please don't post creepy-ass cartoon porn here]

    The "women" at Jezebel are just misandrist lesbian man-haters.