Facebook Charged Over Targeted Housing Ads Allegedly Discriminating By Race, Gender, Zip Code and Religion


  1. Fuckanator


    Discrimination by zip code? Why would you want to pay to advertise, say, an upscale penthouse to lower-income people that live in the shitty part of town?

  2. smuhta


    Good. Those that support liberal bullshit should be devoured by the same system.


    Target audience is illegal 😉





    I work in apartment leasing and have a Leasing Agent License. Federal Fair Housing laws are a BIG DEAL and are not taken lightly by HUD. This is nothing short of a massive fuckup by Facebook. This honestly pisses me off more than the Cambridge Analytica scandal because it’s much more targeted.

  4. JacobWonder


    Why these laws are ridiculous:

    If I want to sell 10,000 Tupac hats,

    Let’s say I follow “laws” and target blanketly, so your 80-year-old white grandma who doesn’t know who he is HAS to see the ad.
    I’ll sell about 10 hats.

    If I let Facebook target the audience based on conversions, it will probably close in on blacks, and people under the age of 55.
    I’ll sell 100 hats!

    If I target black men between the ages of 22 and 40 I can sell 200 hats because that age group is most likely to love Tupac.

    Same thing goes with a MAGA hat: why spend money targeting Hispanic old men who follow left-leaning political pages? It’s a waste of money. Will some buy it? Maybe, yeah, but the likelihood is lower.

    Want to know a cool fact? Go to Facebook Business, Audience Insights; you can actually see the demographics of certain pages.

    Type Britney Spears, and it will show you the averages and top groups following her; add a filter for only people who show interest in hats, or in purchasing things online.

    It’s that simple: if certain neighborhoods live in poverty and can’t afford to move to a new area, agencies don’t exclude them to be racist, they exclude them because they statistically won’t move out.
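    The arithmetic behind this comment can be sketched in a few lines. Everything here is made up for illustration — the segment names, conversion rates, and cost per impression are hypothetical numbers, not real Facebook data — but it shows why a delivery optimizer concentrates spend on the segment that converts best:

    ```python
    # Sketch of why conversion-optimized targeting concentrates spend.
    # All segment names and rates below are hypothetical illustration values.

    BUDGET = 1000.0            # total ad spend in dollars (hypothetical)
    COST_PER_IMPRESSION = 0.01 # hypothetical cost per impression

    # Hypothetical segments with hypothetical conversion rates.
    segments = {
        "men 22-40 who like Tupac pages": 0.020,
        "all adults, untargeted":         0.001,
    }

    def expected_sales(budget: float, conversion_rate: float) -> float:
        """Impressions the budget buys, times the chance each converts."""
        impressions = budget / COST_PER_IMPRESSION
        return impressions * conversion_rate

    for name, rate in segments.items():
        print(f"{name}: ~{expected_sales(BUDGET, rate):.0f} expected sales")
    ```

    Under these toy numbers the narrow segment returns 20x the sales of the blanket buy, which is exactly the incentive the comment describes — and exactly why HUD cares when the high-converting segment lines up with a protected class.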

  5. Toliver182


    Facebook is not violating the Fair Housing Act; the businesses placing adverts are.

    They are the ones that should know the law and not use the audience targeting features they are not allowed to use for their particular type of advert.

    By this logic, the HUD could charge Ford for providing cars for advertisers to drive round and do leaflet drops to certain neighborhoods and not others.

  6. monchota


    The truth is, if you grew up poor, in certain neighborhoods and schools, with a lack of parents, the data is always going to point to the same groups of people until those problems are fixed. Are these groups in some of these problems because of institutional racism of the past? Yes! But that doesn’t change the situations they are in now, or stop a poverty cycle they continue to be caught in.

    We as a society need to face our socioeconomic problems and fix them. That will never happen unless we face the facts: these algorithms are just taking data and giving you an answer. It’s not bias; it just takes the “can’t do this because it offends people” part out of the equation.

    We start by fixing education and parenting; we really need to stop parents who are not ready for children from having children. There is a direct difference between kids born to uneducated parents at an average age of 21 and kids born to educated parents at an average age of 31. Education needs to be better, with more programs helping young parents, especially single parents. These programs need to teach people to fish, not just give them the fish. That is the major problem with welfare programs today.

  7. Jen-Diki


    Zuck: Yeah so if you ever need info about anyone at Harvard
    Zuck: Just ask
    Zuck: I have over 4,000 emails, pictures, addresses, SNS
    [Redacted Friend’s Name]: What? How’d you manage that one?
    Zuck: People just submitted it.
    Zuck: I don’t know why.
    Zuck: They “trust me”
    Zuck: Dumb fucks

  8. Megazor


    Sooner or later people are going to have to accept that AI and machine learning are probably not going to be very politically correct.

    The alternative is to neuter these tools (as some did with [Weight](https://www.myshapa.com/), lol) so as not to hurt our feelings and moral sensibilities when the output doesn’t match reality. You’re not an obese fucker, you’re “light grey.” Yay you! Keep doing it till your coronaries pop!

  9. Derigiberble


    Not only is HUD charging Facebook with blatantly illegal stuff (Facebook let real estate advertisers literally draw a red line to exclude neighborhoods… maybe they thought that real estate folks would be able to [easily recognize the interface?](https://en.wikipedia.org/wiki/Redlining)), but if you [read HUD’s filing itself](https://www.hud.gov/sites/dfiles/Main/documents/HUD_v_Facebook.pdf) they are going much further:


    >20. To group users by shared attributes, to create a Lookalike Audience, to determine an ad’s “actual audience” during the ad delivery phase, and to price each ad for each user, Respondent combines the data it has about user attributes and behavior on its platforms with data it obtains about user behavior on other websites and in the non-digital world. Respondent then uses machine learning and other prediction techniques to classify and group users so as to project each user’s likely response to a given ad. In doing so, Respondent inevitably recreates groupings defined by their protected class. For example, the top Facebook pages users “like” vary sharply by their protected class, according to Respondent’s “Audience Insights” tool. Therefore, by grouping users who “like” similar pages (unrelated to housing) and presuming a shared interest Respondent’s mechanisms function just like an advertiser who intentionally targets or excludes users based on their protected class.


    That’s basically saying that machine-learning-created shared-interest groups so often fall along protected-class lines that using such groupings to target ads is illegal. I mean, it makes sense: not many people without kids are going to be interested in things like mommy meetups and stroller brands.
