Bombshell new reporting from 404 Media found that Flock, which has its cameras in thousands of US communities, has been outsourcing its AI to gig workers located in the Philippines.

After accessing a cache of exposed data, 404 found documents related to annotating Flock footage, a process sometimes called “AI training.” Workers were tasked with jobs that included categorizing vehicles by color, make, and model; transcribing license plates; and labeling various audio clips from car wrecks.

In US towns and cities, Flock cameras maintained by local businesses and municipal agencies form centralized surveillance networks for local police. They constantly scan for car license plates, as well as pedestrians, who are categorized based on their clothing, and possibly by factors like gender and race.

In a growing number of cases, local police are using Flock to help Immigration and Customs Enforcement (ICE) agents surveil minority communities.

It isn’t clear where all the Flock annotation footage came from, but screenshots included in the documents for data annotators showed license plates from New York, Florida, New Jersey, Michigan, and California.

Flock joins the ranks of other fast-moving AI companies that have resorted to low-paid international labor to bring their product to market. Amazon’s cashier-free “just walk out” stores, for example, were really just gig workers watching American shoppers from India. The AI startup Engineer.ai, which purported to make developing code for apps “as easy as ordering a pizza,” was found to be passing off human-written code as AI-generated.

The difference with those examples is that those services were voluntary — powered by the exploitation of workers in the global south, yes, but with a choice to opt out on the front-end. That isn’t the case with Flock, as you don’t have to consent to end up in the panopticon. In other words, for a growing number of Americans, a for-profit company is deciding who gets watched, and who does the watching — a system built on exploitation at either end.

  • DickFiasco@sh.itjust.works · 19 points · 2 days ago

    The term “AI” has been misused and misrepresented so much that it’s more of an aesthetic than a technology now.

    • village604@adultswim.fan · 2 points · 20 hours ago

      It sounds like you’re interpreting this as Flock using people to generate data that’s passed off to customers as being AI generated.

      That’s not what’s happening. They’re using people to generate data to train the AI model.

      It doesn’t make it better because they’re still using slave labor, but it’s different than them passing off the work of humans as AI.

    • Basic Glitch@sh.itjust.works (OP) · 5 points · 2 days ago

      The main thing Flock is really supposed to do is capture and match pictures of license plates at different locations. It’s not even complex.

      So how tf did they get the green light for the first government contract if they never even had that capability?

      • OldQWERTYbastard@lemmy.world · 10 points · 2 days ago

        Yeah I don’t understand how a private company deploying cameras on the side of roads is even legal.

        Does that mean I can build solar powered raspberry pi units with cameras that do the same thing and pepper them around the country without question?

        • CmdrShepard49@sh.itjust.works · 4 points · 1 day ago (edited)

          This has been the norm for years. Those red light and speed cameras in your town are also owned by private companies. It happens because these leaders are the type of people who go to Bing and type in “google,” or who print their emails out only to scan them back into their computer in order to “save them,” and these slimy company salespeople saw them coming from a mile away.

      • DickFiasco@sh.itjust.works · 8 points · 2 days ago

        Either they conned the government org in charge of purchasing it, or that org just didn’t care enough to look deeper. They got a professional-looking demo that made it look like the tech worked, and signed the contract without a second thought.

        • Basic Glitch@sh.itjust.works (OP) · 12 points · 2 days ago (edited)

          The history of the organization seems very odd:

          https://en.wikipedia.org/wiki/Flock_Safety

          It began as a side project in which the three co-founders built their first video surveillance cameras by hand around Langley’s dining room table. When a DeKalb County detective told Langley that his camera product had helped with solving a home break-in, Langley called the two other co-founders and told them to quit their jobs.

          What?? How did a detective use it to solve a crime? Who was he? And based off of this one dude you all 3 just quit your jobs??? What??

          Then we just jump ahead to 2022 and these cameras that didn’t even work had raised over $380 million in venture funding?

          Then by the next year they were being used to sub for actual police due to a shortage of police officers?

          So they go from the Hardy Boys helping solve a mystery in Georgia in 2017 to Marc Andreessen (big surprise) suddenly funnelling millions into their business by 2023.

          Oh, good, this citation will probably help make clear what the fuck actually happened between 2017 and 2023: Flock Safety. “Media Kit: Our Founding Story”. Flock Safety. Retrieved April 8, 2022.

          • pornpornporn@lemmynsfw.com · 2 points · 20 hours ago

            It began as a side project in which the three co-founders built their first video surveillance cameras by hand around Langley’s dining room table. When a DeKalb County detective told Langley that his camera product had helped with solving a home break-in, Langley called the two other co-founders and told them to quit their jobs.

            That reads exactly like those made up masturbatory LinkedIn anecdotes. I guarantee that this “Langley” guy is the one that edited the page to put it there.