kstrauser 1 day ago

I wrote this. I had/have absolutely no expectation that Flock would comply with my request, but figured I should try anyway, For Science. Their reply rubbed me the wrong way, though. They seem to claim that there are no restrictions on their collection and processing of PII because other people pay them for it. They say:

> Flock Safety’s customers own the data and make all decisions around how such data is used and shared.

which seems to directly oppose the CCPA. It's my data, not their customers'.

Again, I didn't really expect this to work. And yet, I'm still disappointed with the path by which it didn't work.

  • carefree-bob 1 day ago

    They were saying "don't write to us, talk to the people who own the cameras and ask them to delete the data". A company that manufactures video cameras is not the one to talk to when someone records you; talk to the person who recorded you.

    But a reasonable person would say -- the data is stored on Flock servers, not with the camera owners. And Flock would say: just because we sell data storage functionality to camera owners doesn't mean we own the data, any more than a storage service you rent a space from owns what you put in that space.

    But then an even more reasonable person would say: the infrastructure is designed in such a way as to create inadvertent sharing, and the system has vulnerabilities that compromise the data, so Flock has responsibility for setting up the system in such a way that it's basically designed to violate privacy.

    And that is the main criticism of Flock. You need to have a more nuanced criticism. It would be really interesting to see this litigated.

    • fudgy73 1 day ago

      AFAIK Flock owns the cameras and leases them out [0].

      [0] https://www.flocksafety.com/blog/flock-safety-does-my-neighb...

      • mminer237 1 day ago

        If you go to Rent-A-Center and rent a DSLR, that doesn't make Rent-A-Center responsible for the pictures taken by their cameras.

        • kstrauser 1 day ago

          If Rent-A-Center installed the camera in a bathroom, I'd contend that it does.

          Flock's cameras aren't in bathrooms. However, they're still recording people who haven't opted into it. ("But you have no expectation of privacy in a public place!" "You have the expectation that someone might inadvertently overhear you. You don't have the expectation that someone is actively recording you at all times.")

        • danielsunsu 1 day ago

          I think if it were only offline storage it would not be as big of an issue. A more accurate analogy would be renting a DSLR that automatically transmits every picture to Rent-A-Center servers.

        • yabutlivnWoods 1 day ago

          Your example is apples and oranges. Flock maintains private infrastructure that stores data.

          If the DSLR uploaded them to Rent-A-Center owned/leased servers it would in fact require Rent-A-Center to take the necessary steps.

          As Rent-A-Center would be the only group with proper access to the data storage, they would have inserted themselves into the chain of custody, and would thereby have an obligation to ensure others' data is wiped from systems they control.

          • tptacek 1 day ago

            AWS also maintains private infrastructure that stores data. Go write them asking to purge data pertaining to you from S3 and see how that goes.

            • itsdesmond 1 day ago

              Flock has knowledge/use of the data. Their system processes can relate the photos “owned” by two different entities. They’re interacting with it and selling their access to it as a feature. That’s obviously distinct from S3.

              But you knew that.

              • tptacek 1 day ago

                I know quite a bit about Flock, having been intimately involved in the process of evicting it from our municipality, and I don't think the distinction you're trying to draw here is meaningful. Flock will say they provide a service, one avidly sought by the actual owners of the data, to generate analysis based on that data.

                They're contractually forbidden from "selling their access to it" to arbitrary parties; they can share data only with the consent of their customers, almost all of whom actively want that data shared --- this is a very rare case of a data collection product where that's actually the case.

                • dureuill 23 hours ago

                  Except their customer's data isn't actually theirs: OP requested their private data to be deleted from the system. So OP expressed a clear intent for their data not to be used by Flock's customer. We could say that the data thus becomes abusively retained on these systems. As a result, IF Flock has the technical means of performing the requested data deletion, it should be compelled to perform it.

                  This is the same situation as a web hosting provider: if it is communicated to them that one of their customers uses their service to host illegal content, then it becomes the web hosting provider's responsibility to remove that content.

                  Reasonable technical feasibility for the service provider is key here, but it can be argued since the data can apparently be shared in ways that identify OP.

                  Probably not how the law currently works (don't know, not a lawyer), but I guess it should, as otherwise it allows creating a platform that shares abusively retained data without any reasonable recourse for the subjects of this data to remove the data from the platform.

                  • tptacek 23 hours ago

                    I do not believe this is how the law works. Two totally different regimes.

                  • d1sxeyes 11 hours ago

                    The data Flock holds is not owned by OP.

                    If I as a photographer take a photograph of someone, the photo does not belong to that person—the photographer retains the IP and ownership rights.

                    You have rights too, such as privacy/likeness rights, which allow you to restrict what the IP owner is allowed to do with the image that they own, but you do not own the data, and your rights give you a claim against the data owner.

                    Flock probably have legal obligations or contractual commitments not to delete or destroy their customers' data, and changing that is not necessarily a good thing.

                    • kstrauser 4 hours ago

                      That's not the case under GDPR, CCPA, HIPAA, or other privacy regimes which codify our right to decide who can store our personal data and what they can do with it.

                      • d1sxeyes 3 hours ago

                        Can you point me to the part of the GDPR that gives you ownership of data that relates to you? I’m fairly confident that you are assigned rights over personal information as it relates to you, but it doesn’t assign ownership.

                • shye 21 hours ago

                  Flock's facilitation of data-sharing is a huge part of their value proposition over other cameras, and why their customers buy from them over their competitors.

                  As such, even if they can contract it such that they are not legally responsible for such use, they are very much knowingly facilitating it. If this were physical goods, rather than data, they would probably be held as responsible as their customers.

                  • tptacek 20 hours ago

                    I've read our contract. I know what it says. This isn't an abstraction. They can do lots of things. What they actually do is not data brokerage under California Law, at least not that I can tell.

                    • shye 18 hours ago

                      What Flock names the relationship in their contract does not make it one; the courts very much duck-type.

                      Flock knowingly collects PII of people they have no direct relationship with, and transfers it to third parties. Whether that transfer, which Flock seems to gain from, is legally a sale is something to be argued at great expense in front of a court.

                      But regardless of that definition, I do think that any reasonable person (= not a corporate lawyer) would consider there to be a sale of data here.

            • Mordisquitos 1 day ago

              Does AWS actively and by design parse and keep track of personally identifiable information in the data that AWS customers store in their S3 buckets? If that were the case, they would absolutely be subject to CCPA (and GDPR) requests for deletion.

              However, I suspect that is not the case. AWS is agnostic as to the type of data stored on S3, and deletion of PII stored on S3 is the sole responsibility of the AWS customer that chooses to store it.
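              The distinction can be sketched in a few lines of code: an opaque store never inspects the bytes it holds, while an indexing service extracts and keys on PII, which is exactly the capability that deletion requests hinge on. A toy illustration (all class and method names here are invented for the sketch, not any real API):

```python
# Toy contrast between an opaque blob store (S3-like) and an indexing
# service (ALPR-like). Purely illustrative; all names are invented.

class OpaqueStore:
    """Stores bytes under a key and never inspects the content."""
    def __init__(self):
        self._blobs = {}

    def put(self, key, data):
        # No parsing: the operator cannot answer "which blobs mention
        # plate X?" without the customer's cooperation.
        self._blobs[key] = data

    def get(self, key):
        return self._blobs[key]


class IndexingStore:
    """Also stores bytes, but extracts a plate number and indexes it."""
    def __init__(self):
        self._blobs = {}
        self._plate_index = {}  # plate -> list of blob keys

    def put(self, key, data, plate):
        self._blobs[key] = data
        self._plate_index.setdefault(plate, []).append(key)

    def search_by_plate(self, plate):
        # This query is the capability an opaque store lacks -- and the
        # same capability that makes subject-based deletion feasible.
        return self._plate_index.get(plate, [])


store = IndexingStore()
store.put("cam1/0800.jpg", b"...", plate="ABC123")
store.put("cam2/0805.jpg", b"...", plate="ABC123")
print(store.search_by_plate("ABC123"))  # both sightings, queryable by subject
```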

            • danudey 1 day ago

              If AWS maintained private infrastructure that stored and indexed data associated with people's license plates and vehicles and then charged customers to do searches against that data then yes, you could write them to ask them to purge data pertaining to you.

              If Flock was just an opaque cloud storage service for law enforcement to back up their mass surveillance to then sure, your argument would have merit; it's not, it's a giant database of photos, locations, times, license plate information, and likely a lot more. They're not selling cloud storage, they're selling (leasing?) surveillance devices and tools.

              • tptacek 1 day ago

                The argument you're making implicates way more than just Flock, and is in a practical sense novel. If you can cite jurisprudence (or even legal experts) backing it up, I'm interested in reading it. Otherwise, I'm happy to accept that we just have premises about the law that are too far apart for an argument to be productive.

                My experience on HN is that these kinds of discussions almost immediately devolve into debates about what people want the law to be, as opposed to what it actually is.

                • Karrot_Kream 1 day ago

                  Realistically speaking, you're never going to get pro-Flock people in any numbers on this site writing comments at all. The anti-surveillance position's popularity when it comes to upvotes, downvotes, and flags on this site is such that antis will continue posting about what they want the law to be and pros will stay out. That's just how crowd voting dynamics shake out.

            • yabutlivnWoods 22 hours ago

              I don't live in a state with a law like California's so your "gotcha" isn't relevant.

              Californians would have standing under the law but need expensive lawyers to litigate.

              AWS has employed expensive lawyers to argue semantics; they host OS VMs and databases. This provides them legal cover for what AWS customers store.

              Amazon the retailer stores customer data. A non-customer would have standing under California law to litigate removal of PII should they decide to hire lawyers.

              Your reductionism is to law what a Linux beige box on a routable IP, no firewall, hosting a production health database with creds set to admin/pwd1234 is to software engineering.

              Coincidentally 1234 happens to be the code to my luggage.

            • taotau 20 hours ago

              Devil's advocate here. There is currently an article on the front page about a US bill to compel operating systems to collect age-verification/ID data. If something like that were actually in place and every packet on the internet were stamped with your digital ID, then you could feasibly demand that AWS purge/filter your data out of their systems.

              • tptacek 20 hours ago

                You can't even request that AWS delete your actual PII from S3. If you've been to a doctor in the last 2 years, you have HIPAA PII somewhere on S3, and AWS won't do a thing about it for you. I don't know why people have this idea that service providers will scrub their customers' data for you.

      • ldoughty 1 day ago

        But the data collected is property of the government, and Flock is not allowed to use that data for additional business gain (according to their statements)...

        So they can't sell the fact that you're at Target at 8:00 p.m. on Thursday to anybody... Nor build profiles to sell to advertisers... And if that's the case that's very similar to cloud storage vendors.

        If I access Hacker News, and the record of my visit is stored in an AWS S3 bucket, I can't submit a request to AWS to delete my visitor record. Even though the server, network cards, wires, and storage medium are AWS property, it was Hacker News' website that generated that record, and it's their responsibility to handle my request to delete it. AWS' stance would rightly be "talk to the website operator for CCPA requests".

        • tptacek 1 day ago

          This is also true according to their contracts (we were one of the first munis in the country to ostentatiously cancel our Flock contract, and the lead up to that was a bunch of progressive legal experts poring over that contract looking for holes.)

          • fsckboy 1 day ago

            >a bunch of progressive legal experts poring over that contract looking for holes

            all attorneys represent their clients; your attorney does not have to share your opinion of the law or public policy, they can still interpret what the law means to you.

            if you are afraid your attorney might have a bias (they are human) you may get better advice from the "misaligned" POV: the flaws/holes in a privacy law found by a pro-business conservative attorney are more likely to find sympathy in the courts from both fellow conservatives and progressive judges.

            • shermantanktop 1 day ago

              As a practical matter, this may be good advice. But it also places a demand on someone with a legitimate concern that they go find an ideological "beard" to make themselves more palatable and sympathetic.

              It's not hard to see how this enables an institution to gate itself from criticism.

        • thaumaturgy 1 day ago

          Except that Flock very clearly benefits financially from having direct access to this data: owning (and in their own documentation, they very clearly do own it) a network of 80,000 surveillance devices across the country, and owning every single transit point for the data they collect, is what gets them to a $7.5 billion valuation from investors.

          The fact of the matter is that Flock is playing two-step with the concept of "ownership" of data. They disclaim ownership as a way to leave local agencies holding the bag for liabilities, but they fight tenaciously to retain complete and unfettered access to that data.

          (After organizing a community group that won Flock contract cancellations in multiple jurisdictions in Oregon, I went on to coauthor state legislation regulating ALPRs. I am very well familiar with all the dirty ball they play.)

          Also, Flock's cameras collect more data than is provided to police agencies. Who owns that data, I wonder?

          • necovek 1 day ago

            That makes them a data broker in my reading, and at least in California, the Data Broker legislation should apply. The CA Data Broker registry gives me "access denied", but that could be because I am outside the US.

            • ScoobleDoodle 1 day ago

              I looked it up at https://cppa.ca.gov/data_broker_registry/ and didn't find Flock / Flock Safety in that list of the currently registered 566 data brokers.

              • tptacek 1 day ago

                Because Flock isn't a data broker. Flock's customers own their data, not Flock, and they use Flock's platform voluntarily to share data with other customers.

                • necovek 1 day ago

                  I was referring to the claim that "Flock's cameras collect more data than is provided to police agencies" — that suggests that there is data not "owned" by the customers, which implies it's Flock's data, thus it might make them liable under Data Broker legislation.

                • cwillu 1 day ago

                  Equivocation. My stock broker doesn't own my stocks either, they merely hold my assets in a brokerage account.

                  • tptacek 1 day ago

                    I encourage you to present that analogy to an actual court and see how far it gets you. It's very easy to find the statutory definition of a "data broker" under California law.

                    This is what I mean by the fruitlessness of these kinds of legal discussions on HN. What do you want me to argue, that you're wrong to want the law to work that way?

                  • jaredwiener 1 day ago

                    And you would (rightfully) be angered if your stock broker sold your shares and pocketed the proceeds, because you own them.

                  • FarmerPotato 18 hours ago

                    Technically, most stocks are registered in the name of a securities holding company, with you named as beneficial owner. That makes it frictionless for you to buy and sell. You enjoy all the rights of ownership, unless the broker lends your shares out to someone else.

                    You _can_ get shares registered in your name.

                • tadfisher 1 day ago

                  Flock charges to access the data which is voluntarily shared by other customers. I am struggling to note a difference in this practice from any other data brokerage service in existence.

                  Does Flock do some kind of P2P dance to avoid the data transiting their systems?

                  • rjmunro 20 hours ago

                    Legally how does it work if I upload a file to Google Docs and then share it with my contacts? Is Google then a data brokerage for my files?

                    • tptacek 20 hours ago

                      They are not, because they are not operating a business that acquires and resells your data. You own your document, and Google isn't selling it to third parties. Flock doesn't own municipal data, and Flock is also not "selling it to third parties"; it's facilitating a sharing system that law enforcement agencies avidly desire.

                      Presumably the California data brokerage statutes were written specifically to prevent the kind of nerd-lawyering happening on this thread.

                • close04 1 day ago

                  So… Flock uses their own platform and top-to-bottom tech stack to do everything technically? Your local PD doesn't use random cameras (like Reolink), doesn't run a custom software stack (like Frigate in a container on some random VM hosted with AWS), doesn't store the data wherever (like Backblaze)? The customers just have to install the Flock cameras and "order" the subsequent data from Flock? But you say they're not at all responsible or accountable for any of it because, despite doing everything at every step, they're "just a broker"?

                • unethical_ban 1 day ago

                  If Flock's customers, using Flock's infrastructure or tooling, can share data with each other, that would be bad.

                  I'm not saying that's what's happening, but that's what I thought was happening before reading this thread, and now I have to go and run through their policies.

                  Either way ALPRs and AI-facial scanners in public are a huge violation of privacy and I loathe them, but I hope it's correct that Flock customers cannot easily share information with one another.

                  • FireBeyond 18 hours ago

                    > If Flock's customers, using Flock's infrastructure or tooling, can share data with each other, that would be bad.

                    Ex-employee of Flock here, that's ABSOLUTELY what's happening.

                    And what's more, Flock lets them do so even when they know the agencies are legally not permitted to. They turn a blind eye and say it's not their problem to enforce ("oh, doing so in state X is illegal? Well, even if your agency is in state X, we didn't disable that feature"), then happily provide training to enable those agencies to do so (and it's a nudge-nudge-wink-wink part of the sales process).

                  • tptacek 18 hours ago

                    Sharing data between customers is a large part of the point of the product.

        • unethical_ban 1 day ago

          This is worth validating independently, but to be clear:

          Are you saying Flock itself does not have access to any of the data, and that the data they store on behalf of local governments is not fed into any central datalake? That every organization's data is completely, unalterably separate from everyone else's?

          If so, that makes the panopticon slightly less powerful.

        • valeriozen 1 day ago

          The AWS analogy breaks down because AWS doesn't encourage customers to pool their S3 buckets into a nationwide searchable index.

          Flock operates a federated network. If you drive past an unmarked camera, you have absolutely no way of knowing which specific HOA or town leased it, so how are you realistically supposed to know who the "data controller" is to send your CCPA or deletion request to?

          • giancarlostoro 1 day ago

            Start standing in front of the cameras looking sketchy long enough till police are sent out to ya, then ask the cop who called.

            • chaps 23 hours ago

              Someone once dropped some fireworks not too far from me at 3am a few years back. They were loud and, yeah, cops were called. A few minutes later about five cars drove past me at about 30 mph over the limit. Not sure how they didn't see me or try to see me. But I know they didn't catch the BRIGHT orange and lifted car.

              Me being me, I submitted a FOIA request for the dashcam footage of the five cop cars and the dispatch logs.

              Instead of pulling over the easily identifiable car, they pulled over some random guy. They were behind him the whole time but five cop cars pulled behind him thinking that he fired a gun a few minutes back.

              He was let go without a citation, but the official reason, despite the stop being paired with the dispatch record for the firecracker call, was a broken headlamp.

              • giancarlostoro 21 hours ago

                I may or may not know a business owner who got criminals off their business' street by saying he thinks he saw a gun any time criminals showed up to do things, everything from prostitution to selling drugs. Cops showed up immediately. They stopped coming by altogether, probably the safest street in quite a rough part of town.

                It's crazy how cops just rush to very specific and nuanced crimes. Someone likely said they heard gun shots, and then they scrambled to find them.

                • hrimfaxi 19 hours ago

                  > It's crazy how cops just rush to very specific and nuanced crimes. Someone likely said they heard gun shots, and then they scrambled to find them.

                  Is it crazy? Shouldn't the response be proportional?

                  • Forgeties79 19 hours ago

                    The police should do a lot of things they fail to do.

                  • giancarlostoro 18 hours ago

                    Contrast to someone being shot dead, if the killer drives away, they might be there half an hour later.

                • fwipsy 19 hours ago

                  Police prioritizing responses to violent crimes where lives may be in danger seems reasonable to me.

        • jnovek 1 day ago

          I don’t care. I don’t care who owns the data. If I can’t easily get private information like my movements removed from a database like this, the legislation does not sufficiently protect me.

          It should absolutely be Flock’s responsibility to remove my data and we should absolutely require it by law. Full stop.

          • lazide 1 day ago

            A reasonably nuanced defense could likely claim that being able to do what you want would have much worse side effects on privacy.

            For example, would you want to be able to tell Public Storage (or some other storage unit place) to remove any naked photos of you stored anywhere in their storage units?

            For them to actually be able to do that would require that they have nigh-omniscience over everything stored by/for everyone in every one of their storage units. Even inside closed boxes.

            Now, it's not the same thing of course - but hopefully you understand what I'm referring to?

            • LadyCailin 23 hours ago

              Except that the analogy is that they already have, or can easily create, that list. If they couldn’t, their value proposition would be lame. “We know you’re looking for a specific license plate, here’s a million hours of footage from all over the city, have at looking through it all.”

              • lazide 23 hours ago

                Only for paying customers, which you aren't of course. If those customers paid public storage to inventory their stuff, then that inventory is their property. Surely it would be inappropriate to use their inventory data to find your naked photos. A violation of privacy even. (/s, kinda)

                I was enumerating the likely defense, not that it's valid.

                • shakna 11 hours ago

                  "Existing capability" removes the argument against onerous requirements, in a legal setting.

          • tptacek 1 day ago

            The law cares about lots of things we don't care about.

            • jnovek 5 hours ago

              The law is there to serve society. If it is not effectively serving society, it should be changed.

          • rjmunro 20 hours ago

            The problem with this is: where do you draw the line? If I film you with my iPhone (e.g. you walk past in the background of my video), should Apple delete my video from my phone and iCloud account based only on your instructions?

            Apple hold the data in iCloud, Apple (or a phone network) may be leasing me the phone. That sounds pretty similar to the Flock situation.

            I guess the difference is that Flock might be sharing the data from a customer's camera with other customers. Then they are definitely controlling it.

            I think the bigger problem with Flock is the fact that their cyber security is so laughably bad that non-customers can easily access the data.

            • psychoslave 17 hours ago

              Not pronouncing on which path is the most dystopian; just for the fun of the exercise, what if we push in that direction:

              Given the rule, I would expect (IANAL) that Apple should not have to deal with data stored on phones they sold.

              People are responsible for what they store on their device. When I take a photo in the street, if someone comes to me asking me to erase a photo because they or their kids were in the background, I'll tell them I don't publish any photos online, which is generally the concern people have in mind, and it usually stops there; but if they insist, I will remove it from my phone. Because I'm too lazy to actually edit the photo and remove them from the picture, even if that is certainly doable with a simple prompt by now.

              Now, if Apple automatically stores photos on some remote server they own, they are the ones who should be responsible for making sure they don't store something illegally. Microsoft, Google, and Apple use PhotoDNA to detect known CSAM, if I'm not mistaken. Though legally they only have to remove it once they get a notice about it. In the same way, they could proactively blur the faces of people not recognized as those whitelisted for the uploading account. And, by that logic, they should certainly remove the information regarding a person if they get a notice, just as they wouldn't keep CSAM data once notified, would they?
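              The notice-driven removal flow described above can be sketched generically. A minimal illustration, using an ordinary cryptographic hash as a crude exact-match stand-in for PhotoDNA's perceptual hash (all class and method names here are hypothetical, not any real API):

```python
import hashlib

# Toy notice-and-takedown matcher. PhotoDNA is a proprietary *perceptual*
# hash that tolerates resizing and re-encoding; SHA-256 here is an
# exact-match stand-in, used only to show the control flow.

def fingerprint(data):
    return hashlib.sha256(data).hexdigest()

class HostedPhotos:
    def __init__(self):
        self._photos = {}        # photo_id -> bytes
        self._blocklist = set()  # fingerprints received via notices

    def receive_notice(self, data):
        """A notice identifies content the host must not retain."""
        self._blocklist.add(fingerprint(data))

    def upload(self, photo_id, data):
        if fingerprint(data) in self._blocklist:
            return False         # refused proactively at upload time
        self._photos[photo_id] = data
        return True

    def purge_matching(self):
        """Reactive sweep after a notice: drop already-stored matches."""
        removed = [pid for pid, d in self._photos.items()
                   if fingerprint(d) in self._blocklist]
        for pid in removed:
            del self._photos[pid]
        return removed

host = HostedPhotos()
host.upload("a", b"ordinary photo")
host.upload("b", b"noticed content")
host.receive_notice(b"noticed content")
print(host.purge_matching())  # the already-stored match is removed
```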

              Anyway, the underlying issue is not who stores what, but what societies lose by letting mass surveillance infrastructure be deployed, no matter how the ownership/responsibility dilution game is played on top of it.

            • jnovek 5 hours ago

              Are you using your phone photographs to track my movements? I don't care about the photographs part, I care about the "collecting data that can track my movements" part.

              I don't mean my movements on the internet either. I understand that those things are easy to track. I mean in real life.

              As far as responsibility for the data goes, you're right, it's not clear. Therefore, anyone who uses the data -- Flock or their customer -- should be required to delete it on my request.

              That seems like a pretty clear delineation, no?

          • _DeadFred_ 19 hours ago

            The legal term is 'distinction without a difference'. Flock/others can't create a weaselly scenario to pretend it's something else. Otherwise people could bypass all kinds of laws/rules just by giving some weaselly description to everything.

            This also falls under the 2026 rule "Everyone Is 12 Now". Flock is literally acting like a 12-year-old to get out of following the rules. My 12-year-old tried to use this same dumb parsing of things to avoid rules/consequences.

        • eagleinparadise 22 hours ago

          If I lease out a property to a tenant (apartment, retail, industrial use, whatever) and that tenant is committing an illegal activity on the property, would the landlord be liable for knowing about it? Or not?

          "Sorry FBI, the tenant using my warehouse to manufacture cocaine is not my responsibility. I won't do anything about it. You deal with them."

          Nope, that's a failure of a duty to act, and aiding and abetting a criminal activity if you have constructive knowledge.

        • wavefunction 16 hours ago

          I assume they are building "metadata" profiles of people based on the data they say they can't use directly. That seems like an easy workaround that satisfies the lip service they've given to the issue.

        • gorgoiler 15 hours ago

          That’s a pretty compelling argument, but what if I went round to AWS’ house, peeked into their kitchen, and saw a crate of photos on their table with me in them?

          I’d absolutely say:

          “Hey, that’s me! Give me those right now!”

          I’d also be pretty angry if they told me:

          “Sorry we’re storing those for Corp Inc. Go ask them.”

          To refute my own point though, this only sounds annoying because the data processor is being irritating by manually referring me to the data controller. In practice, it would be trivial for them to automatically forward communications between me and the controller.

          That's what feels amiss with the top-level article.

        • marcus_holmes 13 hours ago

          > But the data collected is property of the government

          I thought this was the get-out clause from the constitutional problems with Flock? That because Flock is a non-government organisation it isn't restricted by the constitution (i.e. the constitution only restricts what the government can do).

          They can't have it both ways - if Flock are collecting the data then they are subject to the privacy laws. If it's the government collecting the data via Flock as just a service, then they are subject to constitutional restrictions.

      • samrus 19 hours ago

        Yeah, but their argument is that if someone takes a photo of you with their iPhone and it's uploaded to iCloud, you can't ask Apple to delete the photo; you need to ask the person who took it.

    • mistrial9 1 day ago

      the way this has been addressed in complex product liability in the past in the USA is that the public-facing Brand Owner has certain legal liability for the product, despite contractors or supply chains. In this case, it appears that the Flock company is the brand owner and is public-facing.

    • halJordan 1 day ago

      It easily goes both ways. But we do sue American gun makers for deaths caused by lunatics. We sue drug makers for drugs prescribed by a doctor. We sue cloud providers for not reporting illegal photos. Printers are forced to ID every printed page to combat counterfeiting. Banks are forced to close accounts even though it's not their dirty money.

      • thebaine 1 day ago

        Most of those examples have to do with the manufacturers knowing that their products were dangerous, addictive or illegal and advertising them aggressively as safe. They're mostly litigated on product liability claims. Banks are regulated by an entire architecture of laws, and we could enact laws that would regulate Flock too, as one of the other commenters pointed out they're doing in Oregon.

    • dozerly 1 day ago

      I don’t think you’re informed on the topic. They do not just manufacture cameras.

    • robot-wrangler 23 hours ago

      > They were saying "don't write to us, talk to the people who own the cameras and ask them to delete the data".

      The response to this should just be, "Yes, very well, please divulge a complete list of your customers, their contact information, and information about camera locations so I will be able to pursue this per instructions".

      When that obviously doesn't work either, we can all agree the law as written is completely useless, and feel great about rewriting it in a way that's calculated for maximum damage to both the vendor and their customers, and collateral damage to the whole panopticon. Or, just spitballing here, we can skip to the punchline and do all that anyway.

      • MaxikCZ 13 hours ago

        > we can all agree the law as written is completely useless

        But they won't agree; they find the law very nice in its current form, and have more brib.. ehm, lobbying money for it to stay that way.

    • beambot 22 hours ago

      Isn't this the equivalent of asking Google to delete your image off every Android phone (not just yours)?

    • justinclift 20 hours ago

      Wonder if Flock could be DMCA'd to remove the data they're hosting then? :)

  • themafia 1 day ago

    These laws get complicated quickly. There's a specific ALPR law in the CA civil code which seems to carve out several exceptions for a business like Flock:

    https://leginfo.legislature.ca.gov/faces/codes_displayText.x...

    The enforcement provisions are rather bleak as well and afford no opportunity to directly bring a case against the agency that operates the system but instead just the individual who misuses it.

    I think one of the more direct attacks would be going after jurisdictions that chronically have officers misusing the system. I think you're going to have to create precedent in this way to foment actual change.

  • everdrive 1 day ago

    Nice work all the same. These systems need to be prodded and tested. Even unsuccessful results such as this tell us something about the situation we're in.

  • nainachirps_ 1 day ago

    I am not a lawyer myself, but can't one argue that this company has a duty to ensure that the data it is processing for a client is legally obtained?

    If they are processing data after being told it was not obtained with consent do they not have any liability?

  • Glyptodon 1 day ago

    I think you should write them back and ask that they provide you with a customer list and continually update you as they get new customers so that you may follow the advice they've given you.

    • kstrauser 1 day ago

      Ooh. I like this.

      • kube-system 1 day ago

        Why? The response is predictable. No company is going to give you a customer list.

        • kstrauser 1 day ago

          I wouldn't be so sure. In this specific case, they told me to ask their customers to comply with my CCPA rights. Without that information, it's impossible for me to exercise those rights. IANAL, but that sounds like a pursuable path.

          • kube-system 1 day ago

            I don't see any provision in CCPA that requires that. And outside of an explicit requirement to do so, nobody is required to help you.

            Edit: from https://oag.ca.gov/privacy/ccpa

            > If a service provider has said that it does not or cannot act on your request because it is a service provider, you may follow up to ask who the business is. However, sometimes the service provider will not be able to provide that information. You may be able to determine who the business is based on the services that the service provider provides, although sometimes this may be difficult or impossible.

            • bloppe 23 hours ago

              Perhaps the next step is writing to your state representative, or to Alastair Mactaggart, and complain about this hole in the legislation.

        • singleshot_ 1 day ago

          I’ve been in the lobby of hundreds of different technology companies and my understanding is that all companies are going to give you a customer list, you just have to look in the right place.

          • kube-system 1 day ago

            Ha -- you know what I meant: explicitly.

  • ratdragon 1 day ago

    maybe eff.org would be able to help you lawyer up or otherwise to push this forward. good luck!

    • kstrauser 1 day ago

      I've reached out, albeit very informally. I'll completely understand if this isn't something they have the time and energy to help with. If I accidentally caught them at a weak moment when they're looking for something to do, though, I'm all in.

  • tptacek 1 day ago

    Wait, is it your data? If you drive your car in front of a Ring camera on my house (I don't have a Ring camera don't @ me), is it your claim that you own the data on that camera?

    • kstrauser 1 day ago

      Did you put up a Ring camera on a stand in front of your house for the specific purpose of selling the fact that I drove past at a specific timestamp? If so, yes. The CCPA[0] gives me explicit legal rights:

      * The right to know about the personal information a business collects about them and how it is used and shared;

      * The right to delete personal information collected from them (with some exceptions);

      * The right to opt-out of the sale or sharing of their personal information including via the GPC;

      This isn't someone incidentally taking pictures of license plates in an otherwise noncommercial setting. It's a company literally created to collect and sell PII. Laws are different for them than for us.

      [0]https://oag.ca.gov/privacy/ccpa

      • snowwrestler 1 day ago

        “Personal information” has a legal definition and photos of you in a public street might not satisfy it, regardless of the photographer’s intent.

        • kstrauser 1 day ago

          I think it'd be challenging to rule that a license plate number is not personally identifiable information, when the same regulations often state that an IP address is.

          • uoaei 1 day ago

            "Anyone could have been driving my car, you can't positively identify me in the driver's seat with the evidence you have submitted" is routinely used to toss out cases involving traffic violations. It's not necessarily common but it does happen. By this logic a license plate does not personally identify the person driving, only the person the car is registered to.

            • lcnPylGDnU4H9OF 1 day ago

              Right, but in this context the license plate number is still personal information, just of a different person.

              • uoaei 1 day ago

                Then the key aspect of our discussion is the "identifiable" part, which you've left out.

                • lcnPylGDnU4H9OF 1 day ago

                  Are you now saying that one cannot possibly "identify" the "person" who owns a vehicle, solely with the "information" on a license plate?

                  • ndsipa_pomu 6 hours ago

                    You may be confused, but the owner of a vehicle is not necessarily the same as the driver of a vehicle. Recording a vehicle's license plate does not necessarily identify the driver unless they also happen to be the owner. (c.f. vehicle hiring companies)

              • kube-system 1 day ago

                The CCPA explicitly says:

                > “Personal information” does not include [...] Information that a business has a reasonable basis to believe is lawfully made available to the general public by the consumer

                • tptacek 1 day ago

                  California has an entire statute regulating ALPR information, so we don't need to derive this axiomatically.

                  • kube-system 6 hours ago

                    One generally has to follow all of the laws, so evaluating what the CCPA says here is relevant for evaluating CCPA compliance.

            • danudey 1 day ago

              Yet the same is true of IP addresses. You (typically) cannot know for certain whether traffic from an IP address was originated by a specific person and yet it's typically considered PII because it can be used in conjunction with other information to identify you.

              Even your full legal name and birth date cannot be guaranteed to refer only to you specifically (as there could be someone else with an identical name and birth date), but it's obviously still PII because it helps narrow the field immensely if you can combine it with other information - for example, your IP address.

              So yeah, "anyone could have been driving my car", but if you also know that the car drove from your home to your work then that narrows down the list of likely individuals immensely.

              Conversely, if your license plate was spotted parked near an anti-ICE rally, then they can be pretty confident that you or someone you know was near an anti-ICE rally, which means they can harass you about it, follow you around, shoot you in the street, etc.

            • tomwheeler 1 day ago

              The standard of proving someone's guilt in a crime or civil infraction is higher than the one for inferring that someone could plausibly be the person you want. This is the basis of parallel construction, wherein a government agency plays a game of "pin a crime on the suspect."

          • adrr 1 day ago

            Or home address, phone number, etc

        • lcnPylGDnU4H9OF 1 day ago

          Indeed, that definition is included in the CCPA.

          > (v) (1) “Personal information” means information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household. Personal information includes, but is not limited to, the following [...]:

          > (E) Biometric information.

          > (H) Audio, electronic, visual, thermal, olfactory, or similar information.

          https://leginfo.legislature.ca.gov/faces/codes_displaySectio...

          To your point, the intent would presumably still matter for exceptions to when deletion requests must be honored (say for journalism), but a photo of someone walking down a public street would still logically be considered the subject's personal information, by the above definition.

        • necovek 1 day ago

          Personal information usually does include photos of someone in public without their consent: exceptions usually hold for taking photos of people where it is in the public interest to be able to show them or impractical to get consent. This covers large gatherings and celebrities, but a portrait photo of a stranger might put you on the wrong side of the law.

          Obviously, the idea is to not disallow having someone take a photo of you as a background, passing figure as they take a front-and-center photo of their family, but not allow you to be the main subject unknowingly and especially when you object explicitly.

          On the other hand, a photographer still owns the copyright to a photo, so a subject (including in a portrait) cannot claim it or distribute it without permission even if they can potentially stop the photographer from distributing that photo.

          IANAL, but you are not by default allowed to use anyone's "likeness" for your individual profit.

          • patrickmay 1 day ago

            > Personal information usually does include photos of someone in public without their consent

            This is not the case in the United States. There is no presumption of privacy in public. In fact, there is a whole genre known as "street photography" that involves taking pictures in public without explicit consent of the subjects.

            • tadfisher 1 day ago

              This is true, and it may also be true that location tracking through surveillance networks crosses a line into violating one or more Constitutional rights. One of Flock's revenue streams is explicitly selling access to data made available by other customers. A commonly-cited example is the ability of local law enforcement to locate abortion suspects in other states using the Flock camera network [0]; one could imagine dragnet-style or geofenced queries to also cross the line.

              [0]: https://www.eff.org/deeplinks/2025/10/flock-safety-and-texas...

              • tptacek 1 day ago

                People keep making this claim that Flock "explicitly sells access to data", but the link you provided doesn't demonstrate that, and Flock contracts I've read contradict the claim.

                  I think what's happening here is that people are trying to colloquially define "selling access to data" to fit the camera data sharing that Flock enables, and then saying that because you have to pay to be a Flock customer to get access to that data, they're effectively selling it. I don't think that's how data brokerage laws work. Flock doesn't own the data they're providing access to, and they're providing that sharing access with the (avid!) consent of their customers.

            • FireBeyond 18 hours ago

              > In fact, there is a whole genre known as "street photography" that involves taking pictures in public without explicit consent of the subjects.

              Try taking an upskirt photo of someone in public without their explicit consent. You'll find that there are limitations to that under both Federal and State laws.

          • snowwrestler 1 day ago

            You’re getting mixed up about commercial use and personal information.

        • tadfisher 1 day ago

          Well, it matters if the photographer is the government (or contracted by the government, or subpoenaed by the government): see Chatrie v. United States [0]. The Fourth Amendment exists, and it remains to be tested whether querying massive surveillance networks is a "reasonable" search.

          [0]: https://www.scotusblog.com/cases/case-files/chatrie-v-united...

          • snowwrestler 1 day ago

            True but the Fourth Amendment doesn’t rely on the CCPA for authority!

      • tptacek 1 day ago

        I think you're going to find that you're wrong about this, and that you're not going to get anywhere targeting Flock in particular, as opposed to the owners of Flock cameras. Consider that California has had multiple rounds of legislation about red light cameras, including new limitations on them, and all of those cameras are also from commercial providers collecting your PII. Your argument proves too much.

        You're going to get a lot of cheerleading and support about this in venues like HN and Reddit, because you're narrowcasting to an audience already primed to be hyperconcerned about surveillance technology (I am too). I think you're going to find those attitudes do not in fact generalize to the public at large, and especially not to the legal system.

        Best of luck either way. It'll be an interesting experience to write up, and I'm happy to read about the outcome, even if I do think it's highly predictable.

        • john_strinlai 1 day ago

          >you're not going to get anywhere targeting Flock in particular, as opposed to the owners of Flock cameras.

          fyi, flock owns the cameras.

          "We operate using a lease model. What does that mean? Since we own the hardware, we own the problems that occur."

        • kstrauser 1 day ago

          FWIW, I just did this as an experiment and turned it into a blog post afterward. I didn't really set out with an agenda or a deliberate audience, and I didn't share it here. Don't get me wrong, I'm happy to chat about it! But this ended up here without any special effort on my part.

          • tptacek 1 day ago

            I know the feeling! My arguments here are positive, not normative. I don't know that I think it would be a worse world if your hypothesis was correct. I'm just reasonably sure it isn't.

            • kstrauser 1 day ago

              For sure. This is the sort of conversation I'd typically rather be having at a bar with appropriate beverages. If it sounds like I'm arguing, it's because it's the kind of thing I'd debate with my friends for the fun of it.

    • mindslight 1 day ago

      "Your data" isn't really a well defined term, right?

      But yes, data that can be used to track my movements in my vehicle is certainly a type of personally identifiable information. I'd argue there should be some exemptions for individuals operating on a small scale, which I believe the CCPA has (and if we actually got a US GDPR, that it should have). But also that kind of exception shouldn't apply to a camera jointly operated by and backhauling to Ring.

    • necovek 1 day ago

      I believe you are still the owner of that data, but if you are holding someone's PII (which time of passage, car model, and license plate can be argued to be, especially with a fixed location), then according to privacy regulations, they can ask a business to remove it unless the business has a legally acceptable reason to keep it (e.g. they hit-and-run a parked vehicle).

      Now, with you likely not keeping that Ring tied to a business account, how that applies to non-businesses holding PII is a different matter.

    • danudey 1 day ago

      The laws say that data about you is your data, information about you is your information. No one is saying that you "own the data", but by virtue of the data being personal information about you specifically, you are allowed to exert control over that data, such as asking for it to be deleted.

      • kasey_junk 1 day ago

        That view of data ownership is _highly_ jurisdiction dependent and is not the overwhelming norm in the US.

        • captaincrisp 1 day ago

          While that's definitely true, in this particular case he's invoking his rights under CCPA.

          • tptacek 23 hours ago

            They're invoking a right they do not in fact have under CCPA. Flock is a service provider under CCPA, and isn't required to respond to their request so long as they're operating under the terms of their contract with the municipality (which is, in turn, exempt from CCPA.)

    • windexh8er 17 hours ago

      Who paid for the camera? If I did with taxpayer dollars then, you're damn right I should have a say.

      The "my Ring camera" trope is a fun strawman, though.

      • tptacek 17 hours ago

        If it's the municipality holding the data it's even less an issue! Municipalities are exempt from CCPA!

        • windexh8er 16 hours ago

          They aren't. Flock is, so are they? Also, the state I live in has a GDPA that would override CCPA, so it's not exactly that cut and dry as you very well know.

          The Ring example is garbage. You paid for it and it's on your property. Nothing remotely similar.

          I guess then it's OK for Flock to be required installed on your mobile device so they can check your geo history for the last hour and that you don't match the profile of the guy who stole Billy's Huffy bike?

          What happens when the municipalities buy time on your device since, well, you don't own it and have no right to the software running on it because of ToS you agreed to. Or that awesome "save the children from CSAM pedos" bill you cheered for that paved the way for the USG and states to be guaranteed citizen introspection app slots on your device?

          Because the piece of paper says so, it should then just be accepted as is!

          • tptacek 16 hours ago

            I've lost track of what you're talking about. Flock cameras are installed by municipalities. They're physical devices. Nobody is required to have "Flock" installed on their mobile device. That's not a thing.

            • windexh8er 8 hours ago

              I never said it was, the hypothetical was clearly hard to follow.

  • mindslight 1 day ago

    Isn't this just the routine fascist playbook at this point? Start by declaring that the law doesn't even apply to them, on whatever flimsiest of bases.

    Personally I would really like to see torts for attorneys who willfully promulgate blatantly incorrect legal interpretations - they're effectively providing incorrect legal advice. A non-attorney is likely to believe such advice coming from a member of the Bar, and the net goal is to discourage the target from seeking further legal advice.

    • SoftTalker 1 day ago

      An attorney whom you have not engaged in counsel is not providing legal advice.

      • mindslight 1 day ago

        I'm well aware of how it's currently legally defined - hence "effectively". My comment is in the context of what ought, not what is.

      • efreak 4 hours ago

        An expert in any subject matter should not be allowed to provide misleading statements, regardless of whether they're speaking in an official/paid capacity. It doesn't matter if it's considered legal advice or not, the fact that you've got a license and know better should be enough.

  • snowwrestler 1 day ago

    It’s not clear to me that it is actually your data. If I take a picture of you in a public place, I own the picture, not you.

    But maybe I am unclear on how Flock works.

    • bjt 1 day ago

      Setting aside Flock, the "ownership" situation is not as clear as you say above.

      What you own is the image copyright. But the right to copy is only one of the rights at issue.

      Under various state laws (California's in particular), you might not be entitled to do all the things with that picture that you could with one that doesn't include my likeness. Privacy laws like the CCPA are one possible carve-out. A "right of publicity" is another.

      There's an old saying about property law that "property is a bundle of sticks". The bundle can be subdivided.

      https://www.law.cornell.edu/wex/publicity

    • pksebben 1 day ago

      Isn't flock's whole thing that they extract information from the pictures they have?

      Like, say I have an interview in your office and you step out for coffee. I take a picture of the applicant list on your desk. That doesn't make the list of applicants "my data".

      • lotsofpulp 1 day ago

        >I take a picture of the applicant list on your desk. That doesn't make the list of applicants "my data".

        If the list is sitting there out in the open, then yes, it does make it your data.

        • throwway120385 1 day ago

          Well, maybe not. A reasonable person might not think that about that applicant list. I bet you could make a different argument for taking a picture versus memorizing the list too.

          The legal system thrives on specifics of a situation, so simply asserting that the list of applicants is or is not "yours" because you can see it seems like a gross oversimplification. The specifics of how you came to be there, what your relationship with the officeholder is, and so on probably matters a lot in that situation and I think there might be some unwritten rules or social norms that you'd be expected to follow as well.

    • throwway120385 1 day ago

      You also might not be considered a commercial entity under the law. It's a bit more nuanced and the words written in a particular statute have to be interpreted together. So statements about "commercial entities" have to be limited to such entities even if we'd really like to be able to go to our neighbor's house and ask them to delete all of the surveillance they have of our cars driving up and down the street in front of their house. I think these laws are often narrowly written to avoid unintended consequences like de-facto banning private operation of surveillance systems on private property.

  • goodluckchuck 1 day ago

    The CCPA clearly violates the 1st Amendment. If you're out in public, then people are allowed to see you, to remember it, to communicate that it happened, etc.

    • 2bitencryption 19 hours ago

      Not exactly. One can be charged with stalking, even though the offender only went to places in public that the victim also went to. If combined with a pattern of behavior that, in aggregate, infringes upon the rights of the target, it can become a crime.

    • efreak 4 hours ago

      > Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.

      This is the entire text of the first amendment. Congress did not make the CCPA. The first amendment is irrelevant. Technically, the first amendment also does not prevent Congress from saying you're not allowed to remember or see things, though there are likely other laws about this and/or an assumption that Congress will not make laws against thought crime.

  • necovek 1 day ago

    This answer is relevant: https://oag.ca.gov/privacy/ccpa#collapse6d

    In short, Flock is a "service provider" and not the entity doing the recording.

    Perhaps you can make a case that they are a "data broker" instead (https://oag.ca.gov/privacy/ccpa#collapse1i), but that is a separate law, and what you are really looking at is a combination of license plate, time, and location being collected and sold without your consent.

    Obviously, I am not a lawyer (and not even US-based), but I like when privacy is respected :)

  • zbrozek 1 day ago

    I tried the same, got a similar response, and complained to the AG. Nothing.

  • wakamoleguy 1 day ago

    The data ownership is really interesting, as many threads here are going into. I wonder if it's possible to sidestep that entirely, though! Under the CCPA, "personal information" is defined as information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked — directly or indirectly — with a particular consumer or household. That says nothing about ownership.

    To the extent that Flock is only storing the data on behalf of their customers, I'd understand they wouldn't be required to delete it. But to the extent that they are indexing it, deriving from it, aggregating it across customers, and sharing it via their platform, it seems they should be required to remove that data from those services.

    But then again, I am not a lawyer!

  • tgsovlerkhgsel 1 day ago

    Under GDPR, I believe that would be accurate. I think CCPA was to some extent inspired by GDPR so I wouldn't be surprised if they copied this point too.

    Which, hilariously, means that under GDPR, you only need to contact the web site, and they have to go talk to their 1207 partners that value your privacy to fulfill your request (I'm sure that in practice they'll say "sorry it's all 'anonymous' so we can't" or "we can't be sure that it's you even though you provided the identifier from your cookies"). I'm really disappointed that NOYB hasn't started going after web sites like that - that'd quickly put a damper on the whole web surveillance economy.

  • charcircuit 1 day ago

    >It's my data, not their customers'.

    Just because data is about you, that doesn't mean it is your data.

    • john_strinlai 1 day ago

      you may be misunderstanding the california consumer privacy act (ccpa). in the ccpa, personal data is defined as:

      "Personal information is information that identifies, relates to, or could reasonably be linked with you or your household."

      and, you do have the rights set forth in the ccpa (know, delete, correct, limit exposure, etc.) regarding that data.

  • lmkg 1 day ago

    > which seems to directly oppose the CCPA.

    I have some background in data privacy compliance.

    It sounds like they are claiming to be a Service Provider under CCPA, which is similar to a Processor under GDPR. Long story short, a Controller is the one legally responsible for ensuring the rights of the data subject, and a service provider/processor is a "dumb pipe" for a Controller that does what they're told. So IF they are actually a Service Provider, they're correct that the legal responsibility for CCPA belongs to their customers and not them.

    That's a big IF, though.

    Being a Processor/Service Provider means trade-offs. The data you collect isn't yours; you're not allowed to benefit from it. If Flock aggregates data from one customer and sells that aggregate to a different customer, they're no longer just a service provider. They're using data for their own purposes, and cannot claim to be "just" a service provider.

  • jsw97 1 day ago

    You might reach out to the California AG. I suspect they are itching for this kind of thing right now.

    • kstrauser 1 day ago

      Because life is weird, my kid played little league baseball against his.

      I might have to do that.

      • orthecreedence 4 hours ago

        So maybe your kid throws some softball pitches to his kid, and in exchange he opens a quick and easy lawsuit dismantling the surveillance state's ability to operate within CA. Quid pro quo...

  • AlBugdy 23 hours ago

    Read a few of your posts. Just wanted to comment on how I like your to-the-point succinct style and how you care about privacy. :)

    As a suggestion, I saw you have RSS:

    https://honeypot.net/feed.xml

    I didn't see it mentioned in the main page or About or Archive. Maybe add it to a more visible place?

    • kstrauser 22 hours ago

      Aww, thanks! I appreciate that.

      And that's a good point. I'll look at that when I get home.

  • rkagerer 20 hours ago

    It would be revealing to see a judge scrutinize the degree of control Flock maintains over the system and deliberate on whether the company truly is as hands-off as it claims when it comes to privacy obligations.

    Personally I feel if you're going to build so turnkey a system to facilitate collection of personal data by your customers at the scale Flock seeks, then at a minimum you should build an equally turnkey method to handle requests like the one you made. It would be a service both to their customers and to consumers at large who wish to exercise their rights under legislation to opt out.

    The fundamental reason we ended up in a world where companies just pay lip service to privacy and don't take it seriously is because consumers / voters put up with it. I'm heartened when I see individuals keyed into the issue.

    Is there some way to contribute funds toward your legal fees?

    • mixdup 17 hours ago

      it is distinctly against Flock's interests to offer a turnkey system to help people opt out of data collection from among Flock's many customers. It might be a service to the general public, but it certainly would not be a service that Flock's actual customers would generally be interested in, at all

  • MrDrMcCoy 20 hours ago

    You know... you might have some success getting LegalEagle involved, especially since they'd make money on the attention it would bring. Benn Jordan would probably love to be an expert witness.

  • pdpi 19 hours ago

    In GDPR terms, the point they're making is that people who own Flock hardware are the Data Controllers, and Flock act only as Data Processors. I'm not sure how (whether?) those roles map to the CCPA, and whether any court of law would agree with them is up for discussion, but at least the concept is not completely absurd.

    Of course, the word "owner" is almost rage baiting on their part.

    • FireBeyond 18 hours ago

      Except under Flock's own contracts and their own website, Flock are the people who own Flock hardware. And this correlates with my understanding from when I was an employee.

  • cebert 19 hours ago

    Police departments, courts, and similar entities are not subject to CCPA in the same way companies are.

  • laylower 10 hours ago

    Send it to the EFF; they are big in California.

empathy_m 1 day ago

I noticed that the company is glossed as "Flock" and not "Flock Safety (YC S17)" in posts like this and last week's "US cities are axing Flock Safety surveillance technology", https://news.ycombinator.com/item?id=47689237.

Did YC house style change a while back to drop the "(YC xxx)" annotation, either because so many popular firms participate or because it's well known?

  • mikey_p 1 day ago

    Who knows, maybe they're trying to distance themselves from the privacy disaster, but I doubt anyone at YC or HN is smart enough to read the room on Flock.

    • bix6 1 day ago

      Which room? The one paying them millions to spy on people? Cash Rules Everything Around Me.

  • tptacek 22 hours ago

    I see less of it now across the board but note that this headline was almost certainly created by the story's submitter; it's not like there's an automated process to apply the label.

    • FinnKuhn 11 hours ago

      There are definitely some automatic formatting rules for titles though.

      E.g. capitalization or one that removes any "How" from the beginning of a title. When I wanted to submit "How Pizza Tycoon simulated traffic on a 25 MHz CPU" I had to edit it after posting or it would have said "Pizza Tycoon simulated traffic on a 25 MHz CPU", which doesn't make any sense.

      • lionkor 8 hours ago

        How does that not make any sense? It turns into a statement. If it no longer makes sense, the original title was bad clickbait, no?

        • FinnKuhn 5 hours ago

          Titles should be descriptive, and in this example the blog does a deep dive into how the tech actually works. So the content isn't the bare statement that "Pizza Tycoon simulated traffic on a 25 MHz CPU"; it's about how it did so.

          As far as I can tell, the rule is supposed to catch "How to" titles, where it totally makes sense. I guess there's an exception to every rule.

  • 946789987649 13 hours ago

    Likely the person didn't know they were part of YC (I didn't).

wcv 1 day ago

Flock has stonewalled with the "we are not the controllers" excuse here in MN too. We have similar rights to opt-out and delete under the MCDPA [0].

[0] https://ag.state.mn.us/Data-Privacy/Consumer/

  • tptacek 22 hours ago

    They're not stonewalling; they're following the law. Their state and municipal customers would not want them honoring these requests!

dsr_ 1 day ago

Remember that the difference between "Flock can do whatever the hell it wants" and "Flock is required to delete your data at your request" is a law. Citizens vote for legislators. If you want this to be a higher priority for your legislators, buy them off.

Or vote for/against them, that might work too.

tedggh 20 hours ago

“United States v. Jones, 565 U.S. 400 (2012), was a landmark United States Supreme Court case in which the court held that installing a Global Positioning System (GPS) tracking device on a vehicle and using the device to monitor the vehicle's movements constitutes a search under the Fourth Amendment”

https://en.wikipedia.org/wiki/United_States_v._Jones_(2012)

  • a-posteriori 16 hours ago

    I am against Flock as a company.

    Are they "installing a Global Positioning System (GPS) tracking device on a vehicle and using the device to monitor the vehicle's movements"? No.

    • rkomorn 16 hours ago

      If they have sufficient camera coverage to achieve similar tracking of a vehicle within a certain area, I'd say it becomes a distinction without a difference (which I'm guessing is along GP's point).

ldoughty 1 day ago

I think you're going to have a hard time with this...

Flock seems to leave the data in the ownership of the government. They are just providing the service of being custodians who store and provide access to that data.

You would probably get a similar response by submitting your request to Amazon Web Services or Google Cloud or whoever hosts Flock's data: "sorry, we're just holding the data on behalf of Flock."

In either my example case or your stated case, you would have a very hard time convincing the hosting business to destroy its customers' data without a court order, or a court case showing that its policy is invalid and it must comply.

Not a lawyer, just noting the parallel.

I do appreciate that Flock's response says that they cannot use the data they've collected for other purposes, which further reinforces my cloud storage analogy -- the cloud vendor can't look at your data you upload to storage to e.g. build profiles on you/your business.

  • Barbing 1 day ago

    > the cloud vendor can't look at your data you upload to storage to e.g. build profiles on you/your business.

    Would our main check on this be whistleblowers?

hmokiguess 1 day ago

https://www.flocksafety.com/legal/lpr-policy

> In accordance with its Terms and Conditions, Flock Safety may access, use, preserve and/or disclose the LPR data to law enforcement authorities, government officials, and/or third parties, if legally required to do so or if Flock has a good faith belief that such access, use, preservation or disclosure is reasonably necessary to comply with a legal process, enforce the agreement between Flock and the customer, or detect, prevent or otherwise address security, privacy, fraud or technical issues. Additionally, Flock uses a fraction of LPR images (less than one percent), which are stripped of all metadata and identifying information, solely for the purpose of improving Flock Services through machine learning.

In this document, to which they linked in their reply, it says clearly "address ... privacy ... issues."

Does your case not constitute a privacy issue? I would say so.

Continuing down below, their "Trust Us" claim about how they employ machine learning would need some proper transparency into how that can be guaranteed.

  • FireBeyond 1 day ago

    > Continuing down below, their "Trust Us" claim about how they employ machine learning would need some proper transparency into how that can be guaranteed.

    Wait til you see their "Transparency Portal" which, if my county and the neighboring one can be used as a sample, doesn't even name at least 30% of the agencies using Flock.

calmbonsai 1 day ago

Per my understanding of the law for these sorts of data collectors, at least in the U.S., you need to contact the local municipalities (Flock's customers) for this redaction and the jurisprudence is governed at the state and municipal level.

The best source of this information is https://deflock.org/ . FWIW, this is run by a neighbor in Boulder, CO which has been wrestling with the use of these cameras.

  • cousinbryce 1 day ago

    Automating requests to every municipality sure would be fun

gsleblanc 17 hours ago

How did we get to allowing this in the USA? I remember the zeitgeist used to be to make fun of China's mass surveillance / social credit system, and ten years ago proposing to build something like this in the USA would be unthinkable. It's wild that we're just willingly sliding into the same system here too.

  • nichos 17 hours ago

    It's been bad since the patriot act.

  • orthecreedence 4 hours ago

    > ten years ago proposing to build something like this in the USA would be unthinkable

    I think you have your history a bit mixed up. In 2013, Snowden exposed the PRISM program and nobody gave a rat's ass. It was the clear and booming signal that nobody really cares about privacy in the US, and a clear signal that fascist interests have an opportunity to expand. I think Flock would have done really well back then. There is a long, bloody road of futile fighting against the surveillance machine the US has become.

    We love to elevate ourselves above China while engaging in many of the same behaviors (although our version of insidious mass surveillance is privatized, which magically makes it better).

    All that to say, adjust your timeline by a decade or two and your statement is correct again.

barelysapient 1 day ago

If that's a valid excuse, then the CCPA isn't worth the paper it's written on.

  • dylan604 1 day ago

    The rule of any documentation is that it is out of date as soon as the ink is dry. By the time a regulation is enacted, workarounds/loopholes have already been found (if not intentionally worked into it).

  • ldoughty 1 day ago

    I would argue that the request was invalid in the first place.

    If I see a flash from a speed camera operated by a business on behalf of a police department, your argument states I should be able to use the CCPA to force the business to delete my picture and the record of me speeding, if I can get the request to them before the police can file with the court and request that data as evidence.

    The data belongs to the government, and you can't get around that by going to the business that holds the data and asking them to delete it.

    • inetknght 1 day ago

      > If I see a flash from a speed camera operated by a business on behalf of a police department, your argument states I should be able to use the CCPA to force the business to delete my picture and the record of me speeding, if I can get the request to them before the police can file with the court and request that data as evidence.

      Sounds reasonable to me. If the police want to put up a camera, then the police should put up a camera.

      Offloading their legal responsibilities to a third party company is shitty.

      • SoftTalker 1 day ago

        So police departments should have to develop and host all their administrative software also? I think we can all see why that would be a terrible idea. Police are like any other government agency or business in that they contract with the private sector for a variety of services that are not in their area of expertise.

        • inetknght 23 hours ago

          > So police departments should have to develop and host all their administrative software also?

          Yes. We're in a high-technology, information age. Police should be well versed in, and capable of understanding, the technologies and information that people use.

          > I think we can all see why that would be a terrible idea.

          I don't.

          > Police are like any other government agency or business in that they contract with the private sector for a variety of services that are not in their area of expertise.

          Why shouldn't police (or some law enforcement agency) be capable of operating and maintaining law enforcement technologies?

        • MrDrMcCoy 20 hours ago

          Develop, no. Host, yes. They should buy, own, and operate any technology like this on-prem. The only involvement that 3rd-party tech should have is sales, tech support, and maybe blind, encrypted backups accessible only by the municipality.

        • jeroenhd 13 hours ago

          In other countries, police contract companies to develop software, then run and manage the software themselves. Putting up a continental dragnet to sell to government agencies is something I've only heard of from the US.

          Nobody is saying cops should be writing software, but Flock shouldn't have access to the data and analysis tools it has right now. If American police can afford to be armed similarly to a small army, surely they can pay to run a couple of servers in a basement somewhere.

          I'm surprised the USA is letting this happen given the culture of individual freedom that seems to have traditionally driven American laws.

          • inetknght 7 hours ago

            > Nobody is saying cops should be writing software

            I disagree. Businesses have their own internal software development teams.

            Why shouldn't cops?

      • stephbook 1 day ago

        "Hey private prison please delete all data you have about me. And by the way, I'm locked up here by accident. Please release me."

        • jakeydus 1 day ago

          Honestly private prisons are a farce anyways, so yeah this seems valid to me. The government doesn't get to get out of its obligations to citizens by outsourcing to third parties, and third parties don't get to wield government-level authority without government-level accountability.

    • barelysapient 1 day ago

      But we're not talking about speed cameras or a private entity with exclusive contract with the police to provide traffic enforcement.

      We're talking about Flock. A company offering surveillance as a service. Per their website:

      >Trusted by over 12,000 public safety customers including cities, towns, counties, and business partners.

      If Flock's argument holds then most of the CCPA be circumvented this same way. All it takes is a few entities and clever contract language.

    • TheRealPomax 1 day ago

      Except the data does NOT belong to the government; that's the whole point of Flock operating the way it does. It's not governmental data collection, it's data collection by a private company that is then made available to the government upon request. And yeah: Flock is literally allowed to delete data, because again: it's not a government agency, it's just private data, collected by a private company, with the exact same status as you recording a public intersection with a camera from your window.

rz2k 5 hours ago

I don’t really understand this response. I thought the entire business model of Flock was about circumventing the Fourth amendment by posing as a separate vendor selling information it has collected, rather than acting as an agent of the government.

Are they describing third-party entities that sit between Flock and the government end customers when they talk about customers that own the data?

deepsun 1 day ago

If Flock collects and processes PII data, then all their customers are "subprocessors". Flock should really have a Data Processing Agreement with their subprocessors, to legally ensure they follow the same PII handling controls as Flock does.

For example, if Flock receives a legitimate request to delete some data, then Flock must forward that request to all their Data Processors (e.g. including AWS/GCP/Cloudflare) and they must delete it as well.

  • Aaargh20318 1 day ago

    It’s the other way around. Flock is the subprocessor for whoever hired them to collect data. If they are collecting data on behalf a city or municipality, those are the entities you need to address.

    • deepsun 17 hours ago

      I'm not sure about that, I'm pretty sure any company that has your PII is obliged to follow the law, regardless of their contracts with their customers/vendors. Law doesn't make you investigate who's the end customer for your data, only who has it.

      As for "subprocessor" -- it might as well be the case that both sides are subprocessors for each other, nothing wrong with that.

      • Aaargh20318 17 hours ago

        I don’t know this specific law, I just know how it works in the EU with the GDPR. Of course any company that has your PII has to follow the law, but it matters which entity is the end customer for your data. They are the one that has to have a legal basis for even collecting that data, and they are the one you as a user deal with. If they use a sub-contractor then that’s an internal matter for them and not something you as the subject have to deal with. Of course they have to have a DPA in place with the sub-contractor, and they have the responsibility to make sure the sub-contractor follows the law. Likewise the sub-contractor has to make sure that their client has a sound legal basis for processing the PII.

        For example: if a bank outsources part of their KYC process to a third party, that’s not something you have to concern yourself with, you only deal with the bank.

        • deepsun 13 hours ago

          All true, but if the third party receives a delete request from you, they have to comply (and may notify the bank). Otherwise it would be very easy to circumvent the law by saying "oh, we're just keeping it for another customer, we're going to send it to them next year maybe". And that customer will say they need it for another customer, etc.

          Privacy law (in your case the GDPR) is not concerned with whose customer you are. If a company processes PII, they are subject to the privacy laws.

tptacek 20 hours ago

I'm not sure how this request even makes sense. By the logic of the demand made here, a municipality can set up an ALPR system to record evidence of crimes (after the grim failure of Flock's alerting in Oak Park where I live, we rolled Flock back to a configuration where that was all it was used for), and random people can mail Flock to be exempted from that evidence collection.

It's good to want things, and I get why people want this specific thing, but the logic that says this is a viable demand under current law seems like it would prove way too much.

pugworthy 1 day ago

An interesting quandary here is that they'd need to constantly scan for you and your vehicle, etc. so that they could know it was you then delete you. So to ensure they don't observe you, they need to observe you.

rdiddly 1 day ago

Flock's customers own the data the same way Uber drivers are independent contractors, i.e. it's designed for weaseling out of obligations.

_moof 1 day ago

They seem to be implying that because they are a "service provider," they aren't responsible for complying with CCPA rules even though they are the ones with the data.

Does this hold water? I'm reading the CCPA rules now but if anyone knows, it would save me some tedious research.

cold_tom 1 day ago

Feels like a classic “we’re just the processor” answer. But in reality you have no way to find or contact whoever actually controls the data, so it doesn’t really help. It kind of shows the gap between how the law works on paper vs. how these systems work in practice.

pext 23 hours ago

This reminds me of Andrew Yang's "Data Dividend" project, which ideally would have paid end users for their data rather than knowingly giving it away for free. IMO it was a great idea with flawed execution against all the lobbying.

nekusar 1 day ago

The only opt-out the citizenry has is with any of the following:

    2x4
    rebar
    spraypaint
    spray foam
    battery powered metal cutter

And bash those pieces of shit into chunks, or completely ruin the lens and solar panel.

Republican community? They love corporate surveillance. Democrat community? They too love corporate surveillance.

There is no "Peoples' Party" that rejects this garbage.

  • MengerSponge 1 day ago

    https://www.techspot.com/news/108045-lidar-great-cars-but-ca...

    It would be a pity if someone made dense point clouds of these devices.

    • maccam912 1 day ago

      One could start doing tours of their city to show tourists where each and every camera is. They're kinda small though, it might be worth a strong laser pointer so you can direct their attention to the cameras easily...

    • culi 19 hours ago

      It would be a pity if someone created a little laser that aims an infrared beam at these cameras as you drive. The beam would be invisible to the human eye and quite hard for anyone to notice what's going on

  • pugworthy 1 day ago

    All materials available at major home improvement centers - which happen to be very popular Flock camera locations.

  • MrDrMcCoy 19 hours ago

    I have a better idea: we should invest in ladders and tools to safely and cleanly take them down. Then mail them back to flock with a note saying "I found this on the side of the road and thought you might want it back."

atmosx 23 hours ago

Back in 2018, CloudFormation data of mine leaked through a public gist (a misconfigured gist plugin; I thought the gist was private but it wasn't, since I hadn't changed the default config) and showed up on an obscure website being served via Cloudflare. When I contacted CF, they claimed they couldn’t remove the cached content because their system “doesn’t work like that". I pushed back, and then they said that they're not responsible for the content and that I should send another email to abuse@cf... to get data about the hosting provider and deal with the content provider (e.g. VPS, ISP, whatever). After a few back-and-forth messages, I made it clear that if the data wasn’t taken down within a week or so, I would escalate the issue to the local and German GDPR authorities (see https://www.ombudsman.europa.eu/en/european-network-of-ombud...).

And what do you know? I got no reply, but the content disappeared in ~48 hours.

gguncth 23 hours ago

It’s fascinating how America could completely get rid of Flock cameras by sending criminals to prison and leaving them there, but we won’t do that so we have these endless arguments about these cameras.

  • AlotOfReading 23 hours ago

    I'm trying to understand the argument here. Are you saying that never releasing convicted criminals would completely eliminate crime?

    That doesn't seem correct, even leaving aside the obvious moral issues with that.

    • kstrauser 22 hours ago

      My interpretation was that they were saying Flock were criminals who should be sent away for good. I don't know if that's right but it would be consistent that way.

    • gguncth 16 hours ago

      Not never releasing convicted criminals, but 1) not letting any crime go without at least one day in jail and 2) doubling the penalty on each successive offense would incapacitate a huge number of serial offenders.

  • jancsika 21 hours ago

    tldr; holding powerful people accountable is very hard.

    Take Cheney's post-9/11 warrantless wiretapping program. You had Bush's own top DOJ officials threatening to resign over it in 2004, and Jim Risen with a story about it ready to publish in the NYT before the 2004 election. But not only was the White House able to stave off the resignations (IIRC through some tepid FISA oversight of the program), they also got NYT editor Bill Keller to scuttle the story on vague national security grounds. (The NYT reluctantly published it after the election only because Risen threatened to scoop them in his upcoming book.)

    Then in 2008, Obama claimed the need to "look forward, not backward" wrt this and the Iraq War. Plus his admin renewed Bush's subpoena against Risen on another national intelligence story he'd done!

    Any effort to hold Cheney or the Bush administration accountable for this would have had to battle both parties at the same time as educating the public on the issue, without the help of and backing of media institutions like the NYT.

    I'd be fascinated to hear how anyone in America could seriously make the case that such an indictment could ever be achieved. Even now, decades after the fact when the base of both parties has absolutely nothing but disdain for people like Dick Cheney. But that's just one old example out of many-- current ones obviously are harder since people currently in power tend to be implicated.

mmmlinux 1 day ago

Lot of Flock Defenders in here.

  • pwython 1 day ago

    Not Flock defenders, just people explaining how this is not a CCPA violation. I could set up 100 cameras around town (with the property owners' permission) and record cars driving by, birds, etc. all day. Then I could sell access to that footage to whoever I want. If they want to scrape license plates, that's up to the customer and their problem. Or if they want to track birds, cool, that could be in the frame too.

    • kstrauser 1 day ago

      It gets a little weird when you explicitly market them for a purpose, though. Flock doesn't advertise a fleet of cameras suitable for birdwatching or other random activities. They market them specifically for the collection and processing of PII.

      By analogy, Google Docs isn't marketed for healthcare use. If you wanted, you could put a bunch of PHI in a Google doc and it wouldn't be their responsibility. They certainly didn't tell you to do that. However, if they marketed Google Docs as a great place to store PHI, yeah, then suddenly they're on the hook for complying with the relevant laws like HIPAA.

      (Although in this case Google will sign a HIPAA business associate agreement with you and voluntarily agree to comply. They still don't market it that way, or at least don't predominantly do so.)

sklargh 1 day ago

The concept of what constitutes a sale under CCPA is pretty expansive. An exchange of value can be a sale that occurs outside of a processing relationship. I’d say their note is inaccurate.

waterproof 14 hours ago

Happy to contribute to your lawyer fund. I want to see how this plays out.

CSMastermind 17 hours ago

We truly need a constitutional amendment granting a formal right to privacy.

kube-system 1 day ago

I don't think they need your permission to use ALPR on your publicly displayed license plate.

> (2) (A) “Personal information” does not include publicly available information [...]

> (B) (i) For purposes of this paragraph, “publicly available” means any of the following:

> (I) Information that is lawfully made available from federal, state, or local government records.

> (II) Information that a business has a reasonable basis to believe is lawfully made available to the general public by the consumer

  • _moof 1 day ago

    The information being collected isn't your license plate, it's your location. (Still might not be personal information.)

lacker 1 day ago

Isn't that how it should work?

If you write the police and ask them to delete all their data about you, that isn't a thing that they do. It shouldn't matter if the police store their data on AWS or their own servers.

Flock is a tool used by the police so it should work the same way.

rbbydotdev 23 hours ago

it would be nice if flock did not and could not exist

carabiner 1 day ago

It's not much worse than all the tracking adtech used by FAANG industry. Smartest people in the world working on these systems.

  • kstrauser 1 day ago

    I'd contend that it absolutely is. Adtech is creepy and invasive and weird. Flock is going a step further and actively tracking our movement through the cities where we live.

    I don't like either of those activities, but I think one of them is much worse.

    • jakeydus 1 day ago

      One is making implicit assumptions based on data available to it. The other is literally saying "hey they're right here at this time". At least adtech has _some_ level of obfuscation to it.

      But I'm with you both suck.

annoyingnoob 1 day ago

I've had the same kind of response from email providers like SendGrid; they claim it's not their data. There is no way to have SendGrid block you across their entire network, you have to play whack-a-mole with their customers. Seems like a flaw in these privacy laws when you can't ask the actual record holder to remove the records.

ranger_danger 1 day ago

To me this sounds like the equivalent of visiting a website that sells your data, and then asking AWS to delete your personal data when it actually belongs to a customer of theirs and only resides within their private storage.

Would you ask your local ISP to delete data they provided to Tinder like your IP address? That doesn't make sense to me.

  • terrabitz 1 day ago

    Yeah, I was getting the same feeling. I wonder if an equivalent request to California police agencies that contract with Flock would work, though.

    • OkayPhysicist 1 day ago

      Probably not, as the law enforcement agencies get a bunch of exceptions to the CCPA.

  • monooso 1 day ago

    As I understand it, the author wrote to Flock as they are the entity collecting the PII. Your analogy would only make sense if the author had written to Flock's customers (and even then it's a rather strained comparison).

    • ranger_danger 1 day ago

      > they are the entity collecting the PII

      I'm not convinced this is the case. It might be equipment made by them, but does that necessarily mean they were ever even in possession of the data in question?

      Would you ask the manufacturer of your oven what you ate for dinner last week? No, you're just using an appliance that they made.

      In the case of Flock I don't think we have any evidence of whether Flock themselves ever hold or store any data produced by their devices when operated by a customer.

  • alt227 1 day ago

    Yes, I have asked multiple companies to destroy my data under the GDPR. It's quite common in Europe.