1984: A Surveillance State
Where Is the Line?
Surveillance, Technology, and the Death of “Public”
Your phone leaks Bluetooth and Wi-Fi signals constantly. Flock Safety cameras automatically capture every license plate, vehicle make/model/color, direction of travel, and timestamp as vehicles pass. Private companies and police departments feed this into massive, searchable databases that log movements over months or years. Ring doorbells, traffic cams, and even vehicle telematics add layers. All of it gets aggregated, queried retroactively, and used to build profiles.
The old polite debate was: “Is license plate data public if you're on a public road?”
The brutal reality now: When does automated, persistent, aggregated observation cross into accusation—and why does the system so easily shift the burden of proof onto the accused citizen?
The Legal Foundation: Expectation of Privacy (That's Been Hollowed Out)
It began with Katz v. United States (1967): the Fourth Amendment protects wherever someone has a “reasonable expectation of privacy.”
In 1967, surveillance meant a single officer with limited tools and short-term observation. Today it's thousands of always-on cameras, AI pattern recognition, indefinite data retention, and instant cross-referencing. The legal test hasn't kept pace.
The Infrastructure Problem: It's One Giant Behavioral Black Box
Stop thinking of Flock cameras, Ring doorbells, Bluetooth sniffers, or cell-tower pings as isolated gadgets. They're feeds into unified systems that create:
- Detailed movement timelines (where you went, when, how often)
- Behavioral inference (routine routes, deviations, associations)
- Proximity-based guilt (you drove near a crime scene → you're a person of interest)
- Retroactive fishing expeditions (query any plate for any reason later)
Individually, each ping is trivial. Together, it's a near-total reconstruction of your life—without a warrant, without notice, and often without meaningful oversight.
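Pipelines like this are trivial to build, which is part of the problem. A minimal sketch, using invented plate reads and camera names (nothing here comes from any real system), of how scattered ALPR records collapse into a per-vehicle timeline:

```python
from datetime import datetime

# Hypothetical ALPR reads: (plate, camera_location, ISO timestamp).
# All plates, locations, and times are invented for illustration.
reads = [
    ("ABC1234", "Main & 5th",  "2025-09-27T08:02:00"),
    ("XYZ9876", "Oak & 2nd",   "2025-09-27T08:05:00"),
    ("ABC1234", "Oak & 2nd",   "2025-09-27T08:11:00"),
    ("ABC1234", "Hwy 85 ramp", "2025-09-27T17:40:00"),
]

def timeline(plate, reads):
    """Reconstruct one vehicle's movements from scattered reads."""
    hits = [(datetime.fromisoformat(ts), loc)
            for p, loc, ts in reads if p == plate]
    return sorted(hits)  # chronological order = a travel narrative

for when, where in timeline("ABC1234", reads):
    print(when.isoformat(), where)
```

A filter and a sort are all it takes to turn isolated "trivial" pings into a narrative of someone's day—which is exactly why retention and query rules matter more than any single camera.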
The Snowden Moment Was the Warning Shot
In 2013, Edward Snowden revealed bulk collection: grab everything, store it forever, search it when needed. Governments did it secretly. Now local police, HOAs, and private firms do it openly—and market it as “crime-solving efficiency.” The philosophy is identical: Collect it all now. Figure out who to accuse later.
When the System Accuses the Wrong Person — and Forces Them to Prove Innocence
Late September 2025 — Bow Mar and Columbine Valley, Colorado. A $25 package vanishes from a front porch. Residents report it. Local police query Flock Safety's network of automated license plate readers (ALPRs). The system flags a green Rivian truck that passed through the small town of Bow Mar around the time of the theft.
Columbine Valley Police Sgt. Jamie Milliman shows up at Chrisanna Elser's Denver home on September 27, summons already prepared. Bodycam and her Ring doorbell capture the entire confrontation.
Milliman opens aggressively: “You know why I'm here.” He tells her the town has “cameras everywhere” — “You can't get a breath of fresh air in or out without us knowing, correct?” He shows partial Flock data: her Rivian entering and exiting Bow Mar. He pairs it with Ring doorbell video from the victim's house showing a blonde woman taking the package. Same general hair color as Elser. To him, that's conclusive: “We have video of you stealing the package.” He declares it a “lock” and “no doubt” in his mind.
Elser immediately denies it and offers to show evidence. Milliman refuses to view anything she has then and there: “If you're going to deny it, I'm not going to extend you any courtesy.” He tells her to bring proof to court if she wants. He issues the citation on the spot—flipping the entire presumption of innocence.
Elser spends the next days and weeks doing police work the system should have done:
- Pulls phone location data showing her route
- Confirms a timestamped tailor appointment
- Exports footage from her Rivian's built-in Road Cams (multiple angles, continuous recording), proving her truck never stopped near the house, never approached the porch
The real thief? A different blonde woman—visible differences in face shape, nose, apparent age when the full Ring video is examined (which police had but didn't scrutinize deeply enough). Elser's vehicle was merely passing through town; Flock hit on proximity and silhouette match, not actual evidence of crime.
After weeks of Elser submitting her own evidence and pushing back, the Columbine Valley PD finally emails her: “nicely done btw,” and confirms the summons won't be filed. The officer faces unspecified disciplinary action (extra training mentioned in some reports) and possible internal review. No formal apology to Elser has been issued despite her requests via call, email, and in-person visit to the station. The department later posted on Nextdoor asking for tips about a “suspect” green Rivian—clearly referencing her vehicle—even after she was cleared.
Sources:
- Colorado Sun (Nov 2025) – Officer discipline
- CBS Colorado – Detailed confrontation and quotes
- Electrek – Rivian footage exoneration
- 9News Denver – Elser's own words on burden flip
This isn't rare. Flock systems rely on plate reads + vehicle attributes (color, make, sometimes silhouette AI), not foolproof identification. Misreads happen (e.g., "7" as "2", "H" as "M" in other documented cases). Proximity hits lead to aggressive policing. Innocent people get door-knocked, cited, publicly named in neighborhood apps, and forced to burn time/money clearing names—while the system faces almost no downside for bad hits.
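The misread problem can be made concrete. A toy sketch, using an illustrative confusion table (not any vendor's actual error model), of how many plates sit a single OCR error away from one read:

```python
# Why an ALPR "hit" isn't proof: OCR engines confuse visually similar
# characters. These confusion pairs are illustrative, not vendor data.
CONFUSABLE = {"7": "2", "2": "7", "H": "M", "M": "H",
              "B": "8", "8": "B", "0": "O", "O": "0"}

def one_misread_variants(plate):
    """All plates reachable by misreading exactly one character."""
    variants = set()
    for i, ch in enumerate(plate):
        if ch in CONFUSABLE:
            variants.add(plate[:i] + CONFUSABLE[ch] + plate[i + 1:])
    return variants

# A hypothetical plate: five of its seven characters are confusable,
# so five other vehicles could be "matched" by one misread each.
print(one_misread_variants("7HB0124"))
```

Every variant in that set belongs to a potentially different, entirely innocent driver—and the system surfaces them all with the same confidence.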
The Mosaic Effect — Legal Fiction vs. Real-World Power
- One plate scan → low-value data point
- Dozens/hundreds → clear travel patterns
- Thousands over time → intimate life model (home, work, habits, relationships)
- One partial match + blurry video → “We have video of you committing the crime” → prove you're innocent
Courts still often treat each scan as isolated (no warrant needed). Databases treat them as cumulative intelligence. The gap produces cases like Elser's.
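The mosaic effect is, mechanically, just counting. A small sketch with invented reads showing how bare repetition across cameras exposes routine, while a rare hit flags a one-off—and potentially sensitive—visit:

```python
from collections import Counter

# Invented reads for one vehicle: (camera_location, ISO timestamp).
reads = [
    ("Elm St cam", "2025-09-01T07:55:00"), ("Elm St cam", "2025-09-02T07:58:00"),
    ("Elm St cam", "2025-09-03T08:01:00"), ("Elm St cam", "2025-09-01T18:20:00"),
    ("Elm St cam", "2025-09-02T18:25:00"),
    ("Clinic cam", "2025-09-04T12:30:00"),
]

def routine(reads):
    """Count sightings per camera: repetition reveals routine."""
    return Counter(loc for loc, _ in reads)

counts = routine(reads)
# The most frequent camera likely sits on a daily commute route;
# the rare hit (a clinic) stands out precisely because it is rare.
print(counts.most_common())
```

Each individual read is the "low-value data point" courts wave through; the `Counter` is the intimate life model the database actually builds.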
The Double Standard in Stark Relief
| Actor | Action | Consequence |
|---|---|---|
| Private citizen | Tracks someone's movements persistently | Stalking felony, restraining order, jail time |
| Police + Flock network | Tracks thousands persistently, issues citations on weak/partial matches | “Public safety tool” — accused citizen foots bill to disprove |
Real-World Concerns — With Organizations Sounding the Alarm for Years
- ACLU on ALPR risks: mass tracking of innocents, chilling effects, data sharing abuses
- EFF on Flock/ALPR: warrantless dragnet surveillance, error rates, mission creep
Elser's case crystallizes the cost: an innocent person publicly accused, aggressively confronted, handed a summons, forced into detective work—all because incomplete data + overconfident policing = “good enough” for accusation.
Texas Law: Targets Individuals, Ignores the Machines
- Texas Penal Code §16.06 – Unlawful Installation of Tracking Device
- §39.06 – Misuse of Official Information
- §42.072 – Stalking
These bite hard on private stalkers or rogue cops. They barely touch mass Flock deployments rolling out in Texas suburbs, HOAs, and PDs—systems doing the same persistent tracking at scale, with far less accountability.
Orwell Was an Optimist — This Is Softer, More Participatory Tyranny
1984 featured overt force, constant propaganda, visible oppression. Today's version is opt-in convenience: you buy the Rivian with cameras, install Ring for “peace of mind,” accept “public safety” Flock poles. When it misfires and accuses you, the response is: prove it didn't happen. We didn't get forced—we subscribed, clicked Agree, and drove right in.
The Power Asymmetry Is Deliberate Design
| Capability | Who Has It |
|---|---|
| Mass, persistent tracking + indefinite storage | Police, Flock, data brokers |
| Legal immunity, qualified protection, no liability for errors | Institutions & contractors |
| Burden to disprove false positives | You—the driver, the resident, the citizen |
The Uncomfortable Truth
Modern surveillance doesn't just observe—it remembers, correlates, predicts, and accuses with terrifying speed and certainty. The legal system clings to 1967 rules. The tech vendors and agencies operate in 2026 reality. When they get it wrong, you're the one who pays—in time, stress, reputation, and sometimes legal fees—to set the record straight.
What Are We Actually Willing to Accept?
For “safety”? For “package recovery”? For “efficiency”?
At the price of: routine false accusations, presumption-of-guilt policing, permanent digital dossiers, and the slow erosion of being treated as innocent until proven guilty.
Real Leverage Points Still Exist
- Courts applying Carpenter-style mosaic analysis to Flock/ALPR networks
- Local bans or strict limits on Flock-style systems (data retention caps, warrant requirements for queries)
- FOIA/public records exposing usage logs, error rates, sharing practices
- Enough people refusing the next “free” or “safety” camera installation
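Of the levers above, retention caps are the most mechanically simple to enforce. A hedged sketch, with hypothetical data and a made-up 30-day cap, of what purging reads past a fixed window looks like:

```python
from datetime import datetime, timedelta

# Hypothetical retention cap: reads older than the window are purged
# instead of accumulating into an indefinite dossier.
RETENTION_DAYS = 30

def purge(reads, now):
    """Keep only reads newer than the retention window."""
    cutoff = now - timedelta(days=RETENTION_DAYS)
    return [(plate, ts) for plate, ts in reads if ts >= cutoff]

now = datetime(2025, 11, 1)
reads = [("ABC1234", datetime(2025, 10, 25)),   # recent: kept
         ("ABC1234", datetime(2025, 8, 1))]     # stale: purged
print(purge(reads, now))
```

The hard part isn't the code—it's writing the ordinance that forces vendors to run it and auditing that they actually do.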
Final Thought
We didn't build tools that merely watch the world.
We built tools that track it exhaustively, analyze it preemptively, accuse on partial data—and then demand you disprove the machine's hunch.
The surveillance is here. The question isn't if it exists.
It's how many ruined days, reversed burdens, and eroded freedoms we're prepared to endure before we demand better.
Disclaimer: Informational only — not legal advice. All case details drawn from publicly reported mainstream sources linked above.