That evening in Reno, Jason Killinger had no intention of making legal history. All he wanted was a break. The Peppermill Casino seemed like the place to unwind after long hours driving a UPS rig: familiar and anonymous. Instead, a few minutes inside led to eleven painful hours in custody and a lawsuit that now challenges law enforcement's use of AI.
Killinger was flagged by a facial recognition system, one built to identify banned patrons, as a 99.9% match to a man who had been barred for sleeping on the casino floor. That single match, likely triggered by similar facial features, set off a chain of events that grew harder to undo with each passing hour.
| Item | Detail |
|---|---|
| Name | Jason Killinger |
| Profession | Long-haul truck driver (UPS) |
| Incident Location | Peppermill Casino, Reno, Nevada |
| Date of Incident | September 17, 2023 |
| Core Allegation | Wrongful arrest after false AI facial recognition match |
| Legal Action | Civil rights lawsuit against Officer R. Jager (Reno PD) |
| Lawsuit Claims | False imprisonment, malicious prosecution, fabrication of evidence |
| Case Status | Ongoing in U.S. District Court, Nevada (Case No. 3:2025cv00388) |
| Reference Link | https://www.casino.org/news/video-peppermill-casino-facial-recognition-wrongful-arrest-bodycam-footage-released/ |
Casino security responded to the alert and took him into custody right away. Officer Richard Jager of the Reno Police Department arrived moments later. What followed was neither violent nor noisy. By Killinger's account, it was quiet, procedural, and deeply flawed.
Killinger presented his Nevada Real ID. Then his commercial driver's license. A UPS pay stub. His vehicle registration. Even a union card. Every document bore the same name: Jason Killinger. Yet all of them were dismissed as possibly falsified. Jager later stated that he believed Killinger had a DMV connection who could produce phony identification. On bodycam footage, that assertion was made casually, but it soon hardened into a justification for the arrest.
What should have been a quick misunderstanding instead kept Killinger in handcuffs for eleven hours. His identity was eventually verified by fingerprinting, which should have ended the matter. The lawsuit contends, however, that the damage was already done, and that the subsequent paperwork, including official statements, arrest notes, and internal files, painted a picture inconsistent with what Killinger actually experienced.
What makes this case so striking is how it combines technology, suspicion, and human fallibility. The AI system did not merely suggest a possible match; it declared near certainty. And that confidence, earned or not, seemed to outweigh the evidence standing in front of the officer: a man with matching credentials and more paperwork than most people would ever think to carry.
According to Jager’s report, Killinger had “conflicting identification,” even though all of his IDs pointed to the same individual. Additionally, the fingerprint confirmation that cleared him was left out. According to the lawsuit, this was not merely an oversight but rather a fabrication meant to support an arrest once the officer’s presumptions started to fall apart.
In court filings, Killinger argues that both the arrest and the suppression of exculpatory evidence violated his constitutional rights. His legal team contends that this reflects a broader problem of officers over-relying on AI, disregarding real-world inconsistencies in favor of digital certainty. For communities experimenting with smart policing tools, that question is especially urgent.
Facial recognition systems are praised for their performance in controlled settings, and they are prized in venues like casinos, where thousands of faces pass through every day. But as this lawsuit shows, deploying them in enforcement contexts without adequate oversight carries serious risks.
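A "match" in systems like these is typically a similarity score between face embeddings, compared against a threshold. The sketch below uses made-up four-dimensional vectors and an illustrative threshold (real systems use hundreds of dimensions and their own operating points) to show how two different people with similar features can still clear a very high bar:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings: two different people with similar facial
# features can land very close together in embedding space.
banned_patron = [0.61, 0.33, 0.72, 0.05]
visitor       = [0.60, 0.35, 0.71, 0.06]

score = cosine_similarity(banned_patron, visitor)
THRESHOLD = 0.95  # illustrative operating point, not any vendor's real setting

if score >= THRESHOLD:
    # The alert fires even though these are two different people:
    # a high score measures resemblance, not identity.
    print(f"ALERT: {score:.3f} match")
```

The point of the sketch is that the score quantifies resemblance, not identity, so a "99.9% match" between lookalikes is entirely possible, which is why such alerts are meant to prompt human verification rather than replace it.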
Even Killinger’s eye color differs from that of the man the system mistook him for: one hazel, the other blue. Their heights differ by four inches. Yet those differences were brushed aside. Jager’s remark, captured on bodycam, is especially telling: “Hazel and blue eyes are very similar colors.” It inadvertently illustrates how quickly real identifiers can be discarded once technology has supplied a conclusion.
Long after the footage ends, one thought lingers: how a single, seemingly objective digital system can cement a small human assumption.
Peppermill Casino quietly settled its portion of the case for an undisclosed sum. Killinger’s lawsuit against the Reno Police, however, is still pending. The civil rights suit names Officer Jager specifically and alleges evidence fabrication, false imprisonment, and malicious prosecution. The city denies any wrongdoing, emphasizing that officers are trained to be skeptical when they encounter possible fraud. That training may be sound in many situations, but applied rigidly, it appears to have failed Killinger.
The lawsuit also accuses Deputy City Attorney Jill Drake of pursuing prosecution after the evidence had exonerated Killinger. His legal team argues that these actions reveal an institutional failure to change course: once the system commits to a path, it struggles to retrace its steps.
Many people who are following this case find that the legal issues are intertwined with practical ones. How can we strike a balance between the accountability of human oversight and the promise of automation? Will technology ever be able to completely replace the subtleties of in-person judgment, particularly when freedom is at risk?
Killinger’s case does not call for abandoning technology. It argues instead for more responsible integration, acknowledging that AI tools, however capable, are not immune to the biases or blind spots of their users. Systems built on pattern recognition must leave room for independent human judgment.
The federal case is drawing attention well beyond Reno. Civil rights advocates and legal experts are watching closely, both for the outcome and for the precedent it may set. If Killinger’s allegations are proven, they could shape how facial recognition software is used and audited by government agencies nationwide.
For now, the lawsuit continues, with scheduled filings, a discovery phase, and the possibility of a trial. Whatever its outcome, it has already started a worthwhile conversation. The question is not whether AI is good or bad, but how systems that serve justice must remain flexible enough to acknowledge their mistakes.
Perhaps most importantly, a man’s name, established identity, and dignity shouldn’t be so readily disregarded, particularly when the truth is in his own hands.

