Category Archives: Technology

Can Artificial Intelligence Lead to Wrongful Convictions?

Image: (Kathleen Crosby/Innocence Project)


The Innocence Project published an insightful article describing how AI-based surveillance systems lack independent verification, empirical testing, and error-rate data. These shortcomings lead to wrongful arrests and, potentially, wrongful convictions. More worrisome is the disturbing readiness among some system actors, especially prosecutors, to accept AI-based evidence at face value. That eager acceptance mirrors the flawed embrace of misapplied forensic science, which has contributed to numerous wrongful convictions.

BACKGROUND

The use of unreliable forensic science has been identified as a contributing factor in nearly 30% of all 3,500+ exonerations nationwide. Take bite mark analysis, for example. The practice was widely used in criminal trials in the 1970s and 1980s, yet it is poorly validated: it does not adhere to scientific standards, lacks established protocols for analysis and known error rates, and relies on presumptive tests. It has since been discredited as unreliable and ruled inadmissible in criminal trials. Still, there have been at least 24 known wrongful convictions based on this unvalidated science in the modern era.

ADMITTING SCIENCE-BASED EVIDENCE 

The 1923 Frye v. United States decision introduced the “general acceptance” standard for admissibility at trial. In short, a scientific technique must have expert recognition, reliability, and relevance in the scientific community to be “generally accepted” as evidence in court. Some state courts still apply this standard today. Later, the 1993 Daubert v. Merrell Dow Pharmaceuticals, Inc. decision shifted the focus to evaluating the relevance and reliability of expert testimony to determine whether it is admissible in court.

In applying the Daubert standard, a court considers five factors to determine whether the expert’s methodology is valid:

  • Whether the technique or theory in question can be, and has been, tested;
  • Whether it has been subjected to publication and peer review;
  • Its known or potential error rate;
  • The existence and maintenance of standards controlling its operation; and
  • Whether it has attracted widespread acceptance within a relevant scientific community.

Under Daubert and Frye, much AI technology, as currently deployed, doesn’t meet the standard for admissibility. ShotSpotter, for example, is known to alert on non-gunfire sounds and often sends police to locations where they find no evidence that gunfire even occurred. It can also “significantly” mislocate incidents by as much as one mile. It therefore should not be admissible in court.

Similarly, facial recognition technology’s susceptibility to subjective human decisions raises serious concerns about the technology’s admissibility in court. Such decisions, which empirical testing doesn’t account for, can compromise the technology’s accuracy and reliability. Research has already shown, for instance, that many facial recognition algorithms are less accurate for women and people of color, because they were developed using photo databases that disproportionately include white men.

My opinion? If we are to prevent a repeat of the injustices we’ve seen in the past from the use of flawed and untested forensic science, we must tighten up the system. Too many investigative and surveillance technologies remain unregulated in the United States.

Please contact my office if you, a friend or family member are charged with a crime. Hiring an effective and competent defense attorney is the first and best step toward justice.

State v. Ortega: Court Upholds Forensic Search of Defendant’s Cell Phone Using “Cellebrite Touch” Software


In State v. Ortega, the WA Court of Appeals held that police officers executing a search warrant for an electronic device are not exceeding the scope of the warrant by manually searching through all the images on a device to find and seize images depicting specific content.

FACTUAL BACKGROUND

Mr. Ortega was investigated for sex offenses against his girlfriend’s children. Police believed Mr. Ortega’s cell phone probably contained evidence of the crimes with which he was charged. They obtained possession of the cell phone from a family member, who voluntarily turned it over to police. The court granted the police’s request for a search warrant. Pursuant to the warrant, police searched the phone and seized 35 images, many of which were incriminating.

Mr. Ortega moved to suppress the fruits of the cell phone search. He argued that the warrant was insufficiently particular, in violation of the state and federal constitutions. At his suppression hearing, officers testified they began the search by connecting Mr. Ortega’s phone to an extraction device known as the “Cellebrite Touch.” They ran an extraction that allowed the files on Mr. Ortega’s phone to be organized into categories (for example, messages, images, etc.). Once extracted, data is not visible unless someone opens the individual category folders through Cellebrite’s physical analyzer program.

After the data extraction, police produced a thumb drive containing more than 5,000 extracted images. One officer testified it was similar to being given a physical photo album and having to flip through the pages to find what you are looking for.
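The categorization step the officers described can be sketched in a few lines. This is an illustrative sketch only, not Cellebrite’s actual software: it mimics the workflow from the testimony, where an extraction groups files into category folders (images, messages, and so on) and nothing is visible until an examiner opens a folder. The extension-to-category mapping and file paths are hypothetical.

```python
# Illustrative sketch -- NOT Cellebrite's code. Groups extracted file paths
# into closed "category folders" the way the physical analyzer testimony
# describes. The extension map below is a made-up stand-in.
from collections import defaultdict
from pathlib import PurePosixPath

CATEGORIES = {
    ".jpg": "images", ".png": "images", ".heic": "images",
    ".mp4": "videos", ".mov": "videos",
    ".db": "messages", ".txt": "messages",
}

def organize(file_paths):
    """Sort each extracted path into a category folder by file extension."""
    folders = defaultdict(list)
    for path in file_paths:
        category = CATEGORIES.get(PurePosixPath(path).suffix.lower(), "other")
        folders[category].append(path)
    return dict(folders)

extraction = ["DCIM/IMG_0001.jpg", "sms/chat.db", "DCIM/clip.mp4"]
print(organize(extraction))
# {'images': ['DCIM/IMG_0001.jpg'], 'messages': ['sms/chat.db'], 'videos': ['DCIM/clip.mp4']}
```

Under a scheme like this, examiners can open only the “images” and “videos” folders, which is the scope-limiting point the COA later relies on.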

The trial court denied Mr. Ortega’s motion to suppress the images seized from his cell phone. Mr. Ortega subsequently waived his right to a jury trial and his case was tried to the bench. The court found Mr. Ortega guilty as charged. Mr. Ortega timely appealed on arguments that the State’s case was tainted by evidence seized during an unconstitutional cell phone search.

COURT’S ANALYSIS & CONCLUSIONS

1. The Search Warrant Passed the “Particularity Requirement.”

The Court of Appeals (COA) began by explaining that both the Fourth Amendment and the Washington Constitution require that a search warrant describe with particularity the place to be searched and the persons or things to be seized. The particularity requirement, which aims to prevent generalized rummaging through a suspect’s private affairs, is of heightened importance in the cell phone context. This is because of the vast amount of sensitive data contained on the average user’s smartphone device. The purposes of the particularity requirement are to prevent a general search, limit the discretion of executing officers, and ensure that items to be searched or seized are supported by probable cause, said the COA.

Applying these principles, the COA reasoned that the warrant satisfied the particularity requirement. It directed officers to search the phone and seize images and/or videos depicting Mr. Ortega engaged in sexual contact with minors.

“This did not permit a general rummaging; it was akin to a warrant allowing a search of a residence for controlled substances and indicia of ownership.” ~WA Court of Appeals

2. Officers Did Not Exceed the Scope of the Warrant.

The COA explained that the scope of a search can be limited by identifying targeted content. When a warrant authorizes a search for a particular item, the scope of the search “generally extends to the entire area in which the object of the search may be found.”

The COA reasoned that police properly limited the scope of their search to the terms of the warrant. The incriminating images could have been located almost anywhere on Mr. Ortega’s cell phone—not only in a photos application, but also in e-mails and text messages.

Furthermore, had the detectives chosen to search Mr. Ortega’s phone manually, they likely would have needed to sort through data other than images in order to find the targets of their search. And they would have risked jeopardizing the evidentiary integrity of the phone. By instead using forensic software, the detectives were able to organize the data from Mr. Ortega’s phone without first viewing the phone’s contents. This enabled them to limit their search to data labeled as photos and videos, thus restricting the scope of the search to areas where the target of the search could be found.

“By using forensic software to extract and organize data from Mr. Ortega’s phone, the detectives were able to minimize their review of the phone contents and tailor their search to the evidence authorized by the warrant. This did not violate Mr. Ortega’s constitutional rights.” ~WA Court of Appeals

With that, the COA denied Mr. Ortega’s appeal and upheld his convictions.

Please review my Search & Seizure Legal Guide and contact my office if you, a friend or family member are charged with a crime. Hiring an effective and competent defense attorney is the first and best step toward justice.

Can Police Access Your Home Security Cameras?


Photo Credit: Reviewed / Tara Jacoby

Home security systems are an excellent way to protect your loved ones and belongings from unwanted intruders. A sophisticated security setup offers a sense of control and vigilance, letting you focus on the moments that truly matter. Privacy is a priority for most homeowners investing in smart home security devices, especially when it comes to worries about hacking or data theft.

Although beneficial, these devices raise other concerns. Can law enforcement legally capture and/or review your home surveillance video footage whenever they want? Would you even know if they did?

REQUESTING CLOUD VIDEO UNDER THE “EXIGENT CIRCUMSTANCES” EXCEPTION TO THE SEARCH WARRANT REQUIREMENT

First, law enforcement may request cloud video footage in case of an emergency, better known as “Exigent Circumstances.” Here an “emergency” typically means a life-or-death situation or something else high-stakes, such as a kidnapping or a manhunt for a violent criminal.

Most security companies that offer video storage in North America will obey these emergency requests. Google Nest, for example, has published an explanation of how it handles sharing user data with law enforcement. It also explains how the company may try to narrow the scope of a request to protect user privacy, and how it may or may not let users know about the request. In other words, security users may never learn that police accessed their cloud videos.

“Before complying with a request, we make sure it follows the law and Nest’s policies,” the company says. “We notify users about legal demands, when appropriate, unless prohibited by law or court order. And if we think a request is overly broad, we’ll seek to narrow it.” ~Google Nest

In these situations, law enforcement contacts the cloud video management organization directly (usually your security brand like Arlo or Ring), and requests specific video footage from an area through channels set up to allow for such requests.

SEEKING A WARRANT FOR HOME SECURITY DEVICES

Another option police have to seize cam footage is via a warrant or similar court order. Warrants allow police to take home security devices and examine them, including any local storage that you have, so avoiding cloud storage won’t help very much.

Typically, warrants are granted only when police can provide some evidence that a crime may have been committed on the property. It depends on the court and judge where the warrant is requested, but granting warrants is common. The warrant then becomes active and has a specific scope for where and what it applies to (which is why you should always ask to view a warrant if law enforcement wants your security cameras).

Warrants raise a further important question: Will you get your home cam back if it’s seized during a legal search? That’s a subject of some deliberation, although it’s generally agreed from cases like these that the Fourth Amendment prevents law enforcement from holding onto digital devices or data indefinitely. Getting your camera back during a real-world seizure may not be so cut and dried.

REGISTERING SURVEILLANCE EQUIPMENT WITH LAW ENFORCEMENT AGENCIES

There’s an interesting third option for law enforcement that’s been growing in popularity, especially in certain cities and states where police departments are looking to tap into smart home tech. Home security owners can register their cameras and similar devices with local police departments, letting them know there is a device at a specific property that’s recording. We’re seeing programs like this everywhere from Buffalo, New York’s SafeCam to the Bay Area in California.

These programs vary, but there are several important points. First, this isn’t the same thing as registering an alarm system via a local permit; it’s specifically for video recording devices. Second, registering does not mean police can look through your cams or view any recorded footage. It simply tells them where registered residential cameras are, so they can request footage directly from participants whose cameras are near a crime scene.

Finally, if you do grant permission to police to access a registered camera, they’ll be able to view and copy video images, which can be used as evidence in a criminal proceeding. Often, registration programs have requirements like banning you from sharing video with the media and other fine print. Keep in mind, police may still be able to seek a warrant to take cams and video footage if you deny a request via a registration program.

POSTING HOME SECURITY FOOTAGE ONLINE

A number of security brands offer ways to post videos online through things like the Ring Neighbors app, dedicated forums, social media groups and so on. If you post a video in a public space like this, even if you’re only asking for advice, then it’s fair game for law enforcement to use as well. Just this year, however, Ring decided to end its more liberal sharing program with police, limiting them to the life-or-death requests discussed above.

Please review my Search & Seizure Legal Guide and contact my office if you, a friend or family member are charged with a crime involving home security footage. Hiring an effective and competent defense attorney is the first and best step toward justice.

DNA + Facial Recognition Technology = Junk Science


An intriguing article in Wired tells the story of police who used DNA to predict a suspect’s face and then tried running facial recognition technology on the resulting image.

BACKGROUND FACTS

In 2017, detectives working a cold case at the East Bay Regional Park District Police Department got an idea, one that might help them finally get a lead on the murder of Maria Jane Weidhofer. Officers had found Weidhofer, dead and sexually assaulted, at Berkeley, California’s Tilden Regional Park in 1990. Nearly 30 years later, the department sent genetic information collected at the crime scene to Parabon NanoLabs—a company that says it can turn DNA into a face.

Soon, Parabon NanoLabs provided the police department with the face of a potential suspect, generated using only crime scene evidence.

The image Parabon NanoLabs produced, called a Snapshot Phenotype Report, wasn’t a photograph. It was a 3D representation of how the company’s algorithm predicted a person could look given genetic attributes found in the DNA sample.

The face of the murderer, the company predicted, was male. He had fair skin, brown eyes and hair, no freckles, and bushy eyebrows. A forensic artist employed by the company photoshopped a nondescript, close-cropped haircut onto the man and gave him a mustache—an artistic addition informed by a witness description and not the DNA sample.

In 2017, the department published the predicted face in an attempt to solicit tips from the public. Then, in 2020, one of the detectives asked to have the rendering run through facial recognition software. It appears to be the first known instance of a police department attempting to use facial recognition on a face algorithmically generated from crime-scene DNA.

At this point it is unknown whether the Northern California Regional Intelligence Center honored the East Bay detective’s request.

DOES THIS SEARCH VIOLATE CONSTITUTIONAL RIGHTS?

Some argue this search emphasizes the ways that law enforcement is able to mix and match technologies in unintended ways. In short, this search uses untested algorithms to single out suspects based on unknowable criteria.

“It’s really just junk science to consider something like this,” Jennifer Lynch, general counsel at civil liberties nonprofit the Electronic Frontier Foundation, tells WIRED. Running facial recognition with unreliable inputs, like an algorithmically generated face, is more likely to misidentify a suspect than provide law enforcement with a useful lead, she argues.

“There’s no real evidence that Parabon can accurately produce a face in the first place . . . It’s very dangerous, because it puts people at risk of being a suspect for a crime they didn’t commit.” ~Jennifer Lynch, General Counsel at Electronic Frontier Foundation.

According to a report released in September by the US Government Accountability Office, only 5 percent of the 196 FBI agents who have access to facial recognition technology from outside vendors have completed any training on how to properly use the tools. The report notes that the agency also lacks any internal policies for facial recognition to safeguard against privacy and civil liberties abuses.

In the past few years, facial recognition has improved considerably. In 2018, when the National Institute of Standards and Technology tested face recognition algorithms on a mug shot database of 12 million people, it found that 99.9 percent of searches identified the correct person. However, the NIST also found disparities in how the algorithms it tested performed across demographic groups.
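A 99.9 percent hit rate still means one error per thousand searches, and that matters once a tool is run at scale. The arithmetic below is a back-of-the-envelope illustration; the search volume is hypothetical, only the accuracy figure comes from the NIST result above.

```python
# Back-of-the-envelope sketch: the hypothetical search volume is made up;
# the 99.9% accuracy figure is the NIST mug shot result quoted above.
def expected_misses(searches: int, accuracy: float) -> float:
    """Expected number of searches that fail to return the correct person."""
    return searches * (1 - accuracy)

# At 99.9% accuracy, 100,000 searches would be expected to yield roughly
# 100 erroneous results.
print(round(expected_misses(100_000, 0.999)))  # -> 100
```

And that assumes clean inputs; with the blurry or manipulated probe images described next, the effective error rate would be higher.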

A 2019 report from Georgetown’s Center on Privacy and Technology, written by facial recognition expert and privacy lawyer Clare Garvie, found that law enforcement agencies nationwide have used facial recognition tools indiscriminately. Agencies have tried using images that include blurry surveillance camera shots, manipulated photos of suspects, and even composite sketches created by traditional artists.

“Because modern facial recognition algorithms are trained neural networks, we just don’t know exactly what criteria the systems use to identify a face . . . Daisy chaining unreliable or imprecise black-box tools together is simply going to produce unreliable results. We should know this by now.” ~ Clare Garvie, Esq.

Please contact my office if you, a friend or family member are charged with a crime. Hiring an effective and competent defense attorney is the first and best step toward justice.

Gunshot Location Technology: Effective or Not?


In an interesting story, the Seattle City Council has greenlit funding for a controversial gunshot locator system as part of a larger crime prevention pilot project.

WHAT IS GUNSHOT DETECTION TECHNOLOGY?

Gunshot Detection Technology (GDT) uses sophisticated acoustic sensors to detect, locate and alert law enforcement agencies and security personnel about illegal gunfire incidents in real time. The digital alerts include a precise location on a map, along with corresponding data such as the address, number of rounds fired, and type of gunfire, delivered to any browser-enabled smartphone or laptop as well as a police vehicle MDC or desktop.
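The general idea behind acoustic location can be sketched simply: several fixed sensors hear the same bang at slightly different times, and those time differences of arrival (TDOA) pin down the source. The sketch below is illustrative only, not ShotSpotter’s proprietary algorithm; the sensor layout and simulated shot are made up, and real systems must cope with noise, echoes and non-gunfire sounds, which is where the error modes discussed below come from.

```python
# Illustrative TDOA sketch -- NOT ShotSpotter's actual algorithm. A grid
# search picks the point whose predicted inter-sensor time differences best
# match the observed arrival times (least squares). All numbers are made up.
import itertools
import math

SPEED_OF_SOUND = 343.0  # meters per second in air at roughly 20 C

def locate(sensors, arrival_times, extent=500, step=10):
    """Return the grid point best matching the observed time differences."""
    best_point, best_err = None, float("inf")
    for x in range(-extent, extent + 1, step):
        for y in range(-extent, extent + 1, step):
            err = 0.0
            for i, j in itertools.combinations(range(len(sensors)), 2):
                predicted = (math.dist((x, y), sensors[i])
                             - math.dist((x, y), sensors[j])) / SPEED_OF_SOUND
                observed = arrival_times[i] - arrival_times[j]
                err += (predicted - observed) ** 2
            if err < best_err:
                best_point, best_err = (x, y), err
    return best_point

# Four sensors on a 400 m square hear a simulated shot fired at (120, -80).
sensors = [(0, 0), (400, 0), (0, 400), (400, 400)]
shot = (120, -80)
arrivals = [math.dist(shot, s) / SPEED_OF_SOUND for s in sensors]
print(locate(sensors, arrivals))  # recovers (120, -80) on this noise-free data
```

With noise-free data the search lands on the true point; add a few milliseconds of timing error, or a firework instead of a gunshot, and the estimate drifts, which is how “significant” mislocations and false alerts arise.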

GDT is touted to protect officers by providing them with increased tactical awareness. It also enables law enforcement agencies to better connect with their communities and bolsters their mission to protect and serve.

With GDT, officers can more quickly arrive at the scene of a crime with an increased level of safety. They know exactly where the gunfire took place. In many cases, an officer can arrive with the shooter still at the crime scene. If the criminal has fled, shell casings and/or other evidence can be recovered and used for investigative and potential prosecution purposes and key witnesses can be interviewed at the crime scene.

Below are just some of the reports showing how ShotSpotter technology is being rejected by cities and police departments. It can hurt police response times, result in more racial bias, and violate people’s civil liberties.

POLICE CHIEFS CRITICAL OF SHOTSPOTTER, CITIES PULLING OUT OF CONTRACTS

  • San Antonio’s chief of police led the charge to end the city’s ShotSpotter program. He said, “We made a better-than-good-faith effort trying to make it work.” Instead of renewing with ShotSpotter, he said “We’re going to use that money to provide more community engagement, which ShotSpotter can’t provide.”
  • When Fall River, Massachusetts ended its contract with ShotSpotter, their chief of police said, “It’s a costly system that isn’t working to the effectiveness that we need it to work in order to justify the cost.” 
  • Portland, Oregon decided not to move forward with ShotSpotter in July after their mayor approved a pilot program in 2022. The mayor said he was interested in pursuing better strategies.
  • Atlanta decided not to move forward with the technology after two separate pilot programs led to poor results.
  • Chicago’s mayor promised to get rid of ShotSpotter in the city during his campaign. Their contract with the company is up in February.
  • New Orleans; Dayton, OH; Charlotte, NC; and Trenton, NJ also ended their ShotSpotter contracts.

INEFFECTIVE AND HURTS POLICE RESPONSE TIMES

  • A study found that CCTV paired with ShotSpotter-type technology, as proposed in this budget, “did not significantly affect the number of confirmed shootings, but it did increase the workload of police attending incidents for which no evidence of a shooting was found.”
  • A study published last year of 68 large metropolitan counties in the United States found “ShotSpotter technology has no significant impact on firearm-related homicides or arrest outcomes.”
  • An article by a crime analyst working for the St. Louis Police Department found that ShotSpotter-type technologies “simply seem to replace traditional calls for service and do so less efficiently and at a greater monetary cost to departments.”
  • A report by the Chicago inspector general found that around 90 percent of ShotSpotter alerts are false positives, resulting in police being dispatched 40,000 times when no gun-related violence had taken place.
  • The technology was found to be ineffective in a report by the City of Atlanta, costing $56,000 per gun recovered – money that would have been more effective in other programs.

CIVIL LIBERTY & EQUITY CONCERNS

  • The ACLU-WA has asked the Council to reject funding ShotSpotter, “given that investing in gunshot detection and CCTV technologies will not prevent crime and violence and will adversely impact communities through increased police violence and heightened privacy risks.”
  • Privacy advocates recently asked the Department of Justice to investigate gunshot detection companies because they lead to over-policing of communities of color and may be violating the Civil Rights Act.
  • Faulty evidence from ShotSpotter has been used to wrongfully imprison people like Michael Williams. He was held in Chicago for more than a year before the charges were dismissed and prosecutors admitted they had insufficient evidence, according to an AP report.

My opinion? Only time will tell whether GDT is effective and/or equitable.

Please contact my office if you, a friend or family member are charged with a Firearm Offense or any other crime. Hiring an effective and competent defense attorney is the first and best step toward justice.

AI Facial Recognition Tech Leads to Mistaken Identity Arrests


An interesting article by Sudhin Thanawala of the Associated Press describes lawsuits over the misuse of facial recognition technology by law enforcement. The lawsuits come as facial recognition technology and its potential risks are under scrutiny, with experts warning about artificial intelligence’s (AI’s) tendency toward errors and bias.

Numerous Black plaintiffs claim they were misidentified by facial recognition technology and then wrongly arrested. Three of those lawsuits, including one by a woman who was eight months pregnant and accused of a carjacking, are against Detroit police.

The lawsuits accuse law enforcement of false arrest, malicious prosecution and negligence. They also allege Detroit police engaged “in a pattern of racial discrimination of (Woodruff) and other Black citizens by using facial recognition technology practices proven to misidentify Black citizens at a higher rate than others in violation of the equal protection guaranteed by” Michigan’s 1976 civil rights act.

WHAT IS FACIAL RECOGNITION TECHNOLOGY?

The technology allows law enforcement agencies to feed images from video surveillance into software that can search government databases or social media for a possible match. Critics say it results in a higher rate of misidentification of people of color than of white people. Supporters say it has been vital in catching drug dealers, solving killings and missing persons cases and identifying and rescuing human trafficking victims. They also contend the vast majority of images that are scoured are criminal mugshots, not driver’s license photos or random pictures of individuals.

Still, some states and cities have limited its use.

“The use of this technology by law enforcement, even if standards and protocols are in place, has grave civil liberty and privacy concerns . . . And that’s to say nothing about the reliability of the technology itself.” ~Sam Starks, a senior attorney with The Cochran Firm in Atlanta.

FALSE ARRESTS BASED ON INACCURATE IDENTIFICATIONS FROM AI CAN SUPPORT A DEFENSE OF MISTAKEN IDENTITY

My opinion? AI should be abandoned if the technology incorrectly identifies perpetrators. As a matter of law, the prosecution must prove the identity of the perpetrator of an alleged crime.

According to the jury instructions on Mistaken Identity, in determining the weight to be given to eyewitness identification testimony, jurors may consider other factors that bear on the accuracy of the identification. These may include:

  • The witness’s capacity for observation, recall and identification;
  • The opportunity of the witness to observe the alleged criminal act and the perpetrator of that act;
  • The emotional state of the witness at the time of the observation;
  • The witness’s ability, following the observation, to provide a description of the perpetrator of the act;
  • The witness’s familiarity or lack of familiarity with people of the perceived race or ethnicity of the perpetrator of the act;
  • The period of time between the alleged criminal act and the witness’s identification;
  • The extent to which any outside influences or circumstances may have affected the witness’s impressions or recollection; and
  • Any other factor relevant to this question.

But what happens when the “eyewitness identifier” is, in fact, AI technology?

At trial, the defense should procure an expert witness who’d testify on the inaccuracies of AI technology. That’s an appropriate route to challenging the credibility of this “witness.”

Please review my Search & Seizure Legal Guide and contact my office if you, a friend or family member are charged with a crime involving AI. Hiring an effective and competent defense attorney is the first and best step toward justice.

Alcohol Detection Systems in All New Vehicles?


A great article by journalist Murray Slovik explains why new technologies are needed for alcohol-impairment detection in cars.

Apparently, DUI remains a leading cause of injury-involved highway crashes. According to the National Highway Traffic Safety Administration (NHTSA), in 2020, roughly one in three traffic fatalities resulted from crashes involving alcohol-impaired drivers.

Since 2000, more than 230,000 people have lost their lives in crashes involving alcohol, again according to NHTSA. In 2020, an estimated 11,654 fatalities occurred in alcohol-impaired crashes. This number represented about 30% of all traffic fatalities that year and a 14% increase over the 10,196 individuals who died because of alcohol-impaired crashes in 2019. This comes at a time when vehicle miles traveled in the U.S. decreased by about 13.2% in 2020.
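The year-over-year figures above check out arithmetically. A quick verification, using only the NHTSA numbers quoted in this post:

```python
# Quick arithmetic check of the NHTSA figures quoted above.
deaths_2019 = 10_196  # alcohol-impaired crash fatalities, 2019
deaths_2020 = 11_654  # alcohol-impaired crash fatalities, 2020
increase = (deaths_2020 - deaths_2019) / deaths_2019
print(f"{increase:.1%}")  # about a 14% increase, matching the figure cited
```

The jump is all the more striking given that vehicle miles traveled fell by roughly 13% the same year.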

In response, the National Transportation Safety Board (NTSB) is making a major push to cut down on the number of alcohol-related crashes and deaths. They’ve asked the NHTSA to require that all new cars have an alcohol detection device in them. This move stems in part from an investigation into a California crash that killed nine – including seven children.

TECHNOLOGY RECOMMENDATION DETAILS

The NTSB is recommending measures leveraging new in-vehicle technologies that can limit or prohibit impaired drivers from operating their vehicles as well as technologies to prevent speeding. They include:

  • Requiring passive vehicle-integrated alcohol-impairment detection systems, advanced driver monitoring systems, or a combination of the two that would be capable of preventing or limiting vehicle operation if it detects driver impairment by alcohol. The NTSB recommends that the National Highway Traffic Safety Administration require all new vehicles be equipped with such systems.
  • Incentivizing vehicle manufacturers and consumers to adopt intelligent speed adaptation systems that would prevent speed-related crashes.

The issues of impaired driving and excessive speeding are both on the NTSB’s Most Wanted List of Transportation Safety Improvements. To prevent alcohol and other drug-impaired driving crashes, the NTSB has called for, as previously mentioned, in-vehicle alcohol detection technology as well as the lowering of the blood alcohol concentration limit to .05 g/dL or lower. They also recommend alcohol ignition-interlock devices for people convicted of driving while intoxicated and that regulators develop a standard of practice to improve drug toxicology testing.

Furthermore, the NTSB has called for a comprehensive strategy to eliminate speeding-related crashes. It would combine traditional measures like enforcement and regulation with new technological advances such as speed limiters and intelligent speed-adaptation technology.

SPEED-LIMITING TECH

The NTSB is looking for regulators to develop performance standards for such advanced speed-limiting technology targeted at heavy vehicles including trucks, buses, and motor coaches. They want regulators to require all newly manufactured heavy vehicles be equipped with such devices. NTSB also wants:

  • To collaborate with traffic safety stakeholders to develop and implement an ongoing program to increase public awareness of speeding as a national traffic safety issue.
  • To revise regulations to strengthen requirements for all speed engineering studies and remove the guidance that speed limits in speed zones be within 5 mph of the 85th percentile speed. The 85th percentile speed is the speed at or below which 85% of drivers will operate on open roads in favorable conditions.
  • To update speed-enforcement guidelines to reflect the latest automated speed-enforcement technologies and operating practices and promote these guidelines.
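The 85th percentile figure mentioned above is a simple order statistic. Here is an illustrative sketch, with made-up sample speeds (traffic engineers use larger samples and their own conventions; this just shows the concept):

```python
# Illustrative sketch of the "85th percentile speed" from speed-zone
# engineering: the speed at or below which 85% of free-flowing vehicles
# travel. The observed speeds below are made-up sample data.
import math

def percentile_speed(speeds, pct=85):
    """Nearest-rank percentile: smallest observed speed such that at
    least pct percent of observations fall at or below it."""
    ordered = sorted(speeds)
    rank = math.ceil(pct / 100 * len(ordered))
    return ordered[rank - 1]

observed_mph = [28, 30, 31, 31, 32, 33, 33, 34, 35, 36, 37, 38, 40, 42, 55]
print(percentile_speed(observed_mph))  # -> 40
```

Under the guidance the NTSB wants removed, a zone with this speed profile would get a limit within 5 mph of 40 mph, even though most drivers here travel well under that.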

Research suggests speeding is a problem that’s worsening. In 2020, there were 11,258 fatalities in crashes in which at least one driver was speeding, according to the NHTSA. This simply underscores that speeding increases both the chances of being involved in a crash and the severity of crash injuries.

Please contact my office if you, a friend or family member are charged with a DUI, Reckless Driving or any other crime. Hiring an effective and competent defense attorney is the first and best step toward justice.

Extraction of Smartphone Data by U.S. Law Enforcement

Image: Mass Extraction report (Upturn)

A new report from Upturn.org reveals that police across the U.S. search thousands of smartphones every day. Unfortunately, many of these searches are conducted without a warrant, in violation of the Fourth Amendment’s guarantee against unreasonable searches and seizures.

THE PROBLEM

Law enforcement agencies across the country search thousands of cellphones, typically incident to arrest. To search phones, they use mobile device forensic tools (MDFTs). This powerful technology allows police to extract a full copy of the data on a cellphone, including all emails, texts, photos, location data, app data, and more. The report documents more than 2,000 agencies that have purchased these tools, across all 50 states and the District of Columbia.

“We found that state and local law enforcement agencies have performed hundreds of thousands of cellphone extractions since 2015, often without a warrant. To our knowledge, this is the first time that such records have been widely disclosed.” ~Upturn.org

According to the report, every American is at risk of having their phone forensically searched by law enforcement. Police use these tools to investigate assault, prostitution, vandalism, theft, drug-related offenses, and more. Given how routine these searches are today, it’s more than likely that these technologies disparately affect, and are used against, communities of color.

The emergence of these tools represents a dangerous expansion in law enforcement’s investigatory powers. In 2011, only 35% of Americans owned a smartphone. Today, it’s at least 81% of Americans. Moreover, many Americans — especially people of color and people with lower incomes — rely solely on their cellphones to connect to the internet. For law enforcement, mobile phones remain the most frequently used and most important digital source for investigation.

THE SOLUTIONS

Upturn.org believes that MDFTs are simply too powerful in the hands of law enforcement and should not be used. But recognizing that MDFTs are already in widespread use across the country, the authors offer a set of preliminary recommendations to help reduce the use of MDFTs in the short term. These include:

  • banning the use of consent searches of mobile devices,
  • abolishing the plain view exception for digital searches,
  • requiring easy-to-understand audit logs,
  • enacting robust data deletion and sealing requirements, and
  • requiring clear public logging of law enforcement use.

Of course, these recommendations are only the first steps in a broader effort to minimize the scope of policing, and to confront and reckon with the role of police in the United States.

“This report seeks to not only better inform the public regarding law enforcement access to mobile phone data, but also to recenter the conversation on how law enforcement’s use of these tools entrenches police power and exacerbates racial inequities in policing.” ~Upturn.org

Special thanks to authors Logan Koepke, Emma Weil, Urmila Janardan, Tinuola Dada, and Harlan Yu for providing this highly informative and educational material.

Please review my Search & Seizure Legal Guide and contact my office if you are charged with a crime involving a smartphone search. Hiring an effective and competent defense attorney is the first and best step toward justice.

New Car Tech May Stop DUI


According to an article in US News, President Biden will sign legislation requiring new cars to monitor for and stop intoxicated drivers. The auto safety mandate, aimed at reducing road fatalities, is included within the $1 trillion infrastructure package.

The technology would roll out in all new vehicles as early as 2026. The Transportation Department must assess the best form of technology to install in vehicles and give automakers time to comply. For now, the legislation doesn’t specify the technology. It must merely “monitor the performance of a driver of a vehicle to accurately identify whether that driver is impaired.”

In all, about $17 billion is allotted to road safety programs, the biggest increase in such funding in decades.

“It’s monumental,” said Alex Otte, national president of Mothers Against Drunk Driving. Otte called the package the single most important piece of legislation in the group’s history, marking the beginning of the end of drunk driving. “It will virtually eliminate the No. 1 killer on America’s roads,” she said.

Last month, the National Highway Traffic Safety Administration reported an estimated 20,160 people died in traffic collisions in the first half of 2021, the highest first-half total since 2006. The agency has pointed to speeding, impaired driving and not wearing seatbelts during the coronavirus pandemic as factors behind the spike. Each year, around 10,000 people are killed due to alcohol-related crashes in the U.S., making up nearly 30% of all traffic fatalities, according to NHTSA.

THE NEW TECHNOLOGY

According to the article, the most likely system to prevent drunken driving is infrared cameras that monitor driver behavior. Automakers such as General Motors, BMW, and Nissan are already installing that technology to track driver attentiveness while using partially automated driver-assist systems.

The cameras make sure a driver is watching the road, and they look for signs of drowsiness, loss of consciousness, or impairment. If such signs are spotted, the car will warn the driver. If the behavior persists, the car would turn on its hazard lights, slow down, and pull over.

The voluminous bill also requires automakers to install rear-seat reminders to alert parents if a child is inadvertently left in the back seat, a mandate that could begin by 2025 after NHTSA completes its rulemaking on the issue. Since 1990, about 1,000 children have died from vehicular heatstroke; the highest single-year total was 54 in 2018, according to Kidsandcars.org.

My opinion? This is an interesting development. Is the technology reliable? There’s certainly a good argument over whether the technology could backfire or prove ineffective in detecting impairment. Is your car searching you by passively monitoring your physical condition? Clearly, there are Fourth Amendment search and seizure issues involved here as well.

Please contact my office if you, a friend or family member are charged with DUI or any other crime. Hiring an effective and competent defense attorney is the first and best step toward justice.

Give Them An Inch . . .


In State v. Bowman, the WA Supreme Court held that a cell phone owner who consented to a police search of his text messages also gave police the authority to use his phone to set up a “ruse” drug bust sting. A subsequent police ruse using lawfully obtained information does not constitute a privacy invasion or trespass in violation of either our state constitution or the United States Constitution.

BACKGROUND FACTS

A Department of Homeland Security (DHS) agent sent a series of text messages to Mr. Bowman. The DHS agent claimed to be someone named Mike Schabell, a person to whom Bowman had sold methamphetamine earlier that day, and indicated he wanted to buy more drugs. The ruse led to charges of possession of methamphetamine with intent to deliver.

The trial court denied Bowman’s motion to suppress the drugs and drug paraphernalia found on his person and in his vehicle. At trial, Mr. Bowman was found guilty.

On appeal, the WA Court of Appeals reversed Mr. Bowman’s conviction. The Court reasoned that the DHS agent (1) disrupted Mr. Bowman’s private affairs and (2) was not acting under authority of law.

WA SUPREME COURT’S ANALYSIS & CONCLUSIONS

However, the WA Supreme Court found that police did not violate Mr. Bowman’s constitutional rights. The Court reasoned that under State v. Hinton, Bowman did indeed have a privacy interest in the text messages he sent to a third party’s device. That said, Schabell’s consent to search his phone gave police the necessary authority of law to view the text message conversation. Furthermore, police did not commit an unconstitutional trespass by sending text messages to Bowman’s cell phone as part of a ruse.

“Consistent with long-standing precedent, we hold that a cell phone owner’s voluntary consent to search text messages on their phone provides law enforcement with the authority of law necessary to justify intruding on an otherwise private affair. We also hold that a subsequent police ruse using lawfully obtained information does not constitute a privacy invasion or trespass in violation of either our state constitution or the United States Constitution.” ~WA Supreme Court

“That he misunderstood the identity of the person he was texting does not transform the unsolicited incoming message into an unconstitutional trespass,” said the WA Supreme Court. “The risk of being betrayed by an informer or deceived as to the identity of one with whom one deals is probably inherent in the conditions of human society.”

With that, the WA Supreme Court reversed the Court of Appeals and reinstated Bowman’s conviction.

My opinion? This issue, and many other related issues, will likely require further consideration if such investigatory tactics continue to be used in Washington. Please review my Search and Seizure guide and contact my office if you, a friend or family member are charged with a crime. Hiring an effective and competent defense attorney is the first and best step toward justice.