Archive for the ‘Fourth Amendment’ Category

Can the government use Apple’s new iCloud scanning program to spy on citizens? – TAG24 NEWS

Aug 15, 2021, 5:00 PM EDT

Many worry that the government will force Apple to grant access to their private photos now that the company scans all iCloud uploads for child sexual abuse material.

Cupertino, California - After Apple's recent announcement that it would scan all photos uploaded to iCloud for child sexual abuse material (CSAM), many began to worry that the government could force the company to grant access to their private photos.

Matt Tait, the COO of security company Corellium, reassured users that because the US has the Fourth Amendment in place, the government wouldn't be allowed to use private scanning services to spy on American citizens, according to a summary provided by 9to5Mac.

The Fourth Amendment protects US citizens from unreasonable search and seizure.

Tait is a former analyst for GCHQ, which is the British version of the US' National Security Agency, so he should know what he's talking about.

The new concerns about spying stem from the recent Pegasus software hacking of prominent journalists and leaders who found their phone's information and private photos stolen and leaked.

A fear, according to Johns Hopkins cryptographer Matthew Green, is that the Department of Justice could go to the National Center for Missing & Exploited Children (NCMEC) and ask them to add other photos to the database that teaches Apple's program what to scan for. This could, perhaps, include photos of missing children, wanted criminals, or anyone who is a person of interest to the government.
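The mechanics behind that fear are simple: a hash-matching scanner flags whatever is in its database, so whoever controls the database controls what gets reported. Here is a minimal, purely illustrative Python sketch of the idea; it uses an exact cryptographic hash as a stand-in for the perceptual hashes real systems use, and all the names and sample byte strings are hypothetical:

```python
import hashlib

def image_hash(data: bytes) -> str:
    # Stand-in for a perceptual hash; real CSAM scanners use hashes
    # designed to survive resizing and re-encoding, but the control
    # problem is the same either way.
    return hashlib.sha256(data).hexdigest()

def scan_upload(data: bytes, flagged: set) -> bool:
    """Flag an upload if its hash appears in the database."""
    return image_hash(data) in flagged

# The database is meant to hold only hashes of known abuse imagery...
database = {image_hash(b"known-abuse-image")}

# ...but nothing in the matching code itself limits what goes in.
database.add(image_hash(b"photo-of-a-person-of-interest"))

print(scan_upload(b"photo-of-a-person-of-interest", database))  # True
print(scan_upload(b"ordinary-vacation-photo", database))        # False
```

The matching logic never sees what an entry "means"; expanding what gets flagged requires no code change at all, only a new row in the database.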

Given that the NCMEC isn't a wholly government-run organization, there might not be much oversight if this were to happen.

In this scenario, the photos could then trigger Apple's new system, and if there is enough suspicion, the government could force Apple to turn over customers' information.

Yet, according to Tait, the fact that NCMEC isn't a full government entity is what will keep Americans safe, as it can't easily be forced by the government to do anything.

Apple could also blow the whistle on any requests for information that don't match CSAM parameters, signaling that the government is attempting to circumvent the Fourth Amendment and violate citizens' protection.

Likewise, Apple isn't obligated to work with NCMEC, and the relationship is voluntary.

Additionally, any perceived invasion of privacy would probably be overturned in court, as it is unlikely the government could show that such a search doesn't violate the Fourth Amendment.

At least for now, iCloud users can rest a bit easier.


It’s Time for Google to Resist Geofence Warrants and to Stand Up for Its Affected Users – EFF

EFF would like to thank former intern Haley Amster for drafting this post, and former legal fellow Nathan Sobel for his assistance in editing it.

The Fourth Amendment requires authorities to target search warrants at particular places or things, like a home, a bank deposit box, or a cell phone, and only when there is reason to believe that evidence of a crime will be found there. The Constitution's drafters put in place these essential limits on government power after suffering under British searches, called general warrants, that gave authorities unlimited discretion to search nearly everyone and everything for evidence of a crime.

Yet today, Google is facilitating the digital equivalent of those colonial-era general warrants. Through the use of geofence warrants (also known as reverse location warrants), federal and state law enforcement officers are routinely requesting that Google search users' accounts to determine who was in a certain geographic area at a particular time, and then to track individuals outside of that initially specific area and time period.

These warrants are anathema to the Fourth Amendment's core guarantee largely because, by design, they sweep up people wholly unconnected to the crime under investigation.

For example, in 2020 Florida police obtained a geofence warrant in a burglary investigation that led them to suspect a man who frequently rode his bicycle in the area. Google collected the man's location history when he used an app on his smartphone to track his rides, a scenario that ultimately led police to suspect him of the crime even though he was innocent.

Google is the linchpin in this unconstitutional scheme. Authorities send Google geofence warrants precisely because Google's devices, operating system, apps, and other products allow it to collect data from millions of users and to catalog these users' locations, movements, associations, and other private details of their lives.

Although Google has sometimes pushed back in court on the breadth of some of these warrants, it has largely acquiesced to law enforcement demands, and the number of geofence warrants law enforcement sends to the company has dramatically increased in recent years. This stands in contrast to documented instances of other companies resisting law enforcement requests for user data on Fourth Amendment grounds.

It's past time for Google to stand up for its users' privacy and to resist these unlawful warrants. A growing coalition of civil rights and other organizations, led by the Surveillance Technology and Oversight Project, have previously called on Google to do so. We join that coalition's call for change and further demand that Google: stop complying with geofence warrants; be far more transparent about the geofence warrants it has already received; give notice to every user whose data is caught up in these warrants; and minimize the location data it collects while giving users real control over it.

As explained below, these are the minimum steps Google must take to show that it is committed to its users' privacy and the Fourth Amendment's protections against general warrants.

EFF calls on Google to stop complying with the geofence warrants it receives. As it stands now, Google appears to have set up an internal system that streamlines, systematizes, and encourages law enforcement's use of geofence warrants. Google's practice of complying with geofence warrants despite their unconstitutionality is inconsistent with its stated promise to protect the privacy of its users by "keeping your information safe, treating it responsibly, and putting you in control." As recently as October, Google's parent company's CEO, Sundar Pichai, said that "[p]rivacy is one of the most important areas we invest in as a company," and in the past, Google has even gone to court to protect its users' sensitive data from overreaching government legal process. However, Google's compliance with geofence warrants is incongruent with these platitudes and the company's past actions.

To live up to its promises, Google should commit to either refusing to comply with these unlawful warrants or to challenging them in court. By refusing to comply, Google would put the burden on law enforcement to demonstrate the legality of its warrant in court. Other companies, and even Google itself, have done this in the past. Google should not defer to law enforcement's contention that geofence warrants are constitutional, especially given law enforcement's well-documented history of trying novel surveillance and legal theories that courts later rule to be unconstitutional. And to the extent Google has refused to comply with geofence warrants, it should say so publicly.

Google's ongoing cooperation is all the more unacceptable given that other companies that collect similar location data from their users, including Microsoft and Garmin, have publicly stated that they would not comply with geofence warrants.

Even if Google were to stop complying with geofence warrants today, it still must be much more transparent about geofence warrants it has received in the past. Google must break out information and provide further details about geofence warrants in its biannual Transparency Reports.

Google's Transparency Reports currently document, among other things, the types and volume of law enforcement requests for user data the company receives, but they do not, as of now, break out information about geofence warrants or provide further details about them. With no detailed reporting from Google about the geofence warrants it has received, the public is left to learn about them via leaks to reporters or by combing through court filings.

Here are a few specific ways Google can be more transparent:

Google should disclose detailed information about all geofence warrants it has received over the last five years and commit to continuing to do so moving forward.

Google should also resist nondisclosure orders and, if they are imposed, litigate to ensure that the government has made the appropriate showing required by law. If Google is subject to such an order, or the related docket is sealed (prohibiting the company from disclosing the fact that it has received some geofence warrants or from providing other details), Google should move to end those orders and to unseal those dockets so it can make details about them public as early as allowable by law.

Google should also seek to provide basic details about court cases, including docket numbers for the orders authorizing each geofence warrant and docket numbers for any related criminal prosecutions Google is aware of resulting from those warrants. At minimum, Google should disclose details on the agencies seeking geofence warrants, broken down by federal agency, state-level agencies, and local law enforcement.

Google must start telling its users when their information is caught up in a geofence warrant, even if that information is de-identified. This notice to affected users should state explicitly what information Google produced, in what format, which agency requested it, which court authorized the warrant, and whether Google provided identifying information. Notice to users here is critical: if people aren't aware of how they are being affected by these warrants, there can't be meaningful public debate about them.

To the extent the law requires Google to delay notice or not disclose the existence of the warrant, Google should challenge such restrictions so as to only comply with valid ones, and it should provide users with notice as soon as possible.

It does not appear that Google gives notice to every user whose data is requested by law enforcement. Some affected users have said that Google notified them that law enforcement accessed their account via a geofence warrant. But in some of the cases EFF has followed, it appears that Google has not always notified the affected users whom it identifies in response to these warrants, with no public explanation from Google. Google's policies state that it gives notice to users before disclosing information, but more clarity is warranted here. Google should publicly state whether its policy is being applied to all users whose information is subject to geofence warrants, or only to those whom it identifies to law enforcement.

Many people do not know, much less understand, how and when Google collects and stores location data. Google must do a better job of explaining its policies and practices to users, not processing user data absent opt-in consent, minimizing the amount of data it collects, deleting retained data users no longer need, and giving users the ability to easily delete their data.

Well before law enforcement ever comes calling, Google must first ensure it does not collect its users' location data before obtaining meaningful consent from them. This consent should establish a fair way for users to opt into data collection, as click-through agreements which apply to dozens of services, data types, or uses at once are insufficient. As one judge in a case involving Facebook put it, the logic that merely clicking "I agree" indicates true consent "requires everyone to pretend that users read every word of these policies before clicking their acceptance, even though we all know that virtually none of them did."

Google should also explain exactly what location data it collects from users, when that collection occurs, what purpose it is used for, and how long Google retains that data. This should be clear and understandable, not buried in dense privacy policies or terms of service.

Google should also only be collecting, retaining, and using its customers location data for a specific purpose, such as to provide directions on Google Maps or to measure road traffic congestion. Data must not be collected or used for a different purpose, such as for targeted advertising, unless users separately opt in to such use. Beyond notice and consent, Google must minimize its processing of user data, that is, only process user data as reasonably necessary to give users what they asked for. For example, user data should be deleted when it is no longer needed for the specific purpose for which it was initially collected, unless the user specifically requests that the data be saved.

Although Google allows users to manually delete their location data and to set automated deletion schedules, Google should confirm that these tools are not illusory. Recent enforcement actions by state attorneys general allege that users cannot fully delete their data, much less fully opt out of having their location data collected at all.

* * *

Google holds a tremendous amount of power over law enforcement's ability to use geofence warrants. Instead of keeping quiet about them and waiting for defendants in criminal cases to challenge them in court, Google needs to stand up for its users when it comes to revealing their sensitive data to law enforcement.


If You Build It, They Will Come: Apple Has Opened the Backdoor to Increased Surveillance and Censorship Around the World – EFF

Apple's new program for scanning images sent on iMessage steps back from the company's prior support for the privacy and security of encrypted messages. The program, initially limited to the United States, narrows the understanding of end-to-end encryption to allow for client-side scanning. While Apple aims at the scourge of child exploitation and abuse, the company has created an infrastructure that is all too easy to redirect to greater surveillance and censorship. The program will undermine Apple's defense that it can't comply with the broader demands.

For years, countries around the world have asked for access to and control over encrypted messages, asking technology companies to "nerd harder" when faced with the pushback that access to messages in the clear was incompatible with strong encryption. The Apple child safety message scanning program is currently being rolled out only in the United States.

The United States has not been shy about seeking access to encrypted communications, pressuring the companies to make it easier to obtain data with warrants and to voluntarily turn over data. However, the U.S. faces serious constitutional issues if it wanted to pass a law that required warrantless screening and reporting of content. Even if conducted by a private party, a search ordered by the government is subject to the Fourth Amendment's protections. Any warrant issued for suspicionless mass surveillance would be an unconstitutional general warrant. As the Ninth Circuit Court of Appeals has explained, "Search warrants . . . are fundamentally offensive to the underlying principles of the Fourth Amendment when they are so bountiful and expansive in their language that they constitute a virtual, all-encompassing dragnet[.]" With this new program, Apple has failed to hold a strong policy line against U.S. laws undermining encryption, but there remains a constitutional backstop to some of the worst excesses. But U.S. constitutional protection may not necessarily be replicated in every country.

Apple is a global company, with phones and computers in use all over the world, and it faces the pressure from many governments that comes along with that. Apple has promised it will refuse government demands to build and deploy government-mandated changes that degrade the privacy of users. It is good that Apple says it will not, but this is not nearly as strong a protection as saying it cannot, which could not honestly be said about any system of this type. Moreover, if it implements this change, Apple will need to not just fight for privacy, but win in legislatures and courts around the world. To keep its promise, Apple will have to resist the pressure to expand the iMessage scanning program to new countries, to scan for new types of content, and to report outside of parent-child relationships.

It is no surprise that authoritarian countries demand companies provide access and control to encrypted messages, often the last best hope for dissidents to organize and communicate. For example, Citizen Lab's research shows that, right now, China's unencrypted WeChat service already surveils images and files shared by users, and uses them to train censorship algorithms. When a message is sent from one WeChat user to another, it passes through a server managed by Tencent (WeChat's parent company) that detects if the message includes blacklisted keywords before a message is sent to the recipient. As the Stanford Internet Observatory's Riana Pfefferkorn explains, this type of technology is "a roadmap showing how a client-side scanning system originally built only for CSAM [Child Sexual Abuse Material] could and would be suborned for censorship and political persecution." As Apple has found, China, with the world's biggest market, can be hard to refuse. Other countries are not shy about applying extreme pressure on companies, including arresting local employees of the tech companies.
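Part of why this kind of pre-delivery check generalizes so easily is that it amounts to a few lines of logic whose behavior is determined entirely by the blocklist fed into it. A hypothetical Python sketch (the list entries are placeholders, not any real service's actual terms):

```python
# Hypothetical blocklist; a real censor's list is secret and changes often.
BLACKLIST = {"banned-phrase-one", "banned-phrase-two"}

def relay(message: str, blacklist: set) -> bool:
    """Server-side pre-delivery check: True means deliver, False means drop.

    Swapping in a different blacklist retargets the same machinery,
    from abuse material to satire to political speech, without any
    change to the scanning code itself.
    """
    text = message.lower()
    return not any(term in text for term in blacklist)

print(relay("hello there", BLACKLIST))                      # True
print(relay("this contains banned-phrase-one", BLACKLIST))  # False
```

The same structural point applies to client-side scanning: the scanner is policy-neutral, and the policy lives in a list someone else can demand changes to.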

But many times potent pressure to access encrypted data also comes from democratic countries that strive to uphold the rule of law, at least at first. If companies fail to hold the line in such countries, the changes made to undermine encryption can easily be replicated by countries with weaker democratic institutions and poor human rights records, often using similar legal language, but with different ideas about public order and state security, as well as what constitutes impermissible content, from obscenity to indecency to political speech. This is very dangerous. These countries, with poor human rights records, will nevertheless contend that they are no different. They are sovereign nations, and will see their public-order needs as equally urgent. They will contend that if Apple is providing access to any nation-state under that state's local laws, Apple must also provide access to other countries, at least, under the same terms.

For example, the Five Eyes, an alliance of the intelligence services of Canada, New Zealand, Australia, the United Kingdom, and the United States, warned in 2018 that they will "pursue technological, enforcement, legislative or other measures to achieve lawful access solutions" if the companies didn't voluntarily provide access to encrypted messages. More recently, the Five Eyes have pivoted from terrorism to the prevention of CSAM as the justification, but the demand for unencrypted access remains the same, and the Five Eyes are unlikely to be satisfied without changes to assist terrorism and criminal investigations too.

The United Kingdom's Investigatory Powers Act, following through on the Five Eyes threat, allows the Secretary of State to issue "technical capability notices," which oblige telecommunications operators to maintain "the technical ability of providing assistance in giving effect to an interception warrant, equipment interference warrant, or a warrant or authorisation for obtaining communications data." As the UK Parliament considered the IPA, we warned that a company could be compelled to distribute an update in order to facilitate the execution of an equipment interference warrant, and ordered to refrain from notifying their customers.

Under the IPA, the Secretary of State must consider the technical feasibility of complying with the notice. But the infrastructure needed to roll out Apple's proposed changes makes it harder to say that additional surveillance is not technically feasible. With Apple's new program, we worry that the UK might try to compel an update that would expand the current functionality of the iMessage scanning program, with different algorithmic targets and wider reporting. As the iMessage communication safety feature is entirely Apple's own invention, Apple can all too easily change its own criteria for what will be flagged for reporting. Apple may receive an order to adopt its hash matching program for iPhoto into the message pre-screening. Likewise, the criteria for which accounts will apply this scanning, and where positive hits get reported, are wholly within Apple's control.

Australia followed suit with its Assistance and Access Act, which likewise allows for requirements to provide technical assistance and capabilities, with the disturbing potential to undermine encryption. While the Act contains some safeguards, a coalition of civil society organizations, tech companies, and trade associations, including EFF and (wait for it) Apple, explained that they were insufficient.

Indeed, in Apple's own submission to the Australian government, Apple warned that the government may seek to "compel providers to install or test software or equipment, facilitate access to customer equipment, turn over source code, remove forms of electronic protection, modify characteristics of a service, or substitute a service," among other things. If only Apple would remember that these very techniques could also be used in an attempt to mandate or change the scope of Apple's scanning program.

While Canada has yet to adopt an explicit requirement for plain text access, the Canadian government is actively pursuing filtering obligations for various online platforms, which raise the spectre of a more aggressive set of obligations targeting private messaging applications.

For the Five Eyes, the ask is mostly for surveillance capabilities, but India and Indonesia are already down the slippery slope to content censorship. The Indian government's new Intermediary Guidelines and Digital Media Ethics Code ("2021 Rules"), in effect earlier this year, directly imposes dangerous requirements for platforms to pre-screen content. Rule 4(4) compels content filtering, requiring that providers "endeavor to deploy technology-based measures," including automated tools or other mechanisms, to proactively identify information that has been forbidden under the Rules.

India's defense of the 2021 Rules, written in response to the criticism from three UN Special Rapporteurs, was to highlight the very real dangers to children, and it skips over the much broader mandate of the scanning and censorship rules. The 2021 Rules impose proactive and automatic enforcement of their content takedown provisions, requiring the proactive blocking of material previously held to be forbidden under Indian law. These laws broadly include those protecting "the sovereignty and integrity of India; security of the State; friendly relations with foreign States; public order; decency or morality." This is no hypothetical slippery slope; it's not hard to see how this language could be dangerous to freedom of expression and political dissent. Indeed, India's track record on its Unlawful Activities Prevention Act, which has reportedly been used to arrest academics, writers, and poets for leading rallies and posting political messages on social media, highlights this danger.

It would be no surprise if India claimed that Apple's scanning program was a great start towards compliance, with a few more tweaks needed to address the 2021 Rules' wider mandate. Apple has promised to protest any expansion, and could argue in court, as WhatsApp and others have, that the 2021 Rules should be struck down, or that Apple does not fit the definition of a social media intermediary regulated under these 2021 Rules. But the Indian rules illustrate both the governmental desire and the legal backing for pre-screening encrypted content, and Apple's changes make it all the easier to slip into this dystopia.

This is, unfortunately, an ever-growing trend. Indonesia, too, has adopted Ministerial Regulation MR5 to require service providers (including instant messaging providers) to ensure that their system "does not contain any prohibited [information]; and [...] does not facilitate the dissemination of prohibited [information]." MR5 defines prohibited information as anything that violates any provision of Indonesia's laws and regulations, or creates "community anxiety" or "disturbance in public order." MR5 also imposes disproportionate sanctions, including a general blocking of systems for those who fail to ensure there is no prohibited content and information in their systems. Indonesia may also see the iMessage scanning functionality as a tool for compliance with Regulation MR5, and pressure Apple to adopt a broader and more invasive version in their country.

The pressure to expand Apple's program to more countries and more types of content will only continue. In fall of 2020, in the European Union, a series of leaked documents from the European Commission foreshadowed an anti-encryption law to the European Parliament, perhaps this year. Fortunately, there is a backstop in the EU. Under Article 15 of the e-Commerce Directive (2000/31/EC), EU Member States are not allowed to impose "a general obligation to monitor" the information that users transmit or store. Indeed, the Court of Justice of the European Union (CJEU) has stated explicitly that intermediaries may not be obliged to monitor their services in a general manner in order to detect and prevent illegal activity of their users; such an obligation would be incompatible with fairness and proportionality. Despite this, in a leaked internal document published by Politico, the European Commission committed itself to an action plan for mandatory detection of CSAM by relevant online service providers (expected in December 2021) that pointed to client-side scanning as the solution, which can potentially apply to secure private messaging apps, seizing upon the notion that it preserves the protection of end-to-end encryption.

For governmental policymakers who have been urging companies to "nerd harder," "wordsmithing harder" is just as good. The end result of access to unencrypted communication is the goal, and if that can be achieved in a way that arguably leaves a more narrowly defined end-to-end encryption in place, all the better for them.

All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, the adoption of the iPhoto hash matching to iMessage, or a tweak of the configuration flags to scan, not just children's, but anyone's accounts. Apple has a fully built system just waiting for external pressure to make the necessary changes. China and doubtless other countries already have hashes and content classifiers to identify messages impermissible under their laws, even if they are protected by international human rights law. The abuse cases are easy to imagine: governments that outlaw homosexuality might require a classifier to be trained to restrict apparent LGBTQ+ content, or an authoritarian regime might demand a classifier able to spot popular satirical images or protest flyers.

Now that Apple has built it, they will come. With good intentions, Apple has paved the road to mandated security weakness around the world, enabling and reinforcing the arguments that, should the intentions be good enough, scanning through your personal life and private communications is acceptable. We urge Apple to reconsider and return to the mantra Apple so memorably emblazoned on a billboard at 2019's CES conference in Las Vegas: "What happens on your iPhone, stays on your iPhone."


A Federal Cop Devised a Bogus Sex Trafficking Ring and Jailed This Teen for 2 Years. The Cop Can’t Be Sued. – Reason

For years, St. Paul police officer Heather Weyker was swamped. She gathered evidence, cultivated witnesses, filled out the police reports, testified under oath, all in connection with an interstate sex trafficking ring run by Somali refugees. But perhaps most impressive is that she did all that while fabricating the same ring she was investigating, which resulted in 30 indictments, 9 trials, and 0 convictions.

Hamdi Mohamud, then a 16-year-old refugee from Somalia, found herself caught up in that scheme in 2011, when one of Weyker's witnesses, Muna Abdulkadir, tried to attack her and her friends at knifepoint. Mohamud called the police, and Weyker intervened, on behalf of Abdulkadir. She arrested Mohamud and her friends for allegedly tampering with a federal witness, and Mohamud subsequently spent two years in jail before the trumped-up charges were dismissed.

While Mohamud lost those two years of her life, Weyker has not paid any price, not in spite of her position, but because of it. Since the officer conducted her investigation as part of a federal task force, she is entitled to absolute immunity and cannot be sued, the U.S. Court of Appeals for the 8th Circuit ruled last year.

It's not because the "sex trafficking" investigation, which consisted of Weyker conjuring fake information, editing police reports, fabricating evidence, and lying under oath, among other things, was legitimate. On the contrary, the court says it was "plagued with problems from the start" and notes that Weyker employed "lies and manipulation" to put people behind bars. Legally speaking, none of that matters.

What does matter is a line of Supreme Court jurisprudence that has made suing a rights-violating federal officer almost out of the question. Had Weyker acted in her capacity as a state or local cop, Mohamud would have been permitted to bring her claim before a jury of her peers. Yet the most powerful officers are held to the lowest standard of accountability.

Mohamud hopes to change that standard by asking the Supreme Court to hear her case, which she made official last week.

The problem here isn't qualified immunity, the doctrine that shields police officers and other state actors from federal civil suits unless the way the government violated your rights has been litigated almost exactly in a prior court precedent. That's an onerous standard to meet. It has, for example, protected two police officers who allegedly stole $225,000 while executing a search warrant, because no prior court ruling had said stealing in those circumstances is unconstitutional. The legal principle has been at the center of criminal justice reform efforts over the last year.

But Mohamud cleared that hurdle. The United States District Court for the District of Minnesota ruled that Weyker's actions so clearly made a mockery of the Constitution that she could not skirt the suit. The 8th Circuit then overturned that decision on appeal, citing Weyker's temporary federal badge, while in the same breath acknowledging the depravity of her actions.

"Qualified immunity makes it very, very difficult to sue government officials," says Patrick Jaicomo, an attorney at the Institute for Justice, the libertarian public interest law firm representing Mohamud. "This makes it impossible."

There's a Supreme Court decision that should, in theory, give Mohamud the avenue to redress she needs. In Bivens v. Six Unknown Named Agents of the Federal Bureau of Narcotics (1971), the high court allowed a victim to go before a jury after federal cops conducted a drug raid on his apartment without a warrant and later strip-searched him at the courthouse.

But since then the Court has undermined its own decision in almost comical ways. In 2017, the justices ruled in Ziglar v. Abbasi that lower courts should pinpoint "special factors counseling hesitation" when considering suits against federal cops. In practice, that has meant just about whatever a judge can cook up.

Yet even Abbasi notes that Bivens should be applied robustly for Fourth Amendment claims, and Mohamud's suit rests on the Fourth Amendment. That has been lost on the 8th Circuit.

"Bivens is actually a great decision," says Anya Bidwell, another attorney for Mohamud. "It does provide a cause of action for a violation of Fourth Amendment rights. We want Bivens to be interpreted robustly and allow individuals to seek damages for violations of constitutional rights."

Whether or not the Supreme Court will clarify its oscillating guidance remains to be seen. But last year the justices may have given a hint about where they're leaning when they unanimously ruled that a group of Muslim men should have the right to sue a group of federal cops who violated their religious freedom rights. Jaicomo distills Justice Clarence Thomas' opinion in that case down to its core: "He [essentially] says the availability of damages against federal officers is as old as the Republic itself."

A decade after wrongly losing the end of her teenage years in jail, Mohamud has not yet been able to make use of that lever against the perpetrator, who is still employed by the St. Paul Police Department. "It simply makes no sense that the Fourth Amendment applies with less rigor for someone who happens to work for the federal government," says Bidwell. "This is unsustainable. It just makes no sense."


Puerto Rico Gov Sued in Federal Court Over Vaccine Mandates – The Weekly Journal

Five career employees of the government of Puerto Rico sued Gov. Pedro Pierluisi at the federal court in San Juan for violating their constitutional rights by demanding they get vaccinated against COVID-19.

"The government of Puerto Rico is being arbitrary and capricious by coercing and tricking its public employees into getting vaccinated without regard to their fundamental right to personally refuse the vaccine," reads the lawsuit, presented by Jos Dvila Acevedo, the lawyer for the plaintiffs.

Zulay Rodríguez Vélez, Yohama González, Leila Liborio Carrasquillo, and Julissa Piñero allege violations of the Fourth Amendment of the U.S. Constitution.

Moreover, they request a declaratory judgment and a preliminary injunction, arguing that their legal action is neither capricious nor arbitrary.

In the lawsuit, the plaintiffs contend that COVID-19 statistics suggest the local government is "exaggerating the severity of the pandemic." Furthermore, they state that in Puerto Rico the pandemic has not hindered health operations and that there are fewer cases than in other U.S. jurisdictions.

The government has not responded to THE WEEKLY JOURNAL's request for comment.
