Babalugats

joined 8 months ago
[–] Babalugats@feddit.uk 1 points 13 hours ago

No. Sorry. I'm very stoned right now and have been respelling words for ages.

I don't think the Palestinians should move. I hope that they don't. I don't want them to.

But Israel could've tried that option before committing genocide.

[–] Babalugats@feddit.uk 7 points 15 hours ago (2 children)

Israel uses money, yet again with corruption implied somewhere (I didn't read the article).

My point is, did it ever occur to them that they could've paid the 80,000+ people that they murdered to peacefully and happily move? Probably for cheaper than the genocide has cost them.

Now I doubt, and I hope, that they couldn't have paid most of them enough. They definitely wouldn't have offered enough anyway.

 

Cross posted from: https://feddit.uk/post/40600495

After a years-long battle, the Council of the EU, representing EU member states, has at last agreed on a position on the European Commission’s “Chat Control” plan, which would mandate mass scanning and other encryption-breaking measures. The good news is that the most controversial part, the forced requirement to scan encrypted messages, is out. The bad news is there’s more to it than that.

Chat Control has gone through several iterations since it was first introduced, with the EU Parliament backing a position that protects fundamental rights, while the Council of the EU spent many months pursuing an intrusive law-enforcement-focused approach. Many proposals earlier this year required the scanning and detection of illicit content on all services, including private messaging apps such as WhatsApp and Signal. This requirement would fundamentally break end-to-end encryption.

Thanks to the tireless efforts of digital rights groups, including European Digital Rights (EDRi), we won a significant improvement: the Council agreed on its position, which removed the requirement that forces providers to scan messages on their services. It also comes with strong language to protect encryption, which is good news for users.

Continue reading here - https://www.eff.org/deeplinks/2025/12/after-years-controversy-eus-chat-control-nears-its-final-hurdle-what-know

[–] Babalugats@feddit.uk 2 points 1 week ago* (last edited 1 week ago) (1 children)

I agree. A proper counter movement is needed.

Big American corporations are heavily lobbying the EU Council and national governments. Transparency is not working; the EU Council is rolling back the GDPR, massively eroding our privacy, which is irreversible.

With the likes of Trump in charge, the US is not trustworthy with any data. The data that they already take illegally is too much.

The UDHR article 12 is supposed to protect our privacy.

We need a counter movement big enough to scare the politicians when they start bending to the Big-Tech. They are not in the least bit worried as things stand now.

Peter Hummelgaard (among others), for all his arrogance, does not seem even a little concerned about his position.

[–] Babalugats@feddit.uk 40 points 1 week ago (2 children)

Spain, Ireland, Slovenia and the Netherlands (so far) to hold the all-new "euramazing competition"

Showcasing undiscovered talent and theatrics from European countries, without exception to prevent any bullshit further down the line.

Not European? Fuck off.

 

Cross posted from: https://feddit.uk/post/40232992


Digital Omnibus: How Big Tech Lobbying Is Gutting the GDPR

Last week we at EFRI wrote about the Digital Omnibus leak and warned that the European Commission was preparing a stealth attack on the GDPR.

Since then, two things have happened:

The Commission has now officially published its Digital Omnibus proposal.

noyb (Max Schrems’ organisation) has released a detailed legal analysis and new campaigning material that confirms our worst fears: this is not harmless “simplification”, it is a deregulation package that cuts into the core of the GDPR and ePrivacy.

What noyb has now put on the table

On 19 November 2025, noyb published a new piece with the blunt headline: “Digital Omnibus: EU Commission wants to wreck core GDPR principles”.

Here’s a focused summary of the four core points from noyb’s announcement, in plain language:

New GDPR loophole via “pseudonyms” and IDs

The Commission wants to narrow the definition of “personal data” so that much data under pseudonyms or random IDs (ad-tech, data brokers, etc.) might no longer fall under the GDPR.

This would mean a shift from an objective test (“can a person be identified, directly or indirectly?”) to a subjective test (“does this company currently want or claim to be able to identify someone?”).

Therefore, whether the GDPR applies would depend on what a company says about its own capabilities and intentions.

Different companies handling the same dataset could fall inside or outside the GDPR.

For users and authorities, it becomes almost impossible to know ex ante whether the GDPR applies – endless arguments over a company’s “true intentions”.

Schrems’ analogy: it’s like a gun law that only applies if the gun owner admits he can handle the gun and intends to shoot – obviously absurd as a regulatory concept.

Weakening ePrivacy protection for data on your device

Today, Article 5(3) ePrivacy protects against remote access to data on your devices (PCs, smartphones, etc.) – based on the Charter right to the confidentiality of communications.

The Commission now wants to add broad “white-listed” exceptions for access to terminal equipment, including “aggregated statistics” and “security purposes”.

Max Schrems finds the wording of the new rule extremely permissive: it could effectively allow extensive remote scanning or “searches” of user devices, as long as they are framed as minimal “security” or “statistics” operations, undermining the current strong protection against device-level snooping.

Opening the door for AI training on EU personal data (Meta, Google, etc.)

Despite clear public resistance (only a tiny minority wants Meta to use their data for AI), the Commission wants to allow Big Tech to train AI on highly personal data, e.g. 15+ years of social-media history.

Schrems’ core argument:

People were told their data is for “connecting” or advertising – now it is fed into opaque AI models, enabling those systems to infer intimate details and manipulate users.

The main beneficiaries are US Big Tech firms building base models from Europeans’ personal data.

The Commission relies on an opt-out approach, but in practice:

Companies often don’t know which specific users’ data are in a training dataset.

Users don’t know which companies are training on their data.

Realistically, people would need to send thousands of opt-outs per year – impossible.

Schrems calls this opt-out a “fig leaf” to cover fundamentally unlawful processing.

On top of training, the proposal would also privilege the “operation” of AI systems as a legal basis, effectively a wildcard: processing that would be illegal under normal GDPR rules becomes legal if it’s done “for AI”. The result is an inversion of normal logic: riskier technology (AI) gets lower, not higher, legal standards.

Cutting user rights back to almost zero – driven by German demands

The starting point for this attack on user rights is a debate in Germany about people using GDPR access rights in employment disputes, for example to prove unpaid overtime. The German government chose to label such use as “abuse” and pushed in Brussels for sharp limits on these rights. The Commission has now taken over this line of argument and proposes to restrict the GDPR access right to situations where it is exercised for “data protection purposes” only.

In practice, this would mean that employees could be refused access to their own working-time records in labour disputes. Journalists and researchers could be blocked from using access rights to obtain internal documents and data that are crucial for investigative work. Consumers who want to challenge and correct wrong credit scores in order to obtain better loan conditions could be told that their request is “not a data-protection purpose” and therefore can be rejected.

This approach directly contradicts both CJEU case law and Article 8(2) of the Charter of Fundamental Rights. The Court has repeatedly confirmed that data-subject rights may be exercised for any purpose, including litigation and gathering evidence against a company. As Max Schrems points out, there is no evidence of widespread abuse of GDPR rights by citizens; what we actually see in practice is widespread non-compliance by companies. Cutting back user rights in this situation shifts the balance even further in favour of controllers and demonstrates how detached the Commission has become from the day-to-day reality of users trying to defend themselves.

EFRI’s take: when Big Tech lobbying becomes lawmaking

For EFRI, the message is clear: the Commission has decided that instead of forcing Big Tech and financial intermediaries to finally comply with the GDPR, it is easier to move the goalposts and rewrite the rules in their favour. The result is a quiet but very real redistribution of power – away from citizens, victims, workers and journalists, and towards those who already control the data and the infrastructure. If this package goes through in anything like its current form, it will confirm that well-organised corporate lobbying can systematically erode even the EU’s flagship fundamental-rights legislation. That makes it all the more important for consumer organisations, victim groups and digital-rights advocates to push back – loudly, publicly and with concrete case stories – before the interests of Big Tech are permanently written into EU law.

[–] Babalugats@feddit.uk 1 points 2 weeks ago (3 children)
[–] Babalugats@feddit.uk 1 points 2 weeks ago* (last edited 2 weeks ago) (5 children)

At a guess, I'd imagine big tech companies are lobbying because most of the information that they use comes from data gathering. Using data directly from texts etc. leaves them open to court cases.

https://www.theguardian.com/commentisfree/2025/nov/12/eu-gdpr-data-law-us-tech-giants-digital

Once our private correspondence and devices are being scanned, the money-making opportunities open to politicians for pushing x, y and z through are limitless.

For example, in years to come insurance companies could refuse to pay out on all sorts of claims using that data. Your doctor may have recommended you walk a mile a day and change your diet. You don't do it, or just miss a day, and your life insurance policy is voided. Car crash not your fault? No payout, because you missed something else, etc.

I couldn't begin to guess the number of ways that this information could be used, but it's a complete u-turn from what the EU was saying only a few years ago.

https://gdpr.eu/what-is-gdpr/

They still recommend using Signal, but only internally.

Which in itself is bizarre.

And exempting themselves from being scanned just shows what they really think.

[–] Babalugats@feddit.uk 0 points 2 weeks ago (7 children)

The timeline is here

Currently Denmark is pushing it; they hold the EU presidency at the minute. Their minister for justice, Peter Hummelgaard, is responsible for the big push and the wording, specifically trying to pull the wool over the general public's eyes. Ireland is next (they take over in January), and the minister for justice in Ireland (Jim O'Callaghan) is also in favour of it.

U.N. right to privacy

Office of the High Commissioner for Human Rights - Right to privacy in the digital age

U.N. - Universal Declaration of Human Rights

 

Cross posted from: https://feddit.uk/post/40205739

I'm posting this to hopefully stop the posts that keep appearing, suggesting that progress has been made to defeat chat control. That's not correct.

The article:

Contrary to headlines suggesting the EU has “backed away” from Chat Control, the negotiating mandate endorsed today by EU ambassadors in a close split vote paves the way for a permanent infrastructure of mass surveillance. Patrick Breyer, digital freedom fighter and expert on the file, warns journalists and the public not to be deceived by the label “voluntary.”

While the Council removed the obligation for scanning, the agreed text creates a toxic legal framework that incentivizes US tech giants to scan private communications indiscriminately, introduces mandatory age checks for all internet users, and threatens to exclude teenagers from digital life.

“The headlines are misleading: Chat Control is not dead, it is just being privatized,” warns Patrick Breyer. **“What the Council endorsed today is a Trojan Horse. By cementing ‘voluntary’ mass scanning, they are legitimizing the warrantless, error-prone mass surveillance of millions of Europeans by US corporations, while simultaneously killing online anonymity through the backdoor of age verification.”**

Continue reading here - https://www.patrick-breyer.de/en/reality-check-eu-council-chat-control-vote-is-not-a-retreat-but-a-green-light-for-indiscriminate-mass-surveillance-and-the-end-of-right-to-communicate-anonymously/

[–] Babalugats@feddit.uk 0 points 3 weeks ago (1 children)

They have already rethought it a few times. The changes made have been to encourage the politicians to vote yes. One of the biggest, if not the biggest, is to make themselves exempt from scanning. Enough questions in the right places and at the right times would not give them any choice.

How could they defend adding an exemption clause for themselves into such an invasion of privacy?

[–] Babalugats@feddit.uk 0 points 3 weeks ago (3 children)

No, I just wasn't sure if you knew how the proposal is being pushed through. There are far too many of them to simply remove to stop it. The majority will still continue their terms, and if they keep this mindset, they could still push it through.

A good way to help them rethink it is to have their devices scanned too. The only way this is going forward is if they think that they are exempt from scanning.

[–] Babalugats@feddit.uk 0 points 3 weeks ago (5 children)

But while they're still serving their elected terms, and seem adamant to push this through, the least we can all do is push for their devices to be scanned too, if it has to go through.

 

Cross posted from: https://feddit.uk/post/39979350

[TRANSLATED ARTICLE]

EU chat control comes – through the back door of voluntariness

The EU states have agreed on a common position on chat control. Data protection advocates warn against massive surveillance. What is in store for us?

After lengthy negotiations, the EU states have agreed on a common position on so-called chat control. As can be seen from the minutes of negotiations of the Council working group, internet services will in future be allowed to voluntarily search their users' communications for evidence of crimes, but will not be obliged to do so.

The Danish Council Presidency wants to get the draft law through the Council "as quickly as possible", "so that the trilogue negotiations can begin promptly", the minutes say. Feedback from states should be limited to "absolute red lines".

Consensus achieved

The majority of states supported the compromise proposal. At least 15 spoke in favour, including Germany and France. Germany “welcomed both the deletion of the mandatory measures and the permanent anchoring of voluntary measures”, according to the minutes.

However, other countries were disappointed. Spain in particular “continued to see mandatory measures as necessary; unfortunately a comprehensive agreement on this was not possible”. Hungary likewise saw “voluntariness as the sole concept” as too little.

Spain, Hungary and Bulgaria proposed “an obligation for providers to detect, at least in open areas”. The Danish Presidency “described the proposal as ambitious”, but did not take it up, to avoid further discussion.

The organization Netzpolitik.org, which has been reporting critically on chat control for years, sees the plans as a fundamental threat to democracy. "From the beginning, a lobby network intertwined with the security apparatus pushed chat control", writes the organization. “It was never really about the children, otherwise it would get to the root of abuse and violence instead of monitoring people without any initial suspicion.”

Netzpolitik.org argues that "encrypted communication is a thorn in the side of the security apparatus". Authorities have been trying to combat private and encrypted communication in various ways for years.

A number of scholars criticize the compromise proposal, calling voluntary chat control inappropriate. “Their benefits have not been proven, while the potential for harm and abuse is enormous”, says one open letter.

According to critics, the planned technology, so-called client-side scanning, would create a backdoor on all users' devices. Netzpolitik.org warns that this represents a "frontal attack on end-to-end encryption, which is vital in the digital world".

The problem with such backdoors is that “not only the supposedly ‘good guys’ can use them, but also resourceful criminals or ill-disposed foreign states”, argues the organization.
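The critics' core point about client-side scanning can be shown with a minimal sketch (everything here is invented for illustration: real deployments use perceptual hashes such as PhotoDNA rather than exact SHA-256 matches, and the XOR "cipher" merely stands in for real end-to-end encryption). The scan runs on the sender's device before encryption, so the encryption itself is never mathematically weakened, yet the plaintext is still inspected:

```python
import hashlib

# Hypothetical on-device watchlist of content fingerprints.
WATCHLIST = {hashlib.sha256(b"known illegal content").hexdigest()}

def client_side_scan(plaintext: bytes) -> bool:
    """Return True if the message fingerprint is on the watchlist.

    This runs on the user's device BEFORE encryption, which is why
    critics call it a backdoor: the plaintext is inspected even though
    the encryption itself stays intact.
    """
    return hashlib.sha256(plaintext).hexdigest() in WATCHLIST

def send_message(plaintext: bytes) -> tuple[bool, bytes]:
    """Scan on-device, then 'encrypt' (toy XOR standing in for E2EE)."""
    flagged = client_side_scan(plaintext)            # inspection happens here
    ciphertext = bytes(b ^ 0x42 for b in plaintext)  # placeholder cipher only
    return flagged, ciphertext
```

Whoever controls the watchlist controls what gets flagged, which is the "not only the good guys" objection: the same hook could match any content a future operator chooses to list.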

Signal considers withdrawing from the EU

Journalists' associations are also alarmed by the plans. The DJV rejects chat control as a form of suspicionless mass surveillance and sees source protection, for which encrypted communication is essential, as threatened. The infrastructure created in this way could be used for political control “in just a few simple steps”, the DJV said in a statement.

The messenger service Signal has already announced that it would withdraw from the EU if necessary. Signal President Meredith Whittaker told the dpa: “Unfortunately, if we were given the choice of either undermining the integrity of our encryption or leaving Europe, we would make the decision to leave the market.”

Next steps in the legislative process

The Permanent Representatives of the EU states are due to meet next week on the subject, followed in December by the ministers of justice and home affairs; these two bodies are due to approve the bill as the Council's official position.

The trilogue then begins, in which the Commission, Parliament and Council must reach a compromise from their three draft laws. Parliament had described the original plans as mass surveillance and called for only unencrypted suspect content to be scanned.

The EU Commission had originally proposed requiring internet services to search their users' content, without cause, for evidence of crimes and to report it to the authorities in cases of suspicion.

 

cross-posted from: https://feddit.uk/post/39495921

The EU Council seems ready to agree to the new compromise "without further changes"

The EU Council has received the new Chat Control proposal with broad support
CSAM scanning would now be voluntary, but with some exceptions
Lawmakers met today (November 12) for further discussion

It's official: a revised version of the CSAM scanning proposal is back on EU lawmakers' table, and it's keeping privacy experts worried.

The Law Enforcement Working Party met again this morning (November 12) in the EU Council to discuss what's been deemed by critics the Chat Control bill.

This follows a meeting the group held on November 5, and comes after the Danish Presidency put forward a new compromise that withdraws mandatory chat scanning.

As reported by Netzpolitik, the latest Child Sexual Abuse Regulation (CSAR) proposal was received with broad support during the November 5 meeting, “without any dissenting votes” and with no further changes needed.

The new text, which removes all provisions on detection obligations included in the bill and makes CSAM scanning voluntary, seems to be the winning path to finally find an agreement after over three years of trying.

Privacy experts and technologists aren't quite on board, though, with long-standing Chat Control critic and digital rights jurist Patrick Breyer deeming the proposal "a political deception of the highest order."

Chat Control: what's changing and what are the risks

As per the latest version of the text, messaging service providers won't be forced to scan all URLs, pictures, and videos shared by users, but may instead choose to perform voluntary CSAM scanning.

There's a catch, though. Article 4 will include a possible "mitigation measure" that could be applied to high-risk services to require them to take "all appropriate risk mitigation measures."

According to Breyer, such a loophole could make the removal of detection obligations "worthless" by negating their voluntary nature. He said: "Even client-side scanning (CSS) on our smartphones could soon become mandatory – the end of secure encryption."

Breaking encryption, the tech that security software like the best VPNs, Signal, and WhatsApp use to secure our private communications, has been the strongest argument against the proposal so far.

Continue Reading - https://www.techradar.com/vpn/vpn-privacy-security/this-is-a-political-deception-new-chat-control-convinces-lawmakers-but-not-privacy-experts-yet