Technology

307 readers
163 users here now

Share interesting Technology news and links.

Rules:

  1. No paywalled sites at all.
  2. News articles must be recent: no older than 2 weeks (14 days).
  3. No videos.
  4. Post only direct links.

To encourage more original sources and keep this space as commercial-free as possible, the following websites are blacklisted:

More sites will be added to the blacklist as needed.

Encouraged:

founded 2 months ago

They never knew they were being filmed — on subway trains, in mall fitting rooms, on university campuses, at home.

Since late June, a Chinese-language Telegram group chat named “MaskPark Treehole Forum,” reportedly with over 103,000 members, has sparked outrage on Chinese social media for circulating obscene covert footage.

Intimate recordings of women, and of people having sex, were captured covertly using hidden cameras disguised as screws, power sockets, and even bottles of toilet cleaner. Those sharing the footage could be a colleague, a classmate, or even a family member.

The revelations have drawn broad coverage from domestic news outlets.

State-backed outlet Guangming Daily called the case “exceptionally egregious” and urged swift regulatory action in a commentary, saying: “Regulators must move faster to fill the gaps, and law enforcement mechanisms need to be strengthened. Only by doing so can we enhance the overall sense of security, free women from the fear of being watched, and make the boundaries of privacy truly inviolable.”


Source.

Long Response

I would like to thank all those who signed the petition. It is right that the regulatory regime for in-scope online services takes a proportionate approach, balancing the protection of users from online harm with the ability of low-risk services to operate effectively and provide benefits to users.

The Government has no plans to repeal the Online Safety Act, and is working closely with Ofcom to implement the Act as quickly and effectively as possible to enable UK users to benefit from its protections.

Proportionality is a core principle of the Act and is built into its duties. As regulator for the online safety regime, Ofcom must consider the size and risk level of different types and kinds of services when recommending steps providers can take to comply with requirements. Duties in the Communications Act 2003 require Ofcom to act proportionately and to target action only where it is needed.

Some duties apply to all user-to-user and search services in scope of the Act. This includes risk assessments, including determining if children are likely to access the service and, if so, assessing the risks of harm to children. While many services carry low risks of harm, the risk assessment duties are key to ensuring that risky services of all sizes do not slip through the net of regulation. For example, the Government is very concerned about small platforms that host harmful content, such as forums dedicated to encouraging suicide or self-harm. Exempting small services from the Act would mean that services like these forums would not be subject to the Act’s enforcement powers. Even forums that might seem harmless carry potential risks, such as where adults come into contact with child users.

Once providers have carried out their risk assessment duties, they must protect the users of their service from the identified risks of harm. Ofcom's illegal content Codes of Practice set out recommended measures to help providers comply with these obligations, tailored to both size and risk. If a provider's risk assessment accurately determines that the risks faced by users are low across all harms, Ofcom's Codes specify that it needs only some basic measures, including:

  • easy-to-find, understandable terms and conditions;
  • a complaints tool that allows users to report illegal material when they see it, backed up by a process to deal with those complaints;
  • the ability to review content and take it down if it is illegal (or breaches their terms of service);
  • a specific individual responsible for compliance, who Ofcom can contact if needed.

Where a children's access assessment indicates a platform is likely to be accessed by children, a subsequent risk assessment must be conducted to identify measures for mitigating risks. Like the Codes of Practice on illegal content, Ofcom's recently issued child safety Codes also tailor recommendations based on risk level. For example, highly effective age assurance is recommended for services likely to be accessed by children that do not already prohibit and remove harmful content such as pornography and suicide promotion. Providers of services likely to be accessed by UK children were required to complete their assessment by 24 July; Ofcom may request it.

On 8 July, Ofcom’s CEO wrote to the Secretary of State for Science, Innovation and Technology noting Ofcom’s responsibility for regulating a wide range of highly diverse services, including those run by businesses, but also charities, community and voluntary groups, individuals, and many services that have not been regulated before.

The letter notes that the Act’s aim is not to penalise small, low-risk services trying to comply in good faith. Ofcom – and the Government – recognise that many small services are dynamic small businesses supporting innovation and offer significant value to their communities. Ofcom will take a sensible approach to enforcement with smaller services that present low risk to UK users, only taking action where it is proportionate and appropriate, and will focus on cases where the risk and impact of harm is highest.

Ofcom has developed an extensive programme of work designed to support a smoother journey to compliance, particularly for smaller firms. This has been underpinned by interviews, workshops and research with a diverse range of online services to ensure the tools meet the needs of different types of services. Ofcom’s letter notes its ‘guide for services’ guidance and tools hub, and its participation in events run by other organisations and networks including those for people running small services, as well as its commitment to review and improve materials and tools to help support services to create a safer life online.

The Government will continue to work with Ofcom towards the full implementation of the Online Safety Act 2023, including monitoring proportionate implementation.

Department for Science, Innovation and Technology


The EU is currently developing a white-label app to perform privacy-preserving (at least in theory) age verification, to be adopted and customized by member states in the coming months. The app is open source and available here: https://github.com/eu-digital-identity-wallet/av-app-android-wallet-ui.

The problem is that the app plans to include a remote attestation feature to verify the app's integrity: https://github.com/eu-digital-identity-wallet/av-app-android-wallet-ui?tab=readme-ov-file#disclaimer. This is supposed to assure the age verification service that the app being used is authentic and running on a genuine operating system. On Android, "genuine" means:

  • the operating system was licensed by Google;
  • the app was downloaded from the Play Store (thus requiring a Google account);
  • device security checks have passed.

While there is value in verifying device security, this strongly ties the app to Google properties and services: those checks won't pass on an aftermarket Android OS, even one that significantly increases security, such as GrapheneOS, because the app plans to use Google "Play Integrity", which only accepts Google-licensed systems, instead of the standard Android attestation feature to verify devices.
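
For illustration, here is roughly what such a check looks like on the client. This is a minimal sketch using the real Play Integrity API (IntegrityManagerFactory, IntegrityTokenRequest from Google's play-integrity library); the serverNonce value and the sendTokenToAgeVerificationService helper are hypothetical stand-ins, since the AV app's actual wiring isn't documented in the repository:

    import android.content.Context
    import com.google.android.play.core.integrity.IntegrityManagerFactory
    import com.google.android.play.core.integrity.IntegrityTokenRequest

    // Minimal sketch of a Play Integrity check of the kind the AV app plans
    // to rely on. `serverNonce` is a hypothetical value the age verification
    // service would issue to bind the verdict to one session.
    fun requestIntegrityVerdict(context: Context, serverNonce: String) {
        val integrityManager = IntegrityManagerFactory.create(context)

        val request = IntegrityTokenRequest.builder()
            .setNonce(serverNonce)
            .build()

        integrityManager.requestIntegrityToken(request)
            .addOnSuccessListener { response ->
                // The token is opaque and can only be decrypted via Google's
                // servers (or with keys Google issues) -- one of the hard
                // Google dependencies described above.
                sendTokenToAgeVerificationService(response.token())
            }
            .addOnFailureListener {
                // On a non-Google-licensed OS such as GrapheneOS, or with a
                // sideloaded build, the call fails or yields a failing verdict.
            }
    }

    // Hypothetical helper: forwards the token for server-side verification.
    fun sendTokenToAgeVerificationService(token: String) { /* ... */ }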

This also means that even if you compile the app yourself, you won't be able to use it: your build won't come from the Play Store, so the age verification service will reject it.
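
Concretely, the decrypted Play Integrity verdict includes an appRecognitionVerdict, and a self-compiled build that never went through the Play Store can at best receive UNRECOGNIZED_VERSION, never PLAY_RECOGNIZED. Here is a sketch of how the verifying service would presumably gate on this; the field names and verdict values come from Google's documented verdict format, but the accept/reject policy itself is an assumption about how the AV service would use them:

    // Sketch of a server-side policy check against the decrypted verdict.
    data class AppIntegrity(val appRecognitionVerdict: String)
    data class DeviceIntegrity(val deviceRecognitionVerdict: List<String>)
    data class Verdict(
        val appIntegrity: AppIntegrity,
        val deviceIntegrity: DeviceIntegrity,
    )

    fun acceptClient(verdict: Verdict): Boolean {
        // A self-compiled APK is at best UNRECOGNIZED_VERSION,
        // never PLAY_RECOGNIZED.
        val appOk =
            verdict.appIntegrity.appRecognitionVerdict == "PLAY_RECOGNIZED"
        // Aftermarket systems like GrapheneOS do not receive
        // MEETS_DEVICE_INTEGRITY, since only Google-licensed OS builds qualify.
        val deviceOk = "MEETS_DEVICE_INTEGRITY" in
            verdict.deviceIntegrity.deviceRecognitionVerdict
        return appOk && deviceOk
    }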

The issue has been raised here: https://github.com/eu-digital-identity-wallet/av-app-android-wallet-ui/issues/10, but there has been no response from team members as of now.


Move marks latest attempt by Trump Administration to collect unrelated, protected data to fuel mass deportation machine


With advances in facial recognition, surveillance culture is on the rise and remaining anonymous in public is becoming difficult. In this in-depth guide, we explore how to avoid facial recognition surveillance and ways to protect yourself.


Using 3D holograms polished by artificial intelligence, researchers introduce a lean, eyeglass-like 3D headset that they say is a significant step toward passing the “Visual Turing Test.”
