this post was submitted on 04 Dec 2025

Key Findings

Chinese LLMs censor politically sensitive images, not just text.

  • While prior research has extensively mapped textual censorship, this report identifies a critical gap: the censorship of politically sensitive images by Chinese LLMs remains largely unexamined.
  • To address this, ASPI developed a testing methodology, using a dataset of 200 images likely to trigger censorship, to interrogate how LLMs censor sensitive imagery. The results revealed that visual censorship mechanisms are embedded across multiple layers within the LLM ecosystem.
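A methodology like the one described above reduces, at its core, to sending each test image to a model and deciding whether the response is a refusal. The sketch below is purely illustrative and assumes nothing about ASPI's actual harness: the refusal markers, function names and scoring logic are all invented for the example.

```python
# Hypothetical sketch of an image-censorship probe: a response is counted as
# censored if it matches a known refusal pattern or comes back empty (some
# models silently drop sensitive requests). Markers here are assumptions.
from dataclasses import dataclass

REFUSAL_MARKERS = ("cannot discuss", "unable to answer", "against policy")

@dataclass
class ProbeResult:
    image_id: str
    response: str
    censored: bool

def classify_response(image_id: str, response: str) -> ProbeResult:
    """Label one model response as censored or not."""
    text = response.strip().lower()
    censored = (not text) or any(m in text for m in REFUSAL_MARKERS)
    return ProbeResult(image_id, response, censored)

def censorship_rate(results: list[ProbeResult]) -> float:
    """Fraction of probed images that triggered a refusal."""
    return sum(r.censored for r in results) / len(results)
```

Run against a dataset of, say, 200 images, `censorship_rate` gives a single comparable number per model, which is one plausible way to measure how censorship varies across the "multiple layers" the report describes.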

The Chinese Government is deploying AI throughout the criminal‑justice pipeline—from AI‑enabled policing and mass surveillance, to smart courts, to smart prisons.

  • This emerging AI pipeline reduces transparency and accountability, enhances the efficiency of police, prosecutors and prisons, and further enables state repression.
  • Beijing is pushing courts to adopt AI not just in drafting basic paperwork, but even in recommending judgements and sentences, which could deepen structural discrimination and weaken defence counsels’ ability to appeal.
  • The Chinese surveillance technology company iFlyTek stands out as a major provider of LLM‑based systems used in this pipeline.

China is using minority‑language LLMs to deepen surveillance and control of ethnic minorities, both in China and abroad.

  • The Chinese Government is developing, and in some cases already testing, AI‑enabled public‑sentiment analysis in ethnic minority languages—especially Uyghur, Tibetan, Mongolian and Korean—for the explicitly stated purpose of enhancing the state’s capacity to monitor and control communications in those languages across text, video and audio.
  • DeepSeek and most other commercial LLMs have insufficient capacity to do this effectively, as there’s little market incentive to create sophisticated, expensive models for such small language groups. The Chinese state is stepping in to provide resources and backing for the development of minority‑language models for that explicit purpose.
  • China is also seeking to deploy this technology to target those groups in foreign countries along the Belt and Road.

AI now performs much of the work of online censorship in China.

  • AI‑powered censorship systems scan vast volumes of digital content, flag potential violations, and delete banned material within seconds.
  • Yet the system still depends on human content reviewers to supply the cultural and political judgement that algorithms lack, according to ASPI’s review of more than 100 job postings for online‑content censors in China. Future technological advances are likely to minimise that remaining dependence on human reviewers.
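The division of labour described above, in which machines delete clear-cut violations within seconds while ambiguous content is routed to human reviewers, can be sketched as a simple triage function. Everything here (the thresholds, the blocklist rule, the three-way outcome) is an invented illustration, not a description of any real platform's system.

```python
# Toy scan -> flag -> delete moderation loop with a human-review queue for
# borderline cases. Thresholds and rules are illustrative assumptions.
def moderate(post: str, blocklist: set[str], score: float,
             delete_threshold: float = 0.9,
             review_threshold: float = 0.5) -> str:
    """Return 'delete', 'review' or 'allow' for one post.

    `score` stands in for a classifier's estimated probability that the
    post violates policy; exact blocklist hits are removed immediately.
    """
    words = set(post.lower().split())
    if words & blocklist:
        return "delete"            # hard rule: banned term present
    if score >= delete_threshold:
        return "delete"            # model is confident: remove automatically
    if score >= review_threshold:
        return "review"            # ambiguous: queue for a human reviewer
    return "allow"
```

The "review" branch is where the human judgement the report mentions still lives; raising model accuracy shrinks that middle band, which is the dynamic behind the prediction that dependence on human reviewers will diminish.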

China’s censorship regulations have created a robust domestic market for AI‑enabled censorship tools.

  • China’s biggest tech companies, including Tencent, Baidu and ByteDance, have developed advanced AI censorship platforms that they’re selling to smaller companies and organisations around China.
  • In this way, China’s laws mandating internal censorship have created market incentives for China’s top tech companies to make censorship cheaper, faster, easier and more efficient, embedding compliance into China’s digital economy.

The use of AI amplifies China’s state‑supported erosion of the economic rights of some vulnerable groups abroad, to the financial benefit of Chinese private and state‑owned companies.

  • ASPI research shows that Chinese fishing fleets have begun adopting AI‑powered intelligent fishing platforms, developed by Chinese companies and research institutes, that further tip the technological scales towards Chinese vessels and away from local fishers and artisanal fishing communities.
  • ASPI has identified several individual Chinese fishing vessels using those platforms that operate in the exclusive economic zones of countries where Chinese fishing is widely implicated in illegal incidents, including Mauritania and Vanuatu; ASPI also found one vessel that has itself been specifically implicated in an incident.
[–] Sims@lemmy.ml -1 points 1 month ago

"(AI) is transforming China’s state control system into a precision instrument for managing its population and targeting groups at home and abroad"

Lol, a bunch of dimwitted propaganda. "Manufacturing consent". Ordinary people in the US/West should be getting tired of this eternal warmongering by now. I mean, come on..