Underground Academia Cultural Seismography

AI Transparency

Last updated: 20 April 2026

About this page

Underground Academia is a cultural-seismography research platform. This page explains how automated tools are used in our editorial workflow, in keeping with the transparency principle of Article 50 of the EU AI Act.

Where automated tools are used

Verification and research

Every date, figure, and attribution in our longform and investigation articles is cross-checked against independent public sources. Automated searches and source-matching tools assist this verification — they surface candidate sources and flag inconsistencies, but they do not replace human judgement. The editor decides what is trustworthy and what is not.

Editorial style review

Drafts pass through automated style checks before publication. These checks look for:

  • Formatting and citation consistency
  • Known AI-fingerprint patterns (overused phrases, unnatural symmetry, template markers)
  • Internal contradictions and unsupported claims
  • Compliance with editorial house style

Style checks run alongside human editorial review, not in place of it.
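As an illustration only, an automated style check of the kind described above can be sketched in a few lines of Python. The phrase list and the symmetry heuristic here are hypothetical examples, not Underground Academia's actual configuration:

```python
import re

# Hypothetical examples of "AI-fingerprint" phrases; the real list is not published.
FLAGGED_PHRASES = ["delve into", "it is important to note", "in today's fast-paced world"]

def style_check(draft: str) -> list[str]:
    """Return human-readable warnings; a human editor decides what to act on."""
    warnings = []
    lowered = draft.lower()
    for phrase in FLAGGED_PHRASES:
        count = lowered.count(phrase)
        if count:
            warnings.append(f"fingerprint phrase '{phrase}' appears {count}x")
    # Very rough "unnatural symmetry" heuristic: several consecutive
    # sentences of near-identical word count.
    sentences = [s for s in re.split(r"[.!?]+\s+", draft) if s]
    lengths = [len(s.split()) for s in sentences]
    if len(lengths) >= 4 and max(lengths) - min(lengths) <= 2:
        warnings.append("sentences are unusually uniform in length")
    return warnings
```

The key design point is that the function only emits warnings; it never rewrites or blocks text, mirroring the policy that automated checks run alongside human review rather than in place of it.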

Hostile review

Before each longform article is published, it is reviewed by adversarial agents instructed to find weaknesses: unsupported claims, missing attribution, compliance gaps, weak arguments. When these agents surface issues, a human editor decides how to address them.

Where automated tools are not used

  • Authorship. Our longform and investigation articles are written by human authors. Automated tools do not generate the final text.
  • Opinion and framing. The argument, the judgements, and the perspective are the editor’s and author’s responsibility.
  • User profiling. The site does not use AI — or any technology — to build profiles of visitors, personalise content, or target individual readers.
  • Decision-making about readers. We do not make any automated decisions about visitors (no credit-scoring, no moderation classifiers for readers, no behavioural inference).

Human responsibility

Final responsibility for every published article — for the text, the facts, the opinions, and the compliance posture — rests with the Underground Academia editorial team, which is part of FolkUp. Automated tools are a productivity and quality-control aid; they do not dilute editorial accountability.

Known limitations

Automated tools can introduce errors. We mitigate this with:

  • Multi-stage verification (fact-check → legal review → hostile review → final editorial pass)
  • Transparent sourcing — every material claim links to a public source where possible
  • Regular review of our own workflow — when a failure mode is found, the process is updated

If you notice an error in our content, please let us know at [email protected] and we will investigate, correct, and publish the correction.

Your rights

Under the EU AI Act and related regulations, you have the right to:

  • Know when you are interacting with an AI-driven system (we do not run one on this site — it is a static publication)
  • Understand how automated decisions that affect you are made (not applicable here — no such decisions are made)
  • Contact a human for clarification or correction at any time

Contact

Questions about our editorial workflow or AI usage: [email protected].

