Automating Wiki Page Creation from a PostgreSQL Table

Creating wiki pages by hand for every row of a database table is tedious and error-prone. A neat solution is to generate them automatically: a single SQL query turns the rows of a PostgreSQL table into a MediaWiki import file, which the wiki then converts into organized pages. This guide, aimed at database administrators and developers, walks through the whole process, bridging the gap between raw data and accessible wiki pages.

Initial Data

  • In my case, the table is called public.cities.
  • The only field of interest is friendly_name.
  • The MediaWiki template is called city_t and takes the city name through a single parameter named city_name. (A sketch of the assumed table structure follows this list.)
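
For reference, here is a minimal sketch of the table structure this guide assumes. Only friendly_name and the active flag (used later to filter the export) appear in the original setup; everything else, including the id column, is hypothetical:

CREATE TABLE public.cities (
    id            serial PRIMARY KEY,           -- hypothetical surrogate key
    friendly_name text NOT NULL,                -- the city name shown on the wiki
    active        boolean NOT NULL DEFAULT true -- only active rows are exported
);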

Requirements

  • The page name should not be just the city name; it should follow the pattern cityname_(city).
  • The parameter passed to the template, on the other hand, should be just the city name.

Execution

Here is the query executed on PostgreSQL:

SELECT xmlelement(name mediawiki,   -- root element expected by the importer
  xmlagg(                           -- aggregate one <page> element per row
    xmlelement(name page,
      xmlelement(name title, friendly_name || '_(city)'),
      xmlelement(name revision,
        xmlelement(name text, '{{city_t|city_name=' || friendly_name || '}}')
      )
    )
  )
)::text                             -- cast to text for easy copy-paste from the client
FROM public.cities
WHERE active;                       -- export only the active cities
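
For a row whose friendly_name is, say, Springfield (a made-up example), the query produces one <page> element like the following inside the single <mediawiki> root. The real output is a single unindented line; it is pretty-printed here for readability:

<mediawiki>
  <page>
    <title>Springfield_(city)</title>
    <revision>
      <text>{{city_t|city_name=Springfield}}</text>
    </revision>
  </page>
  <!-- ...one <page> element per active row... -->
</mediawiki>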

Note: the XML is cast to text because, with the client I was using (DBeaver), that made it easier to copy and paste the result into a text file.
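
If you prefer the command line to a GUI client, psql can write the result straight to a file. A sketch, assuming the query above is saved in export_pages.sql and the database is called mydb (both names are placeholders):

psql -d mydb -AtX -f export_pages.sql -o cities.xml

The -A and -t flags switch off column alignment and row-count footers, so the file contains nothing but the XML document; -X skips any local psqlrc.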

Having saved the text file with an .xml extension, all that was left was to go to the wiki's Special:SpecialPages page, find "Import pages" under "Page tools" (Special:Import), and upload the XML from there.
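
Special:Import goes through a normal file upload, so a very large XML file can run into the wiki's upload size limits. If you have shell access to the wiki server, MediaWiki's importDump.php maintenance script is the usual alternative (a sketch; run it from the wiki's installation directory):

php maintenance/importDump.php cities.xml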

Done, hundreds of pages created quickly!