swx

What is swx?

swx is a tool to generate a static website from a file tree written in markdown, txt2tags or any other markup language.

It doesn't require any tool other than the usual commands (cp, sed…) and the markup parser.

Inspired by sw, swx generates HTML pages and links them together in a menu. You can use simple CSS and templates that will be included in your pages.


  • Convert markdown or any other markup to HTML
  • Add a menu linking the pages
  • Ignore files listed in a blacklist
  • Generate an Atom feed: see swx_atom
  • Generate a blog page. For comments, use Hashover or Disqus.
  • Generate a site map: see swx_plan
  • Generate a sitemap for web indexers: see swx_sitemap
  • Use a Makefile to simplify site generation: see Makefile

Example: a source tree of markup files becomes a matching tree of HTML pages in the output.
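As a sketch, using the file names from the quickstart below, the input and output trees pair up like this:

     monsite/                    monsite.static/
     ├── index.md                ├── index.html
     ├── page1.md         →      ├── page1.html
     └── dossier1/               └── dossier1/
         └── index.md                └── index.html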

Download the files here and untar the archive.


Simple run:

 ./swx directory_containing_website

Then push your site the way you prefer: mercurial, git, rsync or even FTP.


To configure, edit swx.conf

You can change uppercase variables to fit your needs

  • BL: blacklist, the files that won’t appear in the menu
  • CONVERTER: the tool used to convert sources. For txt2tags: CONVERTER='txt2tags -t html -o- --no-headers'.
  • EXT="md": extension of the files to convert
  • FOOTER: what you want in the footer
  • HEADER: what you want in the header
  • TOPTEMPLATE: included before the text of each page. Note that the default config file uses it to add a link to the top of the page.
  • BOTTOMTEMPLATE: text included between your text and the footer.
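A hypothetical swx.conf sketch (assuming swx sources it as plain shell assignments; the values below are illustrative examples, not the shipped defaults):

```shell
# Illustrative swx.conf sketch (values are examples, not defaults)
BL="TODO.md drafts"                               # kept out of the menu
CONVERTER='txt2tags -t html -o- --no-headers'     # markup-to-html command
EXT="md"                                          # extension of source files
HEADER='<p>my site</p>'                           # page header content
FOOTER='<p>contact: webmaster@example.org</p>'    # page footer content
TOPTEMPLATE='top.html'                            # included before page text
BOTTOMTEMPLATE='bottom.html'                      # included after page text
```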

Modify how your site looks with style.css.


 mkdir dossier
cd dossier
wget https://framagit.org/3hg/swx/repository/master/archive.tar.gz
tar xvzf archive.tar.gz
cd swx*
mkdir monsite
vim monsite/index.md
vim monsite/page1.md
mkdir monsite/dossier1
vim monsite/dossier1/index.md

./swx monsite

This creates monsite.static, containing every page converted to HTML, plus a new file swx.log listing the new pages (used later).

RSS feed

Use swx_atom to generate feed.atom

 ./swx_atom > DESTDIR/feed.atom

Replace DESTDIR with the path to the output directory.

Site map

Generate a list of pages in your site with

     ./swx_plan DESTDIR > DESTDIR/Divers/map.html

Again, replace DESTDIR with your output directory.

Sitemap generator

     ./swx_sitemap DESTDIR > DESTDIR/sitemap.xml

Then, you can compress it:

     gzip --best -c DESTDIR/sitemap.xml > DESTDIR/sitemap.gz

Don’t forget to edit robots.txt

 User-agent: *
Sitemap: http://yourdomain.net/sitemap.gz


Example Makefile to edit to fit your needs (SOURCEDIR and DESTDIR stand for your source and output directories; with the quickstart above they would be monsite and monsite.static):

    SOURCEDIR=monsite
    DESTDIR=monsite.static

    all:
    	./swx $(SOURCEDIR)
    	./swx_gopher $(SOURCEDIR)
    	./swx_sitemap $(DESTDIR) > $(DESTDIR)/sitemap.xml
    	gzip --best -c $(DESTDIR)/sitemap.xml > $(DESTDIR)/sitemap.gz
    	./swx_plan $(DESTDIR) > $(DESTDIR)/siteplan.html
    	./swx_atom > $(DESTDIR)/feed.atom
    	./swx_blog $(DESTDIR)/index.html

    clean:
    	rm -rf *.static
    	rm -f swx.log

    force:
    	find $(SOURCEDIR) -exec touch {} \;
    	make all

    serve:
    	x-www-browser http://localhost:8000 & cd $(DESTDIR) && python3 -m http.server

Then, just run :

  • make: build/update your site
  • make clean: delete the generated site
  • make force: force a rebuild, even of non-recent files
  • make serve: test the site (python required) at http://localhost:8000

The end

Most of these scripts can be improved and simplified.

Avoid weird file names (use detox!).