establishing authorship

On creating a Komodo macro for establishing the authorship of an indieweb post.

Use case: As part of displaying comments on my post pages, I need to be able to display the author of the comment.

I am looking at the indiewebcamp Authorship Algorithm

The aim of the Komodo macro is to establish the authorship of a post; if authorship is established, the macro creates and stores a mini h-card using the author's existing h-card credentials.

  • The stored mini h-card can be pulled from the store whenever we need to display authorship in our post comments section or in the post's e-content section.

  • The mini h-card is built for visual hyperlinked ‘inline display’ in our html posts page. It displays the person's name, their avatar, and their hyperlinked online identity.

  • The mini h-card is an h-card with a base subset of h-card properties:

    1. logo/photo/avatar as either u-logo or u-photo
    2. name as p-name and/or p-nickname
    3. url (of the author, which will be their profile/homepage)
  • Establishing a p-nickname (nickname/alias/handle) might be useful so we can write @handle in our markdown text and have the handle auto-replaced with a hyperlinked mini h-card version.

  • The h-card URL establishes uniqueness, so the filename could be a hash of the URL, because the author's representative h-card should be at this URL. However, since the basis of the indieweb is owning your own personal domain to create an online identity, we just need to store a hash of the domain name.
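As a sketch, a stored mini h-card built from those base properties might look like this (the name and URLs are illustrative, not from any real author's h-card):

```html
<span class="h-card">
  <a class="u-url p-name" href="http://example.com/">Jane Author</a>
  <img class="u-photo" src="http://example.com/avatar.png" alt="" />
</span>
```

The u-url doubles as the uniqueness key, and the whole span can be dropped inline into a comment or e-content section.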

Tests

... continue to read article
article first published on the and updated
tagged - komodo

from url to unique resource name

On creating one-way hashes of URLs to be used as file names.

To get a unique file name for my citations collection I needed to create a one-way hash of URLs. I do this with the base64 flag set. To make the hashed base64 string safe to use as a file name or an eXist-db resource name, I use the XPath translate function to replace the unsafe characters with safe ones.

Posted as a gist on github 9577957

1 xquery version "3.0";
2 import module namespace util="http://exist-db.org/xquery/util";
3 let $href  := "${url}"
4 let $base64flag := true()
5 let $algo := 'md5'
6 let $hash := replace(util:hash($href, $algo, $base64flag), '(=+$)', '')
7 return
8 translate( $hash, '+/', '-_')

Modified to remove the trailing '=' padding at the end of the string (ref. line 6).
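For comparison, here is a sketch of the same approach in Python (not part of the original gist; the URL is illustrative):

```python
# Sketch of the same one-way hash idea: md5 the URL, base64-encode the
# digest, strip the '=' padding, then swap the filename-unfriendly
# '+' and '/' for '-' and '_'.
import base64
import hashlib

def url_to_resource_name(url):
    """Return a one-way hash of a URL, safe to use as a file name."""
    digest = hashlib.md5(url.encode("utf-8")).digest()
    b64 = base64.b64encode(digest).decode("ascii").rstrip("=")
    return b64.translate(str.maketrans("+/", "-_"))

name = url_to_resource_name("http://markup.co.nz/archive")
```

Since an md5 digest is 16 bytes, the stripped base64 string is always 22 characters.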

article first published on the and updated
tagged - xquery - existdb
commenting on my note 083751
this page is the comment source, while the target url of the comment is
http://markup.co.nz/archive/2014/03/16/083751
The 'reply-context' (what I am replying to)
should appear in a 'header' for the note, or perhaps in an 'aside'.
comment first published on the and updated
tagged - test
Test note: I am going to try to comment on this note
This note will be the target of a mention.
The mention will appear in the footer of the article.
note first published on the and updated
tagged - test - webmentions

generating a reply context

Outlining my system for generating a reply context for replies

The reply context

A ‘comment’ is always about another ‘post’, so to understand what the comment is in reply to, it's helpful to provide some reply context.

This reply context becomes a section in our permalink page, marked up as a microformat h-cite container. It's not part of our content, but is there to give the reader some understanding of the origin of the content. I might ‘cite’ this page in another post, so it might be best to store a collection of citations, with the unique resource identifier being a hash of the URL.

Suggested citation items for a website, via Wikipedia:

Web site: author(s), article and publication title where appropriate, as well as a URL, and a date when the site was accessed.
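Putting those citation items into microformat markup, a reply-context h-cite might be sketched like this (title, author, date, and URL are illustrative):

```html
<section class="h-cite">
  <a class="u-url p-name" href="http://example.com/original-post">Original post title</a>
  by <span class="p-author h-card">Author Name</span>,
  <time class="dt-published" datetime="2014-03-16">16 March 2014</time>
  <blockquote class="p-content">A short quote from the original post…</blockquote>
</section>
```

The u-url is the same URL we hash to get the unique resource name in the citations collection.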

... continue to read article
article first published on the and updated
tagged - indieweb - replies - komodo - xPath - comments

svg workflow

About pre-processing SVG with Scour, then gzipping, then storing in eXist-db, then serving the gzipped SVG file via Nginx.

When we save our SVG in our www/resources/images/svg dir in our editor, we trigger a call to an ant target named store-svg which does our grunt work. Before we store the SVG we want to pre-process the file:

  1. to create a smaller file size

  2. and generate a gzipped version of the file.

In our pre-process stage we run Scour, an SVG scrubber.

<echo>Run Scour</echo>
<exec executable="/bin/sh">
  <arg line='-c "scour -i ${srcfile} -o ${outfile} --strip-xml-prolog --indent=none"' />
  <redirector outputproperty="scour-out" />
</exec>

We should see some improvement in size; however, we can reduce the file size even further by gzipping. We could also gzip with Scour, but we won't, as we have compiled Nginx --with-http_gzip_static_module, and the http_gzip_static_module has no support for .svgz files. So instead we gzip the file, upload it to eXist-db, and let Nginx send the compressed file with the “.gz” file name extension instead of the regular svg file.
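A minimal sketch of the Nginx side, assuming the module is compiled in (the location pattern is illustrative):

```nginx
# With http_gzip_static_module compiled in, gzip_static makes Nginx look
# for a precompressed "icon.svg.gz" next to "icon.svg" and serve it
# directly, with the correct Content-Encoding header.
location ~* \.svg$ {
    gzip_static on;
    types { image/svg+xml svg; }
}
```

The browser still requests the plain .svg URL; the .gz file is only a server-side sibling.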

... continue to read article
article first published on the and updated
tagged - svg - scour - existdb - nginx

storing scalable vector graphics


I have included some svg icons in my pages on this site, however they are loaded with the page. I want to add them to the images/svg collection and then dynamically load them and insert them into the dom.

First make sure svg ’on save’ is uploaded to localhost.

Then add the store-md target to the build file.

Note we remove www from the path with a filterchain, add the collection path, then store the file. Use curl to check that we can fetch it and that it returns the right content type.

OK, let's go to our templates/includes folder, open up head.html, and add

... continue to read article
article first published on the and updated
tagged - svg

optimising javascript


download js

article first published on the and updated
tagged - javascript

hacking repo nginx exist ubuntu


re. nginx-eXist-ubuntu

Thanks for the mention, Joe. The scripts, set up for ‘remote-host production’, enable Nginx to act as a reverse proxy and cache server for eXist. For local-host development Nginx just acts as a reverse proxy, because you want to see your development changes before you deploy.

It's my dog-food, it works for me. I make no claim to be an expert on caching.

If anyone thinks it useful, I'll create an ant file with a properties file to alter variable properties, like the period the cache is valid, which is hard-coded (set to one day below).

ref: proxy-cache.conf
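As a sketch of what that hard-coded one-day validity looks like in a proxy-cache.conf (zone name, path, and upstream port are illustrative, not taken from the actual scripts):

```nginx
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=exist:10m;

server {
    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_cache exist;
        # the period the cache is valid, hard-coded to one day
        proxy_cache_valid 200 1d;
    }
}
```

Making the 1d value a property would be the first candidate for the properties file.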

... continue to read article
article first published on the and updated

nginx as reverse proxy for exist

Nginx The Web Server, Exist The XML Application Server

nginx-eXist-ubuntu

Nginx as a reverse proxy and cache server for the eXist-db Application Server

eXist-db provides an XML document-oriented, schema-less data store and an XQuery engine to access and serve this data.

Ubuntu, with its server and desktop versions, is pretty much the best OS environment for developing web apps.


... continue to read article
article first published on the and updated
tagged - nginx - ubuntu - existdb
A new note.
note first published on the and updated
tagged - test
Test: Draft test 2
note first published on the and updated
tagged - test

publishing drafts


Now that I have a local-to-remote trigger, I don't want every save to appear live on the remote production server. So I have included a draft publishing control in the front matter.

The draft value is either ‘yes’ or ‘no’ (TODO: maybe add true/false). If there is no draft key in the front matter, then the default is to publish to remote.

This markdown front-matter value is converted to an AtomPub control when saved to our data store. The control element is in its own namespace: http://www.w3.org/2007/app

I don't know why it has its own namespace, but it does, so I'll add it.

In our stored atom entry
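As a sketch of how the control could sit in a stored entry (element content trimmed; the app:control/app:draft shape is from the AtomPub namespace above):

```xml
<entry xmlns="http://www.w3.org/2005/Atom"
       xmlns:app="http://www.w3.org/2007/app">
  <title>publishing drafts</title>
  <app:control>
    <app:draft>yes</app:draft>
  </app:control>
</entry>
```

The publish step can then skip any entry whose app:draft value is ‘yes’.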

... continue to read article
article first published on the and updated
tagged - atom
This post is a comment. Thanks, Barry, for the clear instructions.
comment first published on the and updated
tagged - test

sending webmentions

Some background notes on sending webmentions.

A work in progress

Two parts

  1. This article: a ‘publishing client’ which can send webmentions
  2. Next article: my server's capability of receiving webmentions and doing something useful with the received mention.

A Webmentions Publishing Client.

I write markdown in a text editor (Komodo) publishing environment. The markdown text ‘on save’ is preprocessed and published to a localhost server, which in turn publishes to my remote server. My publishing environment pretty much behaves like a ‘static site generator’. Metadata is provided via a markdown front-matter block.
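The send itself is small: a webmention is just a form-encoded POST of two parameters. A sketch (endpoint discovery via rel="webmention" is omitted; the URLs are examples only):

```python
# A webmention is an x-www-form-urlencoded POST of two parameters:
# 'source' (the page containing the link) and 'target' (the page linked to).
from urllib.parse import urlencode

def webmention_payload(source, target):
    """Build the form-encoded body for a webmention POST."""
    return urlencode({"source": source, "target": target})

body = webmention_payload(
    "http://markup.co.nz/archive/2014/03/16/083751",
    "http://example.com/a-post",
)
# This body would be POSTed to the target's discovered webmention endpoint.
```

The publishing client's job is then: find the links in the saved post, discover each target's endpoint, and POST one payload per link.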

... continue to read article
article first published on the and updated
tagged - webmentions
Test: this note mentions another note http://markup.co.nz/archive/2014/02/20/091049
note first published on the and updated
Started coding for webmentions
note first published on the and updated
tagged - webmentions
Well I screwed that up. Wobbly site now back on an even keel ... until the next time
note first published on the and updated
tagged - selfdogfood
Just created a gist. #xquery3 function that uses 'group by' to list entries by year month day https://gist.github.com/grantmacken/9084989 used in http://markup.co.nz/archive
note first published on the and updated
tagged - xquery3 - existdb
also posted on twitter