This is a reminder to self, for those times when you just gotta have fishdots on a recent version of fish shell, and someone upstream has completely screwed up the CA certificates so you can't use apt or Linuxbrew (neither can be configured to relax cacert security).
First, create a place for them to live:
mkdir -p ~/bin; cd /tmp
Now get fish shell
wget --no-check-certificate https://download.opensuse.org/repositories/shells:/fish:/nightly:/master/AppImage/fish-latest-x86_64.AppImage
chmod a+x fish-latest-x86_64.AppImage
./fish-latest-x86_64.AppImage --appimage-extract
mv ./squashfs-root/ ~/bin/fish_root
abbr -a fish ~/bin/fish_root/AppRun
Then do the same for Neovim: download the nvim.appimage from the Neovim releases page.
chmod u+x nvim.appimage
./nvim.appimage --appimage-extract
mv ./squashfs-root/ ~/bin/nvim_root
abbr -a nvim ~/bin/nvim_root/AppRun
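Depending on your fish version, abbreviations added interactively may not survive a new session. A minimal sketch of making them permanent, assuming the paths used above, is to declare them in your fish config file:

```fish
# ~/.config/fish/config.fish — runs for every new fish session
abbr -a fish ~/bin/fish_root/AppRun
abbr -a nvim ~/bin/nvim_root/AppRun
```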
Lovely. Now you have the one true shell and the one true editor. All’s well with the world.
Knowledge graphs provide a neat and easy way to segment your data, called 'Named Graphs'. This post shows how you access them, and the different uses they can be put to.
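As a taste of what named graphs enable, a generic SPARQL query can enumerate the graphs in a store and count their contents (nothing here is specific to any one product; the graph IRIs are whatever your store holds):

```sparql
# List each named graph and the number of triples it contains
SELECT ?g (COUNT(*) AS ?triples)
WHERE {
  GRAPH ?g { ?s ?p ?o }
}
GROUP BY ?g
```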
I put together a list of OWL2 snippets for Visual Studio Code, for use with Turtle.
The shortcuts cover most of the OWL2 Reference Card. Enjoy.
With this installment we finally get to the part of knowledge graphs that I personally find really exciting: semantics. I will introduce some of the simple entailment rules that are part of the RDFS standard.
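For a flavour of what is coming, here is a minimal Turtle sketch (the ex: namespace is invented for illustration) of the kind of inference RDFS entailment licenses:

```turtle
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix ex:   <http://example.org/> .

ex:Dog  rdfs:subClassOf ex:Animal .  # schema
ex:fido a ex:Dog .                   # data

# An RDFS reasoner (rule rdfs9) can now infer:
#   ex:fido a ex:Animal .
```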
This installment moves beyond the simple graph model of RDF to introduce the modelling support of RDF Schema. I will go on to show you how using the W3C standard RDFS imbues your data with another layer of meaning, and makes it easier for you to enrich your raw data over time.
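As an illustrative sketch (the ex: terms are invented for this example), RDFS lets you annotate a property with a human-readable label and declare what it connects:

```turtle
@prefix rdf:  <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix ex:   <http://example.org/> .

ex:hasOwner a rdf:Property ;
    rdfs:label  "has owner" ;
    rdfs:domain ex:Dog ;      # subjects of ex:hasOwner are Dogs
    rdfs:range  ex:Person .   # objects of ex:hasOwner are Persons
```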
This installment leaves the CLI behind to show how we consume a knowledge graph within our programmatic environments. The framework I use to work with RDF is dotNetRdf.
Last time I showed you how to use CLI tools to build out your RDF data in more depth with Turtle files, and how to query it with the SPARQL query language via the Apache Jena CLI toolchain. This time I'll show how to insert data into, and retrieve it from, a remote triple store, continuing with the CLI tools for now.
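As a preview, inserting into a remote store typically means sending a SPARQL 1.1 Update request to its update endpoint. A minimal sketch (the graph and resource IRIs are illustrative):

```sparql
# Add one triple to a named graph on the remote store
INSERT DATA {
  GRAPH <http://example.org/graphs/demo> {
    <http://example.org/s> <http://example.org/p> "an object literal" .
  }
}
```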