
Formal Contexts

This repository contains a collection of formal contexts for use in Formal Concept Analysis (FCA).

The metadata for the contexts is contained in the file contexts.yaml.

More contexts can be found in the repository for ConExp-CLJ, the repository for the concepts Python module, and on Uta Priss’ page.

How to use the contexts

You can either download contexts manually or access them directly from your program code with a URL generated as follows: append the file name of the context (e.g., livingbeings_en.cxt) to the prefix https://github.com/fcatools/contexts/raw/main/contexts/. For example, in Python 3 you could do:

import urllib.request

# Fetch a context file directly from the repository.
url = "https://github.com/fcatools/contexts/raw/main/contexts/livingbeings_en.cxt"
context = urllib.request.urlopen(url).read().decode("utf-8")
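Once downloaded, a context still has to be parsed. The contexts in this repository are stored in the Burmeister CXT format; the sketch below is an illustrative reading of that format (header "B", object and attribute counts, the names, then incidence rows of 'X' and '.'), not an official reference implementation:

```python
# Minimal sketch of a parser for the Burmeister CXT format.
# Assumes the usual layout: "B", blank line, object count, attribute
# count, blank line, object names, attribute names, incidence rows.

def parse_cxt(text):
    lines = text.splitlines()
    assert lines[0].strip() == "B", "not a Burmeister CXT file"
    idx = 1
    while not lines[idx].strip():  # skip blank line(s) after the header
        idx += 1
    n_objects = int(lines[idx]); idx += 1
    n_attributes = int(lines[idx]); idx += 1
    while not lines[idx].strip():  # skip blank line(s) before the names
        idx += 1
    objects = lines[idx:idx + n_objects]; idx += n_objects
    attributes = lines[idx:idx + n_attributes]; idx += n_attributes
    # Incidence relation as a set of (object, attribute) pairs.
    incidence = {
        (g, m)
        for g, row in zip(objects, lines[idx:idx + n_objects])
        for m, cell in zip(attributes, row)
        if cell in ("X", "x")
    }
    return objects, attributes, incidence

# A tiny hand-made example context (not a file from the repository).
example = """B

2
3

duck
dog
can fly
has fur
has feathers
X.X
.X.
"""
objects, attributes, incidence = parse_cxt(example)
print(objects)                            # ['duck', 'dog']
print(("duck", "can fly") in incidence)   # True
```

Mature implementations of this format exist in the FCA tools mentioned above (e.g., the concepts Python module), which may be preferable for real use.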

How to contribute contexts

Additional formal contexts are highly welcome if they fulfil the following criteria:

  1. They should be about real things and not contain invented or random data.
  2. They should preferably be small, that is, have fewer than 100 objects and fewer than 100 attributes.

If you think your context is suitable, then proceed as follows:

  1. Fork this repository and make the following changes in your fork:
    1. Add your ASCII-encoded CXT file to the contexts directory, using a meaningful name: English, all lowercase, ending with the two-letter ISO 639 language code (e.g., bodiesofwater_de.cxt for the German bodies of water context).
    2. Describe your context in contexts.yaml following the example of the other contexts. Try to be concise and precise.
  2. Make a pull request to merge your changes into this repository.
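Before opening the pull request, the file name from step 1 can be checked with a small sketch; the regular expression below is an assumption distilled from the examples given (all-lowercase base name, underscore, two-letter language code, ".cxt" extension), not a rule enforced by the repository:

```python
import re

# Hypothetical checker for the naming convention described above.
CXT_NAME = re.compile(r"^[a-z0-9]+_[a-z]{2}\.cxt$")

def is_valid_context_name(filename):
    """Return True if filename follows the assumed naming convention."""
    return bool(CXT_NAME.match(filename))

print(is_valid_context_name("bodiesofwater_de.cxt"))  # True
print(is_valid_context_name("BodiesOfWater.cxt"))     # False
```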

Working group

The repository is managed by a working group that communicates using a mailing list.

Further information

The idea for the repository has been described in

Hanika, T., Jäschke, R.: A Repository for Formal Contexts. In: Cabrera, I.P., Ferré, S., Obiedkov, S. (eds.) Conceptual Knowledge Structures, pp. 182–197. Springer Nature Switzerland, Cham (2024). doi:10.1007/978-3-031-67868-4_13