For a long time, openness movements and initiatives with labels like “Open Access”, “Open Educational Resources” (OER) or “Linked Science” have been working on establishing a culture where scientific or educational resources are by default published with an open license on the web to be read, used, remixed and shared by anybody. With the growing supply of resources on the web, it becomes increasingly challenging to learn about and find the resources relevant for your teaching, studies, or research.
+
In this post, we describe the SkoHub project being carried out in 2019 by the hbz in cooperation with graphthinking GmbH. The project seeks to implement a prototype for a novel approach in syndicating content on the web by combining current web standards for sending notifications and subscribing to feeds with knowledge organization systems (KOS, sometimes also called “controlled vocabularies”).*
+
Current practices and problems
+
What are the present approaches to the problem of finding open content on the web, and what are their limitations?
+
Searching metadata harvested from silos
+
Current approaches for publishing and finding open content on the web are often focused on repositories as the place to publish content. Those repositories then provide (ideally standardized) interfaces for crawlers to collect and index the metadata in order to offer search solutions on top. An established approach for Open Access (OA) articles goes like this:
+
+
Repositories with interfaces for metadata harvesting (OAI-PMH) are set up for scholars to upload their OA publications
+
Metadata is crawled from those repositories, normalized and loaded into search indexes
+
Search interfaces are offered to end users
+
+
+
With this approach, subject-specific filtering is either already done when crawling the data to create a subject-specific index, or when searching the index.
+
Maintenance burden
+
When offering a search interface with this approach, you have to create and maintain a list of sources to harvest:
+
+
watch out for new relevant sources to be added to your list,
+
adjust your crawler to changes regarding the services’ harvesting interface,
+
homogenize data from different sources to get a consistent search index.
+
+
Furthermore, end users have to know where to find your service to search for relevant content.
+
Off the web
+
Besides being error-prone and requiring resources for keeping up with changes in the repositories, this approach also does not take into account how web standards work. As Van de Sompel and Nelson 2015 (both co-editors of the OAI-PMH specification) phrase it:
+
+
“Conceptually, we have come to see [OAI-PMH] as repository-centric instead of resource-centric or web-centric. It has its starting point in the repository, which is considered to be the center of the universe. Interoperability is framed in terms of the repository, rather than in terms of the web and its primitives. This kind of repository, although it resides on the web, hinders seamless access to its content because it does not fully embrace the ways of the web.”
+
+
In short, the repository metaphor guiding this practice obscures what constitutes the web: resources that are identified by HTTP URIs (Uniform Resource Identifiers).
+
Subject-specific subscription to web resources
+
So what would a web- or resource-centric approach to resource discovery by subject look like?
+
Of the web
+
To truly be part of the web, URIs are the most important ingredient: every resource (e.g. an OER) needs a URL that locates and identifies it. In order to make use of knowledge organization systems on the web, representing a controlled vocabulary with SKOS is the way to go: each subject in the vocabulary is identified by a URI. With these prerequisites in place, anybody can link their resources to subjects from a controlled vocabulary, e.g. by embedding LRMI (“Learning Resource Metadata Initiative”) metadata as JSON-LD into the resource or its description page.
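As an illustration, such an embedded description could look roughly like this (a minimal sketch; the resource URL and the subject URI are placeholders – in practice the subject URI would come from a published SKOS vocabulary):

<script type="application/ld+json">
{
  "@context": "http://schema.org/",
  "@type": "CreativeWork",
  "@id": "https://example.org/resources/recycling-101",
  "name": "Recycling 101 – an introductory OER",
  "license": "https://creativecommons.org/licenses/by/4.0/",
  "about": [
    { "@id": "https://example.org/subjects/environment" }
  ]
}
</script>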
+
+
Web-based subscriptions and notifications
+
So, HTTP URIs for resources and subjects are important to transparently publish, identify and link educational resources and controlled vocabularies on the web. With URIs as the basic requirement in place, we also get the possibility to utilize further web standards for the discovery of OER. For SkoHub, we make use of Social Web Protocols to build an infrastructure where services can send and subscribe to notifications per subject. The general setup looks as follows:
+
+
1. Every element of a controlled vocabulary gets an inbox, identified by a URL.

2. Systems can send notifications to the inbox, for example “This is a new resource about this subject”.

3. Systems can subscribe to a subject’s inbox and will receive a notification as soon as a new one arrives (push approach).
+
+
This infrastructure allows applications
+
+
to send a notification to a subject’s inbox containing information about and a link to new content about this subject
+
to subscribe to the inbox of a subject from a knowledge organization system in order to receive push updates about new content in real time.
+
+
Here is an example: a teacher is interested in new resources about environmental subjects. She subscribes to the subject via a controlled vocabulary like ISCED-2013 Fields of Education and Training. She then receives updates whenever a colleague publishes a resource that is linked to the subject.
+
+
To be really useful, applications for subscribing to content should enable additional filtering, e.g. subscribing to combinations of subjects (“Environment” & “Building and civil engineering”) or adding further filters on educational level, license type etc.
+
Advantages
+
This subject-oriented notification/subscription approach to content syndication on the web has many advantages.
+
Push instead of pull
+
With the push approach, you subscribe once and content comes in from different and new sources without the subscriber having to maintain a list of sources. Of course, quality control might become an issue. Thus, instead of whitelisting by administering a subscription list, one would practice blacklisting by filtering out sources that distribute spam or provide low-quality content.
+
Supporting web-wide publications
+
Being of the web, SkoHub supports publications residing anywhere on the web. While the repository-centric approach favours content in a repository that provides interfaces for harvesting, with SkoHub any web resource can make use of the notification mechanism. Thus, content producers can choose which tool or platform best fits their publishing needs, be it YouTube, a repository, hackmd.io or anything else. The only requirement is for publications to have a stable URL and, voilà, they can syndicate their content via KOS.
+
Knowledge organization systems are used to their full potential
+
This additional layer to the use of Knowledge Organization Systems makes them much more powerful (“KOS on steroids”) and attractive for potential users.
+
Encouraging creation and use of shared Knowledge Organization Systems across applications
+
In the German OER context it is a recurring theme that people wish everybody would use the same controlled vocabularies so that data exchange and aggregation would require less mapping. With a SkoHub infrastructure in place, there are strong additional incentives to move in this direction.
+
Incentive for content producers to add machine-readable descriptions
+
When subject indexing becomes tantamount to notifying interested parties about one’s resources, content producers have a huge incentive to describe their resources with structured data that includes subject indexing.
+
SkoHub project scope
+
The SkoHub project has four deliverables. While working on the backend infrastructure for receiving and pushing notifications (skohub-pubsub), we also want to provide people with means to publish a controlled vocabulary along with inboxes (skohub-ssg), to link to subjects and send notifications (skohub-editor) and to subscribe to notifications in the browser (skohub-deck).
+
skohub-pubsub: Inboxes and subscriptions
+
Code: https://github.com/hbz/skohub-pubsub

This part provides the SkoHub core infrastructure, setting up basic inboxes for subjects plus the possibility to subscribe to a subject and receive a push message for each new notification.
+
skohub-ssg: Static site generator for Simple Knowledge Organization Systems
+
Code: https://github.com/hbz/skohub-ssg

This part of the project covers the need to easily publish a controlled vocabulary as a SKOS file, with a basic lookup API and a nice HTML view including links to an inbox for each subject.
+
skohub-editor: Describing & linking learning resources, sending notifications
+
Code: https://github.com/hbz/skohub-editor

The editor will run in the browser and enable structured description of educational resources published anywhere on the web. It includes validation of the entered content for each field and lookup of controlled values via the API provided by skohub-ssg.
+
skohub-deck: Browser-based subscription to subjects
+
Code: https://github.com/hbz/skohub-deck

The SkoHub deck is a proof of concept to show that the technologies developed actually work. It enables people to subscribe to notifications for specific subjects in the browser. The incoming notifications will be shown in a Tweetdeck-like interface.
+
Outlook
+
The project will be completed by the end of 2019. We intend to provide updates about our progress along the way. Next up, we will explain the technical architecture in more detail, expanding on our use of social web protocols. Furthermore, we will provide updates on the development status of the project.
+
+
* Note that while SkoHub has clear similarities with the “Information-Sharing Pipeline” envisioned in Ilik and Koster 2019 regarding the use of social web protocols on authority data, there is also a fundamental difference: While Ilik and Koster are talking about sharing updates of authority entries themselves (e.g. receiving updates for a person profile to be considered for inclusion in one’s own authority file), SkoHub is about sharing new links to an entry in an authority file or other controlled vocabulary.
+
References
+
Van de Sompel, Herbert / Nelson, Michael L. (2015): Reminiscing About 15 Years of Interoperability Efforts. D-Lib Magazine 21, no. 11/12. DOI: 10.1045/november2015-vandesompel
+
+
Presenting the SkoHub Vocabs Prototype
We are happy to announce that the SkoHub prototype outlined in our post “SkoHub: Enabling KOS-based content subscription” is now finished. In a series of three posts we will report on the outcome by walking through the different components and presenting their features.
+
SkoHub is all about utilizing the power of Knowledge Organization Systems (KOS) to create a publication/subscription infrastructure for Open Educational Resources (OER). Consequently, publishing these KOS on the web according to the standards was the first area of focus for us. We are well aware that there are already plenty of Open Source tools to publish and edit vocabularies based on SKOS, but these are usually monolithic database applications. Our own workflows often involve managing smaller vocabularies as flat files on GitHub, and others seem to also do so.
+
We will thus start this series with SkoHub Vocabs (formerly called “skohub-ssg”), a static site generator that provides integration for a GitHub-based workflow to publish an HTML version of SKOS vocabularies. Check out the JAMStack Best Practices for some thoughts about the advantages of this approach. SkoHub Vocabs – like SkoHub Editor that will be presented in a separate post – is a stand-alone module that can already be helpful on its own, when used without any of the other SkoHub modules.
+
How to publish a SKOS scheme from GitHub with SkoHub Vocabs
+
Let’s take a look at the editing and publishing workflow step by step. We will use SkoHub Vocabs to publish a subject classification for Open Educational Resources: the “Educational Subject Classification” (ESC), which was created for the OER World Map based on ISCED Fields of Education and Training 2013.
+
Step 1: Publish vocab as turtle file(s) on GitHub
+
Currently, a SKOS vocab has to be published in a GitHub repository as one or more Turtle file(s) in order to be processed by SkoHub Vocabs. ESC is already available on GitHub in one Turtle file, so there is nothing to do in this regard. Note that you can also use the static site generator locally, i.e. without GitHub integration; see below for more about this.
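For orientation, a minimal Turtle file of this kind could look roughly as follows (shortened to a scheme with a single concept; URIs and labels are only illustrative, the real ESC file contains the full hierarchy):

@prefix skos: <http://www.w3.org/2004/02/skos/core#> .

<https://example.org/esc/scheme> a skos:ConceptScheme ;
  skos:prefLabel "Educational Subject Classification"@en ;
  skos:hasTopConcept <https://example.org/esc/n03> .

<https://example.org/esc/n03> a skos:Concept ;
  skos:prefLabel "Social sciences, journalism and information"@en ;
  skos:topConceptOf <https://example.org/esc/scheme> .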
+
Step 2: Configure webhook
+
In order to publish a vocabulary from GitHub with SkoHub Vocabs, you have to set up a webhook in GitHub. It goes like this:
+
+
1. In the GitHub repo where the vocab resides, go to “Settings” → “Webhooks” and click “Add webhook”
+
+
2. Enter https://test.skohub.io/build as payload URL, choose application/json as content type and enter the secret. (Please contact us for the secret if you want to try it out.)
+
+
Step 3: Execute build & error handling
+
For the vocabulary to be built and published on SkoHub, there has to be a new commit in the master branch. So, we have to adjust something in the vocab and push it into the master branch. Looking again at the webhook page in the repo settings, you can see a notice that the build was triggered:
+
+
However, looking at the build log, an error is shown and the site did not build:
+
+
Oops, we forgot to check the vocab for syntax errors before triggering the build and there actually is a syntax error in the Turtle file. Fixing the syntax in a new commit will automatically trigger a new build.

As we want the canonical version of ESC to be the one published with SkoHub Vocabs, we need to redirect the namespace URI we defined in the Turtle file to SkoHub. As we used w3id.org for this, we have to make a pull request in the respective repo.
+
+
If everything looks good, w3id.org PRs are merged very quickly, in this case it happened an hour later.
+
Result: HTML & JSON-LD representation published with SkoHub & basic GitHub editing workflow
+
As a result, we have published a controlled vocabulary in SKOS under a permanent URI and with a human-readable HTML representation from GitHub with a minimum amount of work. Additionally, the initial Turtle representation is transformed to more developer-friendly JSON-LD. The HTML has a hierarchy view that can be expanded and collapsed at will:
+
+
There also is a search field to easily filter the vocabulary:
+
+
This filter is based on a FlexSearch index that is also built along with the rest of the content. This allows us to implement lookup functionalities without the need for a server-side API. More about this below and in the upcoming post on the SkoHub Editor.
+
Implementation
+
To follow along with the more technical aspects, you might want to have SkoHub Vocabs checked out locally:
+
$ git clone https://github.com/hbz/skohub-vocabs
$ cd skohub-vocabs
$ npm i
$ cp .env.example .env
+
The static site generator itself is implemented with Gatsby. One reason for this choice was our good previous experience with React. Another nice feature of Gatsby is that all content is sourced into an in-memory database that is available using GraphQL. While there is certainly a learning curve, this makes the experience of creating a static site not that much different from traditional database-based approaches. You can locally build a vocab as follows:
+
$ cp test/data/systematik.ttl data/
$ npm run build
+
This will result in a build in the public/ directory. Currently, the build is optimized to be served by Apache with Multiviews in order to provide content negotiation. Please note that currently only vocabularies that implement the slash namespace pattern are supported. We will add support for hash URIs in the future.
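Once a vocabulary is deployed this way (and, as described above, reachable via its w3id.org namespace), content negotiation can be tried out with curl; whether both variants resolve exactly like this depends on the concrete deployment:

$ curl -sL -H "Accept: text/html" https://w3id.org/class/esc/n0322         # HTML page for humans
$ curl -sL -H "Accept: application/json" https://w3id.org/class/esc/n0322  # JSON-LD for applications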
+
In order to trigger the static site generator from GitHub, a small webhook server based on Koa was implemented. (Why not Express? – It wouldn’t have made a difference.) The webhook server listens for and validates POST requests coming from GitHub, retrieves the data from the corresponding repository and then spins up Gatsby to create the static content.
+
A final word on the FlexSearch index mentioned above. An important use case for vocabularies is to access them from external applications. Using the FlexSearch library and the index pre-built by SkoHub Vocabs, a lookup of vocabulary terms is easy to implement:
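The following rough sketch illustrates the idea; the URL of the serialized index is a placeholder and the exact API depends on the FlexSearch version used:

const FlexSearch = require("flexsearch")

async function lookupConcepts(term) {
  // fetch the search index that SkoHub Vocabs built alongside the HTML and JSON-LD
  // (placeholder URL; works in the browser or in Node 18+ with global fetch)
  const response = await fetch("https://example.org/vocab/search-index.json")
  const serialized = await response.text()

  const index = FlexSearch.create()
  index.import(serialized)

  // currently returns the URIs of the matching concepts, not their labels
  return index.search(term)
}

lookupConcepts("library").then(uris => console.log(uris))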
Note that currently the index will only return URIs associated with the search term, not the corresponding labels. This will change in a future update.
+
+
SkoHub talk at SWIB19: KOS-based content syndication with ActivityPub
January 29, 2020 | Adrian Pohl
On November 27th 2019, Adrian Pohl and Felix Ostrowski (graphthinking) presented SkoHub at the “Semantic Web in Libraries” conference in Hamburg (SWIB19).
+
+
+
Presenting the SkoHub Editor
ⓘ Update, 2022-03-01: Due to lacking resources for maintenance, we decided to shut down the SkoHub Editor demo for an indefinite time. However, the code is still there for anybody to set up their own instance.
+
+
In a previous blog post we presented a first SkoHub module: SkoHub Vocabs. Before talking about another module, first a short summary of the features SkoHub Vocabs offers. Basically, it provides an editorial workflow to publish a SKOS vocabulary on the web which can then be consumed by humans and applications. It builds on git-based online software development platforms (currently GitHub and GitLab are supported) where you maintain a SKOS vocabulary as a Turtle file. This allows you to use all the associated features such as branches and pull requests for a full-fledged review process. With every new commit in a branch, triggered by a webhook, SkoHub Vocabs will build a static site for the vocab – with HTML for human consumption and JSON-LD for consumption by applications.
+
In this post, we present SkoHub Editor (demo, code) that is accompanied by a browser extension. In a nutshell, SkoHub Editor enables the automatic generation of a web form based on a JSON schema, along with the possibility to look up terms in a controlled vocabulary that is published with SkoHub Vocabs. Additionally, metadata generated by the editor can be published using SkoHub PubSub, which we will describe in an upcoming post. Let’s take a look at the specifics by configuring an editor that lets you create JSON-LD describing an open educational resource (OER) on the web.
+
Describing a resource with the browser extension
+
Let’s start with actually using SkoHub Editor. You will have the most comfortable experience when using the SkoHub browser extension that wraps the SkoHub Editor and pre-populates some fields in the web form. The browser extension is available both for Firefox and Chrome. Just add the extension to your browser and a little icon will be shown on the right-hand side of your navigation bar:
+
+
While having any web page open, you can now open the SkoHub editor in your browser to describe that web resource. Let’s use as an example the YouTube video “COVID-19 – 6 Dangerous Coronavirus Myths, Busted by World Health Organization” published recently by the World Economic Forum under a CC-BY license. Open the video in your browser, click on the extension and you will see that several fields are automatically filled out.
+
+
We can now add additional metadata by selecting a type (VideoObject in this case), add a creator, creation date, language etc. As we mentioned, you can look up a subject from a controlled vocabulary for some fields in the web form. You will experience this when inputting content into the fields “Subject”, “License”, “Learning Resource Type”, and “Intended Audience”. For those fields you will get a drop down with suggestions from a controlled vocabulary, e.g. for “Subject” from a German classification of subjects in Higher education that is published with SkoHub Vocabs.
+
+
Currently, only the fields “URL”, “Type” and “Title” are obligatory, all other fields are optional. When you think you have described the resource sufficiently, you can click on “Show Preview” in the extension, copy & paste the JSON-LD to the clipboard and include it in the HTML of any web page within a <script type="application/ld+json"> tag.

As said above, the SkoHub Extension wraps the SkoHub Editor running at https://skohub.io/editor/. SkoHub Editor is configured with a JSON schema document that is used both to generate appropriate form inputs and to validate the entered content. Thus, the JSON schema is the central, most crucial part when working with SkoHub Editor. Currently, we use as the default schema a draft schema for OER that we created using relevant properties and types from schema.org. We can now load the web form you already know from the browser extension by providing the link to the schema. Of course, you can also just write your own schema to build a web form for your use case.
+
Let’s take a short look at the underlying schema, which we tried to keep as straightforward as possible. Generally, with JSON schema you can specify a number of optional or mandatory properties and what type of input each expects. The "title" of each property will be used as the label for the field in the web form.
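For illustration, a property whose values are restricted to a fixed list might be declared roughly like this (a simplified sketch, not a verbatim excerpt from the default schema):

"learningResourceType": {
  "title": "Learning Resource Type",
  "type": "string",
  "enum": [
    "https://example.org/resourceTypes/video",
    "https://example.org/resourceTypes/course"
  ]
}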
Such lists of allowed values can be considered controlled vocabularies, and ideally they should be shared across many data sources. This is where SkoHub Vocabs comes into play. Instead of embedding the list of allowed values into our schema, we can reference a SKOS vocabulary on the web:
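In outline, such a reference can look like the following (a simplified sketch; apart from _widget and the SkohubLookup widget discussed below, the names are illustrative):

"about": {
  "title": "Subject",
  "type": "array",
  "_widget": "SkohubLookup",
  "items": {
    "type": "object",
    "properties": {
      "id": { "type": "string", "format": "uri" },
      "prefLabel": { "type": "object" }
    }
  }
}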
Notice the custom key _widget in the JSON schema. This will configure the editor to use the specified UI element for the given field. In our example, the SkohubLookup widget is used, which works with all controlled vocabularies that are published with SkoHub Vocabs. All custom JSON schema extensions start with an underscore _ and are used to control the look and feel of the editor; see below for an example for how to hide a field on the form.
+
Finally, to make our data JSON-LD, we also set a mandatory @context property and a default object value for the @context. This makes the editor add it to the document without any user interaction needed.
Of course you can also poke around the editor while running it locally:
+
$ git clone https://github.com/hbz/skohub-editor.git
$ cd skohub-editor
$ npm install
+
As is the case with SkoHub Vocabs, the editor is implemented in React. The form components are located in src/components/JSONSchemaForm. In a nutshell, a Form provides data to the various input components.

Obviously it would be tedious to manually code all the inputs for a given schema. This is where the Builder comes into play. It reads a schema and creates all necessary input components:
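Conceptually, this works roughly as in the following sketch (an illustrative example only; component and prop names differ from the actual skohub-editor code base):

import React from "react"

// made-up input components, just to keep the sketch self-contained
const TextInput = ({ label }) => (
  <label>
    {label} <input type="text" />
  </label>
)

const ListInput = ({ label }) => (
  <fieldset>
    <legend>{label}</legend>
    <TextInput label={label} />
  </fieldset>
)

// default mapping from JSON schema types to input widgets
const widgets = { string: TextInput, array: ListInput }

// The builder walks the schema and renders one input per declared property,
// using the property's "title" as the form label.
const Builder = ({ schema }) => (
  <form>
    {Object.entries(schema.properties || {}).map(([name, property]) => {
      const Widget = widgets[property.type] || TextInput
      return <Widget key={name} label={property.title || name} />
    })}
  </form>
)

export default Builder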
The browser extension is essentially a simple wrapper for the editor running at https://skohub.io/editor/, which the extension injects as an iframe into the current page. Additionally, before the iframe is injected, some metadata is scraped from that page. This data is used to pre-populate the editor. This process obviously depends both on the data found in the web page and on the schema the editor is configured to use. YouTube for example uses meta name="description" for data about YouTube itself rather than the actual video, which is described in meta property="og:description". Even if the correct metadata is extracted, there is no guarantee that the schema used to configure the editor even has a description field. In the future, it would be nice to find a possibility to somehow map page metadata to properties in the schema itself.
+
Outlook
+
SkoHub Editor already works very well and can be extremely useful. However, some things are still work in progress and will need some future effort to be improved:
+
+
Using schema.org markup for pre-population: This might sound obvious but we have not implemented it yet, see #17.
Furthermore, some work will have to be put into the current default schema and the controlled vocabularies it uses:
+
+
Develop JSON Schema: The JSON Schema definitely is not finished yet. For example, it makes sense to include http://schema.org/keywords in the future for adding arbitrary tags to describe a resource. We plan to develop the schema within the common OER metadata group of DINI AG KIM & Jointly with a focus on describing OER in the German-speaking world.
+
Improve Vocabularies: For “Learning Resource Type” and “Intended Audience” we are using controlled vocabularies that are not nearly finished but in development at the LRMI Task Group of the Dublin Core Metadata Initiative (DCMI). Trying out the browser extension, you will for instance see that the educational resource types are missing some options. However, we assume that the combination of SkoHub Editor & SkoHub Vocabs makes a pretty nice environment for the development of these vocabularies in an open and transparent process on GitHub or GitLab.
+
+
Get involved
+
Please try it out and let us know what doesn’t work or which feature you are missing and also what you like about SkoHub. We are happy about every bug report, suggestion and feature request for the production version. Get in contact with us via a hypothes.is annotation, GitHub, email, Mastodon or IRC.
+
+
Presenting SkoHub PubSub
ⓘ Update, 2022-03-01: Due to lacking resources for maintenance, we decided to shut down the SkoHub PubSub demo server at skohub.io for an indefinite time. However, the code is still there for anybody to set up their own instance.
+
+
In the previous blog posts we have presented SkoHub Vocabs and SkoHub Editor. In the final post of this SkoHub introduction series we will take a deeper look at SkoHub PubSub, the part of SkoHub that brings the novel approach of KOS-based content subscription into the game.
+
Let’s refresh what SkoHub is about by quoting the gist from the project homepage:
+
+
SkoHub supports a novel approach for finding content on the web. The general idea is to extend the scope of Knowledge Organization Systems (KOS) to also act as communication hubs for publishers and information seekers. In effect, SkoHub allows to follow specific subjects in order to be notified when new content about that subject is published.
+
+
Before diving into the technical implementation and protocols used, we provide an example of how this subscription, publication and notification process can be carried out in practice. Although SkoHub PubSub constitutes the core of the SkoHub infrastructure, being the module that brings all SkoHub components together, it is not visible to end users by itself but only through applications which send out notifications or subscribe to a specific topic. (This is the great thing about open standards, as it also invites everybody to develop new clients for specific use cases!)
+
So, let’s take a look at an example workflow involving SkoHub Editor and the federated microblogging service Mastodon to demonstrate the functionalities.
On the left-hand side, you can see the location of the topic in the classification hierarchy. On the right-hand side, there is some basic information on the subject: It has a URI (https://w3id.org/class/esc/n0322), a notation (0322), a preferred label (Library, information and archival studies) and an inbox. This is what the underlying JSON data (accessible e.g. by adding the format suffix .json to the URI) looks like:
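Here is a shortened sketch of that JSON (the inbox and followers URLs are placeholders; the real values point to the SkoHub PubSub server):

{
  "id": "https://w3id.org/class/esc/n0322",
  "type": "Concept",
  "prefLabel": { "en": "Library, information and archival studies" },
  "notation": ["0322"],
  "inbox": "https://example.org/inbox?actor=esc-n0322",
  "followers": "https://example.org/followers?subject=esc-n0322"
}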
Besides the usual SKOS properties, the followers key gives a hint that I can somehow follow this subject. Clicking on the associated URL, I will see a JSON file containing the list of followers of this subject. I am also interested in this topic and want to follow it to receive notifications about new online resources that are published and tagged with this subject. How do I achieve this?
+
As already noted, what I need is an application that speaks ActivityPub. In this case we will use one of the most popular services in the Fediverse: Mastodon. So, I open up my Mastodon client and put the topic URI into the search box:
+
+
I click on the follow button and am now following this subject with my Mastodon account and will receive any updates posted by it.
+
Describing and announcing a resource with SkoHub Editor
+
Let’s now switch into the role of a scholar, teacher, tutor or generally interested person who has created an instructive online resource and wants to publish it to all people interested in the topic of “Library, information and archival studies”. In this case, I published a blog post about a talk at SWIB19 – Semantic Web in Libraries Conference and want to share it with others. I somehow need to send a message to the topic’s inbox; in this case I am using the SkoHub Editor (but it could be any other ActivityPub client or even the command line interface from which I publish). For the best user experience I download the SkoHub browser extension (Firefox, Chrome).
+
As the default JSON schema uses another classification, we first have to configure the editor based on a schema that actually makes use of the Educational Subjects Classification. For this, we created a version of the default schema that does so. Now I put it into the extension’s settings:
+
+
Then, I fire up the extension when visiting the web page I would like to share and add data to the input form:
+
+
I select the topic “Library, information and archival studies” from the suggestions in the “subject” field, add information on licensing etc. and click “Publish”. A pop up lets me know that the resource is published to “Library, information and archival studies”. In the background, the description of the resource is sent to the respective topic (it could be more than one) which distributes the information to all its subscribers. Thus, in the end I as a subscriber of the topic will receive a notification of the resource in my Mastodon timeline:
+
+
Protocols and implementation
+
The SkoHub-PubSub server is built in Node.js and implements a subset of ActivityPub, Linked Data Notifications and Webfinger to achieve the behavior described above. On the ActivityPub side, Server to Server Follow and corresponding Undo interactions can be received to handle the subscription mechanism. Non-activity messages are considered Linked Data Notifications and can simply be sent to the inbox of a subject using a POST request with any JSON body. These notifications are considered metadata payload, wrapped in a Create action and distributed to every follower of the corresponding subject again using ActivityPub.
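A minimal sketch of sending such a notification with curl (the inbox URL is a placeholder and the payload is reduced to a few fields):

$ curl -X POST "https://example.org/inbox?actor=esc-n0322" \
    -H "Content-Type: application/json" \
    -d '{
          "type": "Document",
          "id": "https://example.org/my-blog-post",
          "name": "SkoHub talk at SWIB19",
          "about": { "id": "https://w3id.org/class/esc/n0322" }
        }'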
+
As for the internals, MongoDB is used to manage followers lists and an Elasticsearch index is used to keep an archive of all payloads that have been distributed. This archive can be used to search and further explore metadata that has been distributed, e.g. by visualizing the distribution of subjects across all payloads.
+
The most challenging aspects of the implementation were to gain an understanding of Webfinger for user discovery and of the details of message signatures and how to validate them. “How to implement a basic ActivityPub server” was a good guidance here!
+
Outlook
+
We currently consider PubSub the least mature component of SkoHub. In the future, we would like to validate incoming Linked Data Notifications against a JSON schema that should be specific enough to ensure a consistent experience when viewing them e.g. in Mastodon but flexible enough to support additional use cases. We would also like to support ActivityPub on the publication side and Announce activities in order to enable use cases such as mentioning a SkoHub concept on Mastodon. We would really value your input on this!
+
+
ActivityPub Conference 2020
From 2-5 October the ActivityPub conference happened online where people using the ActivityPub protocol came together to discuss topics all around federated networks and the respective web standards. For presentations, a flipped classroom approach was chosen where talks would be uploaded before the conference and the live part would be Q&A sessions for each talk. On Sunday, there was an additional round of lightning talks and some birds of a feather (bof) sessions where – quite similar to a barcamp session – people interested in a topic could propose a session and meet with likeminded people.
On Friday, we had a session about SkoHub. Here is our previously recorded video:
+
+
With regard to creating and maintaining controlled vocabularies and assigning topics, (at least some) people who develop applications for the Fediverse have quite some interest in building on experiences and approaches from the library world. What we learned from our Q&A session is to prepare better next time, as there most certainly will be some people who haven’t fully watched the recording or for whom some time has passed since watching it: Next time we would have some slides ready to recap the basic concepts and some links to point to exemplary implementations and further information.
+
Here are some projects that are dealing with categories, taxonomies in the Fediverse:
+
+
CommonsPub – which builds on experiences from MoodleNet – is working on “[f]ederated taxonomies for topic-based search and discovery across instances.”
LearnAwesome.org helps people in managing and finding learning material online in very different formats. It supports following certain topics.
+
The “Rebooting Indymedia” project wants to use topic-based channels to create moderation workflows for building independent news sites from decentrally published content.
+
A related effort by Trolli Schmittlauch addresses the problem of how to create a comprehensive hashtag search & subscription in a federated social network. See this paper for details.
+
+
In a birds of a feather session about “Topics” and services we can subscribe to, some of the people working on tags, controlled vocabularies etc. came together and had a fruitful exchange, sorting out the different problems and the approaches that exist to address them. We are looking forward to further working on SkoHub and discussing common approaches to assigning or following controlled topics in the Fediverse.
+
+
SkoHub workshop at SWIB20
Wednesday was workshop day, when Adrian and Steffen offered a SkoHub workshop. They tried out a flipped classroom approach, created several video tutorials and wrote down walkthroughs for the participants to prepare themselves. This did not work out as intended. Lesson learned: always be prepared for a larger number of participants not being prepared.
+
+
SKOS Introduction workshops with SkoHub Vocabs
We facilitated two workshops in November with the goal of introducing participants to the Simple Knowledge Organization System (SKOS) by learning hands-on how to publish a small vocabulary with SkoHub Vocabs. The first one, “Eine Einführung in SKOS mit SkoHub Vocabs” (“An introduction to SKOS with SkoHub Vocabs”), was held as a cooperation between the Hochschulbibliothekszentrum NRW and the Göttingen eResearch Alliance with 14 German-speaking participants on 2021-11-02, see the workshop pad and the slides.
+
As the workshop worked quite well, we applied the same approach to our Workshop “An Introduction to SKOS with SkoHub-Vocabs” on 2021-11-30 at SWIB21 (slides) with around 20 participants.
+
+
Generally, we had the impression that participants of both workshops were having a good time; at least nobody left the conference room before the end of the workshop. Here are some notes and lessons learned we collected after the SWIB workshop:
+
+
We invited participants to share their screen to not be talking to the void. Around 4-5 participants followed our invitation which was ok for us.
+
Many participants joined without their microphones connected. We forgot to explicitly ask them to turn their mics on.
+
We introduced ourselves and used the BigBlueButton poll feature to get to know the participants and their previous experience a bit more.
+
We divided the workshop into two parts: an introduction as a frontal presentation and a hands-on part with discussion and Q&A phases in between. Though this worked quite well, we think it might be nice to switch more often between explanatory parts and hands-on parts.
+
Participants were a bit shy. We got some good feedback in the chat at the end, but only from some participants. Next time, we should prepare another poll for getting feedback at the end of the workshop.
+
+
With regard to the further development of SkoHub Vocabs, it became clear during the workshops that it would be great to have an automatic test with each commit that lets you know whether a SKOS/Turtle file in a repo is “SkoHub-ready”, i.e. conforms to the pattern that is supported by SkoHub Vocabs. Issue #91 is already addressing this need and should be worked on to accomplish this.
+
+
SkoHub Presentation at LIS Workshop 2021
Last Friday afternoon, the 2021 Workshop on Classification and Subject Indexing in Library and Information Science (LIS Workshop) took place, organized by the Working Group within the GfKL – Data Science Society. Adrian and Steffen had the chance to present SkoHub in the workshop’s first presentation. The slides can be viewed at https://pad.gwdg.de/p/lis-workshop21-skohub.
+
Here is an overview of the full programme, which comprised six talks:
+
+
Adrian Pohl (hbz, Cologne, Germany) & Steffen Rörtgen (GWDG, Göttingen, Germany): SkoHub: Publishing and using knowledge organization systems on the web
+
Colin Higgins (University of Cambridge, Cambridge, United Kingdom): Justice, governance, and the thesaurus – the Cambridge experience with ‘illegal aliens’
+
Gislene Rodrigues da Silva & Célia da Consolação Dias (Universidade Federal de Minas Gerais, Belo Horizonte Brasil): Subjective aspects of indexing photographs from visual communication using a reading model based on the complex method and the primary functions of the image
+
Heidrun Wiesenmüller (Stuttgart Media University, Stuttgart, Germany): Orientation and exploration – the presentation of subject headings in German catalog
+
Karin Schmidgall (Deutsches Literaturarchiv Marbach, Marbach, Germany) & Matthias Finck (Effective Webwork, Hamburg, Germany): Glückliche Funde - ein Katalog der Forschende auf neue Ideen und Pfade bringt / Lucky finds – a catalogue that leads researchers to new ideas and paths
+
Julijana Nadj-Guttandin (Deutsche Nationalbibliothek, Frankfurt, Germany) & Sarah Pielmeier (University and State Library, Münster, Germany): Ein neues und modulares Regelwerk für die verbale Inhaltserschließung / A new and modular standard for subject indexing
+
+
The workshop happened as part of the virtual conference “Data Science, Statistics & Visualisation and European Conference on Data Analysis 2021” (DSSV-ECDA 2021). The promotion for the workshop could probably have been better, e.g. the presentations weren’t even listed in the regular DSSV-ECDA programme. In the end, the speakers and moderators were mostly among themselves, with little additional audience. However, the talks discussed interesting topics and the discussion was lively.
+
+
Collaborating on improving SkoHub Vocabs
In a kickoff workshop, the SkoHub team at hbz has launched a cooperation with the Hamburg-based company effective WEBWORK to work on some pending issues regarding both functionality and design of SkoHub Vocabs. The issues and progress of the project can be followed in a Kanban board.
+
+
A big part of the project will concern a new functionality to support the grouping of concepts by creating SKOS collection pages (Issue #159). Another central goal of the cooperation is a redesign of the static sites that are generated by SkoHub Vocabs. Finally, a logo is to be developed for the SkoHub software suite that will hopefully capture the spirit of the software and the community behind it. We are looking forward to presenting the results within the next months.
+
The team from effective WEBWORK consists of software developers, a librarian and a designer whose skills overlap with those of the SkoHub community in many ways and especially regarding their enthusiasm for open source solutions. We are looking forward to this collaboration!
+
+
Things are moving at SkoHub
In this blog post we want to introduce to you a new member in the Open Infrastructure team at the Hochschulbibliothekszentrum NRW who will be working on SkoHub as well as invite you to a workshop to present and discuss our future plans for the SkoHub project.
+
After some time with not much happening (or even with shutting down some SkoHub services), we are happy to report a lot of movement in this space. We already announced the current project with the effective WEBWORK team to develop a new logo, improve the design and fix some minor issues in SkoHub. We are happy about the improvements made. Watch this space for more details in an upcoming post.
+
Welcome, Steffen!
+
We are very happy to welcome Steffen Rörtgen to the Open Infrastructure team at the Hochschulbibliothekszentrum NRW, who joined the team this November. In his former projects he already made heavy use of SkoHub Vocabs and contributed to the project. He will now focus on further SkoHub development in the context of the Metadaten.nrw project, which is funded by the Ministry of Culture and Science of North Rhine-Westphalia (MKW).
+
With the grant for this project, we have resources for further development and would like to discuss with you our plans, especially regarding the use of SkoHub for reconciliation and the ActivityPub-based publish/subscribe approach (SkoHub PubSub). Having already defined some work packages within the Metadaten.nrw project we would like to align these with use cases and ideas the SkoHub community has.
+
Upcoming workshop
+
Therefore we are happy to invite you to a small workshop on Thursday, the 17th of November. We will start at 10:00h CET and split the workshop into two parts with each one lasting about two hours:
+
In the first part we will give an overview of the past and current developments of SkoHub as well as an introduction to the Metadaten.nrw project and its plans for SkoHub. At the end of the first part we want to discuss use cases for the publish/subscribe approach of SkoHub as well as for reconciliation. Our community member Andreas Wagner already developed a prototype for reconciliation with SkoHub, which he will present.
+
In the second part of the workshop we will then deep dive into the reconciliation topic. We will discuss Andreas’ approach and develop a roadmap for SkoHub to implement the desired reconciliation functionalities.
If you are interested in the workshop, please send an email to skohub@hbz-nrw.de and let us know whether you are joining the whole workshop or just the first part.
+
Please be aware that we won’t give a general introduction to SkoHub and expect you to be familiar with SKOS and the approach of SkoHub.
+
+
Have U Seen The New Look?
We are happy to announce the new SkoHub logo and design we have deployed right in time for our SWIB22 workshop on Wednesday! In the last months, Kai Mertens and effective WEBWORK helped to work out this new look in the context of the project we have announced earlier. We have now updated the SkoHub website, this blog and the default SkoHub Vocabs setup to incorporate the new logo and design.
+
Here is an example of how a vocabulary built with SkoHub Vocabs will look now with this default design:
+
+
Notes from the November workshop
Collection of requirements from the community regarding PubSub and reconciliation
+
Presentation of SkoHub Reconcile prototype by Andreas Wagner
+
+
We split the workshop into two parts. First, we gave a general overview about the current state of SkoHub and the renewed funding through the Metadaten.nrw project. Then we had a general discussion about SkoHub PubSub – the module to connect a SKOS vocab with the Fediverse – as well as Andreas Wagner’s reconciliation prototype. See also the slides for the first part of the workshop.
+
In the second half Andreas gave us a technical deep dive into his reconciliation prototype, walked us through the code and we discussed the architecture as well as future development and integration into the SkoHub ecosystem.
+
In the following, we will go deeper into what happened in the different parts.
+
Current state of SkoHub
+
Currently, SkoHub Vocabs is by far the most used SkoHub module. It is used by the hbz, the metadata standardization groups around KIM, WirLernenOnline, The Institute for Educational Quality Improvement (IQB), in research projects in the area of digital humanities and by other people and institutes to publish their controlled vocabularies.
+
The browser plugin SkoHub Editor as well as the PubSub module haven’t been used in production yet and have been shut down temporarily in March 2022 due to missing resources.
+
In 2022 the work on SkoHub started again when we partnered with effective WEBWORK (eWW) to redesign the web pages, create a new logo, improve UI configuration and address other issues as for example the support of skos:Collection. (See the project kanban for an overview.)
+
Decouple software and services
+
The general idea is to further decouple the software “SkoHub” from its running instances. Therefore we also wanted eWW to work on UI configuration possibilities, so other institutes or projects can easily brand their SkoHub instance.
+
In the future, we will move the hosted instance, currently running at skohub.io, to metadaten.nrw, the project which funds the further development of SkoHub.
+
Metadaten.nrw
+
At the end of 2021, hbz secured some funding by the Ministry of Culture and Science of North Rhine-Westphalia (MKW) for a project called Metadaten.nrw. It consists of two sub-projects, with one called “Infrastructure Initiative Metadata Services” being located in the Open Infrastructure team (OI) at hbz, where SkoHub development will take place.
+
We got four positions funded, of which two are already filled, amongst them Steffen for SkoHub development. The goal of the project is to expand the community of users for the existing metadata infrastructure provided by hbz/OI, with a focus on libraries and scholars in North Rhine-Westphalia (NRW), and to establish hbz as a competence center for metadata in NRW.
+
Accordingly, we plan to develop SkoHub further regarding the following topics:
+
+
Fediverse integration: Further development of SkoHub PubSub in the context of a concrete use case
+
Reconciliation: Bringing the SkoHub reconciliation module into production
+
Possibly support Annif integration in a later project phase
+
Offer SkoHub tutorials and workshops
+
+
Community, PubSub & Reconciliation
+
To further encourage contributions like the one from Andreas with the reconciliation prototype, we will set up contributing guidelines to have a clear and transparent definition of the development and deployment processes.
+
SkoHub PubSub
+
Afterwards we made a small (re)introduction to SkoHub PubSub and discussed possible use cases. We developed ideas about SkoHub PubSub serving as a communication hub between researchers for their research fields. Raphaëlle Lapotre came up with a concrete use case: they currently have some pains in the context of the Timel Thesaurus, an indexing thesaurus for huge amounts of digitized pictures of medieval iconography. Currently, there are problems with the task of storing the large amounts of images centrally in a repository. Researchers could hold the files locally in their Nextcloud and publish the image metadata to inboxes of SKOS concepts. A central service could then listen to the data provided by each concept’s inbox and then display the metadata with a link pointing to the image in its storage location. There are actually two possible use cases: one with the digitized illumination pictures of the Ahloma lab (EHESS, sample), the second one with painted ceiling pictures from all over the Mediterranean area, collected by an association of scholars and retired volunteers. Possibly, the support for ActivityPub in Nextcloud could help with such a project.
+
Another topic was the idea of community building around concepts. The Open Educational Resource Search Index (OERSI) as well as the WirLernenOnline project already use elaborated vocabularies to index their resources. Interested humans could easily follow these concepts and engage in discussions around them.
+
This is also applicable to researchers, who will be able to build up a topic-specific database and open discussions about their research in the Fediverse. This also raised practical questions about what happens on the notification side with broader and narrower concepts: If I’m following a concept, do I also want to get notifications about its narrower or broader concepts? These are questions that can be discussed further in our community.
+
SkoHub Reconciliation
+
Following the PubSub discussion, Andreas presented his reconciliation prototype. It is based on the Reconciliation API spec developed by the W3C Entity Reconciliation Community Group, so it is interoperable and can be used in any kind of application that acts as a reconciliation client. Andreas’ implementation already worked in OpenRefine as well as in TEI Publisher’s annotation tool. After showing the implementation with some examples we went into a technical deep dive.
+
Andreas walked us through the code and we discussed the current implementation as well as the future architecture of the SkoHub modules. His current approach is based on the SkoHub Vocabs webhook part and lending code from SkoHub PubSub regarding the elasticsearch indexing.
+
The discussion resulted in the proposal to separate SkoHub Vocabs from the webhook module and thereby further separate the concerns of the respective modules. He integrated a doreconc query parameter into the webhook, which triggers a script that loads the vocabulary into the reconcile prototype.
+
After the workshop we transferred the skohub-reconcile repository from Andreas to the SkoHub organization and are happy to start further developing it in 2023.
+
Final thoughts
+
The workshop was a great event to discuss with SkoHub users and those who want to be. We collected valuable feedback and ideas for development in the upcoming two years.
Especially the sudden rise in awareness of the Fediverse opens up interesting use cases for SkoHub PubSub, which we are happy to engage in. The highlight of the workshop was the presentation of Andreas’ reconciliation prototype and its transfer to the SkoHub organization. This is a good example of the benefits of open source and use-case-driven development.
+
We are looking forward to future community events, more use cases and even more modules to be developed in the SkoHub ecosystem.
+
+
Moving to test-driven development and updating existing tests
February 09, 2023 | Steffen Rörtgen
For quite some time we have been adding new features to SkoHub Vocabs like switching languages, display of all relevant properties on the concept page and support for skos:Collection. Unfortunately, no tests were added to actually cover these new functionalities. This led to some surprises now and then, for example when we noticed that at one point language tags did not show up when visiting a Collection page directly.
+
Originally, SkoHub Vocabs already contained some tests, so being the maintainer of SkoHub Vocabs I decided to follow up on that and got myself a bit more familiar with the topic. I quickly stumbled over Test Driven Development (TDD) and, though I had heard of it before, I decided to dive a little deeper and check whether that pattern might be appropriate for SkoHub Vocabs and the other SkoHub modules (and maybe my coding approaches in general).
+
The general idea of TDD is as follows (borrowed heavily from the Wikipedia article):
+
+
Requirements of a new feature are first translated into test cases
+
Tests are written
+
Code is written
+
+
+
This leads to the following development cycle:
+
+
+
Write tests: This ensures that the developer actually understands the user requirements. Usually this is done with the help of use cases and user stories.
+
+
+
Run tests: The tests should now fail. If not, it might be the case that the actual feature is already present in the code and no further code needs to be written. Maybe documentation has to be updated accordingly.
+
+
+
Write the simplest code that passes the new tests: The code can (and should) later be refactored, so it can be ugly at this point.
+
+
+
All tests should now pass: If the code is still failing it should be revised till all tests pass.
+
+
+
Refactor as needed: Your tests now verify that the new feature is working as expected. If any tests during refactoring fail, you now can and will immediately correct your code.
+
+
+
Consequences on SkoHub development
+
This approach has some consequences for the development of SkoHub modules.
+These changes will also be reflected in the yet to be published CONTRIBUTING.md.
+Issues for new features should contain use cases and user stories as well as some notes that indicate when the feature is actually correctly implemented.
Only when all of this is present can the issue be marked as ready.
+The use cases and notes can then be used to write the tests and follow the above mentioned development cycle.
+
Regarding code review, this approach also has some consequences. Code review should only be approved if tests were added for the new feature or the tests were adjusted in case of bug fixing.
+
Testing Strategies and Technologies in SkoHub Vocabs
+
At the end of this blog post I want to give you a short overview of the testing strategies and technologies currently used in SkoHub Vocabs development. We use unit tests, integration tests and end-to-end tests, and we try to write more unit tests than integration tests, and more integration tests than end-to-end tests. The reason for this is that end-to-end tests take long to run and are quite expensive in terms of computing power and time. Unit and integration tests, on the other hand, are cheap and can auto-run in the background on every save, giving you immediate feedback when something is broken.
+
For unit and integration tests we use Jest and the React Testing Library, since Gatsby – with which SkoHub Vocabs is built – uses React.
+Some of the older tests used Enzyme, but after upgrading to React 18 I noticed that Enzyme was no longer working, because the project is no longer maintained.
+After some research I found the React Testing Library to be the most recommended testing library and migrated the old Enzyme tests.
+After some initial training, writing tests actually became quite handy and even fun.
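For illustration, a minimal unit test with Jest and React Testing Library could look like the following sketch. The component is a stand-in defined inline; it is not taken from the actual SkoHub Vocabs code base:

// header.test.js – hypothetical component test, not from the actual SkoHub Vocabs code base
import React from "react"
import { render, screen } from "@testing-library/react"

// Minimal stand-in for a real SkoHub Vocabs component
const Header = ({ title }) => (
  <header>
    <h1>{title}</h1>
  </header>
)

describe("Header", () => {
  it("renders the vocabulary title as a heading", () => {
    render(<Header title="Example Concept Scheme" />)
    // getByRole throws if no matching heading is found, so this doubles as the assertion
    expect(screen.getByRole("heading", { name: "Example Concept Scheme" })).toBeTruthy()
  })
})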
+
Adding tests is by no means finished at this point, but a lot of the recently added features now have some proper test coverage. This has already required a few changes to the code base here and there, when I noticed that things weren’t working as expected.
+
+
For end-to-end tests I decided to go with Cypress, since it has excellent documentation, is fully open source and runs tests in a real browser.
+
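A minimal Cypress spec could look like the following sketch; the route and the expected label are assumptions for illustration and not taken from the actual test suite:

// cypress/e2e/concept.cy.js – hypothetical end-to-end test
describe("concept page", () => {
  it("displays the preferred label of a concept", () => {
    // baseUrl is assumed to point at a locally built and served vocabulary
    cy.visit("/w3id.org/example-cs/concept1.html")
    cy.get("h1").should("contain.text", "Example Concept")
  })
})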
+
All of the tests are integrated into the SkoHub Vocabs CI pipeline and run before a new Docker image gets built.
+
+
\ No newline at end of file
diff --git a/2023-11-22-shacl-shape/index.html b/2023-11-22-shacl-shape/index.html
new file mode 100644
index 0000000..79f66ac
--- /dev/null
+++ b/2023-11-22-shacl-shape/index.html
@@ -0,0 +1,85 @@
+Development of SKOS SHACL shape | Skohub Blog
To improve the error messages thrown by SkoHub Vocabs for invalid RDF Turtle files, we decided to implement a validation step before the static site of a vocabulary gets built with the Gatsby framework.
+This validation step should provide more meaningful error messages than the currently cryptic ones thrown by Gatsby.
+While we could have gone with one of the existing SKOS validator tools like SKOS Play!, SKOSify or PoolParty (which is pretty expensive), we decided to go with a more generic approach and define the shape rules not in code, but in data.
+
+
If you want to validate the shape of an RDF graph, you currently have two options to do that. You can either use Shape Expressions (ShEx) or the Shapes Constraint Language (SHACL).
+We decided to go with SHACL for the following reasons:
+
+
slightly better tooling (e.g. the Zazuko team provides the JS validation library rdf-validate-shacl)
Unfortunately, the SKOS-XL shape did not work with our tooling (Apache Jena SHACL) out of the box.
+Therefore we decided to build a SKOS shape from the ground up, based on the SKOS Reference.
+
SKOS Reference Shape
+
The goal was to implement every consistency example from the SKOS Reference as a test case for the shape.
+To accomplish this, we needed on the one hand to formalize the class and property definitions from the spec as well as the integrity conditions.
+On the other hand, we needed a triple store with reasoning capabilities to apply these rules to the very basic examples in the reference.
+We used Apache Jena tooling for this and built jena-docker containers based on the Docker containers of this repo.
+The SKOS class and property definitions are defined in this file.
+The workflow for validating the SKOS shape is as follows:
Based on the query result, an error message is output.
+
+
This way we managed to validate all valid examples from the SKOS Reference up to example 68 as valid, and all invalid examples as invalid (entailment vs. non-entailment examples were left out).
+
SkoHub Shape
+
In SkoHub Vocabs we are a bit stricter regarding some aspects of the SKOS Reference.
+For example, we want every skos:Concept to have at least one skos:prefLabel.
+Therefore we developed a SkoHub-specific shape, skohub.shacl.ttl.
+In contrast to the generic SKOS shape, this shape does not contain any SPARQL-based SHACL constraints.
+Though it is possible, and especially useful for more elaborate checks, to use SPARQL-based constraints, the available tools (at least the JavaScript library rdf-validate-shacl) do not support them.
+
As a result, validation errors and warnings help SkoHub users improve the quality of their vocabularies. See, for example, the validation warning for a missing license:
+
-----------Warning--------------
+Message: [
+ Literal {
+ value: 'A provided license increases reusability of a vocabulary. Should be an URI.',
+ language: '',
+ datatype: NamedNode { value: 'http://www.w3.org/2001/XMLSchema#string' }
+ }
+]
+Path: http://purl.org/dc/terms/license
+Node, where the error occured: http://w3id.org/example-cs/
+Severity of error: http://www.w3.org/ns/shacl#Warning
+
Or the validation error if the object of skos:hasTopConcept is not a skos:Concept:
+
-----------Violation--------------
+Message: [
+ Literal {
+ value: 'The target class for hasTopConcept should be skos:Concept',
+ language: '',
+ datatype: NamedNode { value: 'http://www.w3.org/2001/XMLSchema#string' }
+ }
+]
+Path: http://www.w3.org/2004/02/skos/core#hasTopConcept
+Node, where the error occured: http://w3id.org/example-cs/
+Severity of error: http://www.w3.org/ns/shacl#Violation
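Reports like the ones above can be produced with the JavaScript library rdf-validate-shacl mentioned earlier. The following is a minimal sketch of such a validation run; the file names are assumptions and this is not the actual SkoHub Vocabs code:

// validate.js – minimal sketch using rdf-ext, an N3 parser and rdf-validate-shacl
import fs from "fs"
import rdf from "rdf-ext"
import ParserN3 from "@rdfjs/parser-n3"
import SHACLValidator from "rdf-validate-shacl"

const loadDataset = async (filePath) => {
  const parser = new ParserN3({ factory: rdf })
  return rdf.dataset().import(parser.import(fs.createReadStream(filePath)))
}

const shapes = await loadDataset("skohub.shacl.ttl")
const data = await loadDataset("vocabulary.ttl")

const validator = new SHACLValidator(shapes, { factory: rdf })
const report = await validator.validate(data)

for (const result of report.results) {
  // message, path, focusNode and severity correspond to the fields shown in the reports above
  console.log(result.severity.value)
  console.log(result.message.map((m) => m.value).join(" "))
  console.log(result.path && result.path.value)
  console.log(result.focusNode && result.focusNode.value)
}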
+
Community
+
Thanks to a lightning talk at SWIB23 (slides, recording) we got some attention for the shape.
+Suggestions made by Jakob Voß, Osma Suominen and Antoine Isaac already greatly improved the shape.
+Further suggestions and improvements as well as your use cases are highly welcome.
+
Outlook
+
We were quite surprised that we did not find any usable existing SKOS SHACL shape.
+Hopefully our work will help others validate their SKOS files and improve the overall quality of vocabularies.
+There is currently still an open ticket for implementing the qSKOS best practice rules.
+Any feedback and collaboration on the shapes is welcome!
+
+
\ No newline at end of file
diff --git a/2024-01-18-reconcile/index.html b/2024-01-18-reconcile/index.html
new file mode 100644
index 0000000..486de5f
--- /dev/null
+++ b/2024-01-18-reconcile/index.html
@@ -0,0 +1,451 @@
+Supporting the Reconciliation Service API for SKOS vocabularies | Skohub Blog
Supporting the Reconciliation Service API for SKOS vocabularies
January 22, 2024 | Steffen Rörtgen
Reconciliation is the process of integrating data from sources that do not share common unique identifiers by identifying records that refer to the same entities.
+This is mostly done by comparing the attributes of the entities.
+For instance, two entries in a catalogue of persons that share the same name, date of birth, place of birth and date of death will probably refer to the same person.
+Linking these two entries by adding the identifier from the other data source is the process of reconciliation. This allows you to extend your data by taking over information from the linked record.
+
Multiple tools exist to facilitate this process, with OpenRefine being the most prominent one.
+To align and standardize the way data is provided to these tools, the Reconciliation Service API is being drafted by the Entity Reconciliation Community Group within the World Wide Web Consortium (W3C).
+The specification defines endpoints that data services can expose so that applications like OpenRefine can work with that data.
+A number of other services have already implemented the specification, like TEI Publisher, Cocoda, or the Alma Refine plugin for the commercial library management system Alma.
+
Reconciliation and SKOS
+
Simple Knowledge Organization System (SKOS) is an established standard for modeling controlled vocabularies as Linked Data. Thus, SKOS vocabularies are often targets of reconciliation efforts, since you can improve your local data by enriching strings with identifiers from a controlled vocabulary. So SKOS and the Reconciliation Service API often go hand in hand. However, there has been no easy way to set up a reconciliation endpoint for an existing SKOS vocabulary. We decided to change that by developing the new SkoHub component SkoHub Reconcile.
+
Andreas Wagner had already built a reconciliation prototype for SKOS vocabularies (see also our Workshop Blog Post).
+We picked this prototype up, refactored it and moved it into a container-based infrastructure.
+We also added support for v0.2 of the reconciliation spec.
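To give an impression of what the Reconciliation Service API looks like from a client’s perspective, here is a rough sketch of the two basic interactions. The endpoint URL is a placeholder, and the response fields follow the specification rather than any SkoHub-specific behaviour:

// query-reconcile.js – sketch of a client-side reconciliation query
// The endpoint URL is a placeholder; use the service manifest URL you get after uploading.
const endpoint = "https://reconcile.example.org/my-account/my-vocab"

// A plain GET on the endpoint returns the service manifest (name, identifierSpace, ...)
const manifest = await (await fetch(endpoint)).json()
console.log(manifest.name)

// Queries are sent as a JSON object under the `queries` form parameter
const queries = { q0: { query: "Mathematics", limit: 3 } }
const response = await fetch(endpoint, {
  method: "POST",
  headers: { "Content-Type": "application/x-www-form-urlencoded" },
  body: new URLSearchParams({ queries: JSON.stringify(queries) }),
})

const { q0 } = await response.json()
// Each candidate has an id (the concept URI), a name (e.g. the prefLabel), a score and a match flag
console.log(q0.result)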
+
SkoHub Reconcile Publish
+
To make it easy to upload vocabularies to the reconciliation service, a front end was developed, which you can try out at https://reconcile-publish.skohub.io/.
+
+
Every vocabulary that passes the SkoHub SHACL Shape (see our blog post) should work for uploading to the reconcile service.
+The only additional requirement is to provide a vann:preferredNamespaceUri.
+As you can see in the screenshot, you also have to provide an account and a language.
+As for the account, you can currently choose whatever you want; just make sure it is unique enough so that your dataset (i.e. your vocabulary) does not get overwritten by someone else.
+Since a lang parameter has only become available in the current draft version of the reconciliation specification and is not yet implemented in SkoHub Reconcile, the current version of the SkoHub Reconcile service requires you to specify the language you want to use for reconciliation. We will improve this in the future along with the development of the specification.
+
Example: Usage in OpenRefine
+
Let’s see how we can use the service with OpenRefine.
+
First, we upload the vocabulary.
+We will use a classification of subject groups.
+
+
+After a successful upload of the Turtle file, we are presented with a URI that leads to the “Service Manifest” of our reconciliation service.
Now that the reconciliation service is set up with our data, let’s see how we can use it in OpenRefine.
+
For demo purposes we use a small vocabulary of a few discipline names:
+
+
By clicking on the dropdown button of the column we want to reconcile, we choose “Reconcile” -> “Start reconciling…“.
+
+
+After clicking “Add standard service”, we can enter the URL we were provided by the upload service:
+
+
Then we just have to start the reconciliation by clicking “Start reconciling…” and our reconciliation service will be queried with the terms in our OpenRefine project.
+We are then presented with the results:
+
+
This already looks good!
+Now we can choose matches by clicking the checkmark or get additional information by hovering over the proposed entry from the reconcile service.
+
+
+If we want, we can also search through our vocabulary by clicking “Search for match”:
+
+
+After selecting the appropriate matches, we have successfully reconciled our data:
Feedback is very much appreciated: via email (skohub@hbz-nrw.de), as an issue or – primarily for German-speaking users – in the newly set up Discourse forum metadaten.community.
+
+Our next step will be integrating the above-mentioned lang parameter to be able to serve all languages of a vocabulary without the need to specify one beforehand.
+
+
\ No newline at end of file
diff --git a/2024-01-24-uris-without-language-tags/index.html b/2024-01-24-uris-without-language-tags/index.html
new file mode 100644
index 0000000..6b23254
--- /dev/null
+++ b/2024-01-24-uris-without-language-tags/index.html
@@ -0,0 +1,60 @@
+Re-working SkoHub Vocabs internationalization features | Skohub Blog
Re-working SkoHub Vocabs internationalization features
January 31, 2024 | Steffen Rörtgen
In the past: Internationalization with drawbacks
+
If you have worked with SkoHub Vocabs before, you might have noticed that the URLs in the address bar had a little special feature that you don’t encounter very often: a language tag before the .html.
We wanted internationalization features to be able to navigate between multiple languages.
+Normally this is done via a subdomain or by adding a language tag after the domain name, like https://w3id.org/kim/hochschulfaechersystematik/en/.
+But this does not work for SkoHub Vocabs, since we use the URIs from the Turtle files as IDs for the concepts.
+Changing the URI by adding a language tag somewhere would break the whole concept of SkoHub Vocabs.
+
So it was decided to add the language at the end of the URL by using Apache’s MultiViews feature.
+But this led to some drawbacks:
+
+
SkoHub Vocabs needed to be served by an Apache web server
+
The web server needed special configuration
+
SkoHub Docker Vocabs, which is served via GitHub Pages, always needed a specific link to an index.{language}.html file, since GitHub Pages only looks for an index.html
+
The build directory grew quite a bit, since dedicated HTML pages were built for every language
+
+
Switching to one page for all languages
+
In order to overcome these issues, we decided to change this behaviour and build just one HTML page with functionality to switch languages. The displayed language is now chosen in the following way:
+
+
by using your browser language
+
if you have switched languages in the application, the chosen language is used
+
if the preferred language is not available, a default language present in the vocabulary is used
+
+
To point users to a specific language, you can use a query parameter lang= like:
+
https://w3id.org/kim/hcrt/scheme?lang=uk
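As a rough sketch (simplified, and not the actual SkoHub Vocabs implementation), the selection logic corresponds to something like this:

// chooseLanguage.js – simplified sketch of the selection order described above
const chooseLanguage = (vocabularyLanguages, userSelection) => {
  // an explicit ?lang= query parameter points users to a specific language
  const fromQuery = new URLSearchParams(window.location.search).get("lang")
  if (fromQuery && vocabularyLanguages.includes(fromQuery)) return fromQuery

  // a language the user has switched to in the application
  if (userSelection && vocabularyLanguages.includes(userSelection)) return userSelection

  // otherwise the browser language, if the vocabulary provides it
  const browserLanguage = navigator.language.split("-")[0]
  if (vocabularyLanguages.includes(browserLanguage)) return browserLanguage

  // fall back to a default language present in the vocabulary
  return vocabularyLanguages[0]
}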
+
Since SkoHub Vocabs also used the language tag of the URL internally to determine which language to serve, a lot of changes had to be made in the codebase.
+But overall this resulted in a much smaller size of the built vocabularies and more flexibility in serving them.
+
Benefits of the new approach
+
This new internationalization approach brings lots of improvements:
+
+
SkoHub Vocabs is now independent of the underlying web server
+
The size of the vocabularies is drastically reduced, especially for vocabularies with lots of languages
+
SkoHub Docker Vocabs is now simpler to set up, since there are only “normal” index.html files that GitHub Pages knows how to handle
+
+
What to do if I’m running my own webhook server?
+
If you are running your own webhook server, you should upgrade in the following way:
+
+
Follow the steps outlined in the webhook repository to rebuild vocabularies. This will rebuild all still existing branches you are currently serving.
+
Set up a redirect in your Apache config, so that links that still have ...de.html will be redirected to ...html?lang=de:
+
+
# Redirect from ...filename.LANGCODE.html to ...filename.html?lang=LANGCODE
+ RewriteRule ^(.+)\.([a-z]{2})\.html$ $1.html?lang=$2 [L,R=301]
+
+
After that you should be good!
+
+
Anything else?
+
While developing the script to rebuild all existing vocabularies, I noticed that we are serving a lot of branches that no longer exist.
+SkoHub Webhook currently builds a vocabulary for every branch you set up and push to.
+But the webhook service does not get notified when a branch is deleted.
+This way we end up with lots of files for branches that no one needs anymore.
+In order to clean this up a bit, we will soon add a script that cleans up the dist directory and removes those no-longer-needed files.
+
+
\ No newline at end of file
diff --git a/2024-03-21-skohub-pages/index.html b/2024-03-21-skohub-pages/index.html
new file mode 100644
index 0000000..d6edbf4
--- /dev/null
+++ b/2024-03-21-skohub-pages/index.html
@@ -0,0 +1,240 @@
+Publishing SKOS the easy way | Skohub Blog
With SkoHub Pages we now provide a very simple way for publishing your SKOS vocabulary from a GitHub repository. It only involves 5-6 steps:
+
1. Fork the skohub-pages repo
+
Click the “Fork” button in the top-right corner of the SkoHub Pages repo. You can change the name of your fork to whatever you like, e.g. my-shiny-vocab. See also the GitHub fork documentation.
+
+
2. Activate GitHub Actions
+
+
3. Configure GitHub Pages branch
+
Go to “Settings”, navigate to the “Pages” setting and select gh-pages as the branch your site is being built from.
+
+
4. Update pages URL
+
Go back to the main page of your repo and click the little gear icon in the top right of the “About” section. Check the box at “Use your GitHub Pages website”.
+
+
+
5. Start committing
+
Now you can add a commit to the main branch, adjusting the example vocabularies or adding a new Turtle file. The changes will automatically be published to your GitHub Pages website, which is now linked at the top-right of your GitHub repo (sometimes it takes a little while to see the changes; remember to do a hard refresh).
+
6. Set your GitHub Pages URL as namespace (optional)
+
See section “Resolving custom domains” below ⬇️
+
Utilizing GitHub Actions & Pages
+
Not all projects or individuals involved in the creation of controlled vocabularies are able to or have the resources to run their own infrastructure. Thus, we have been pursuing this approach – formerly under the name of “skohub-docker-vocabs” – to utilize Docker and GitHub infrastructure for publishing SKOS vocabularies with SkoHub Vocabs. Specifically, the workflow relies on “GitHub Pages” and “GitHub Actions”. With GitHub Pages it is possible to host websites on GitHub infrastructure; GitHub Actions are used for automated tests and deployments.
+
We have written a GitHub Action that ensures that, after each push to the repository, a process is started which builds the vocabularies with SkoHub Vocabs.
+The built vocabulary is then pushed to a separate git branch, gh-pages.
+As seen above, GitHub Pages is configured to deliver the HTML pages from this gh-pages branch.
+
We have been using this approach in various introductory SKOS and SkoHub workshops.
+However, in the past the workflow required some adjustments to the GitHub Action, so errors could quickly creep in. We are happy to have improved this considerably and made the process much less error-prone! 🎉
+
The relevant information is now set directly as environment variables and all other customizations can be changed via the GitHub GUI, so the workflow is now much more user-friendly. But that’s not all!
+
Resolving custom domains
+
Although with the presented approach a custom vocabulary could be provided without your own infrastructure, the domains did not resolve to the GitHub Pages site.
+This means that a concept scheme that uses URIs based on the GitHub Pages domain (e.g. https://myhandle.github.io/skohub-pages/) could not be resolved so far. In the past, in order to mitigate this, we recommended setting up a redirect via w3id or purl.org.
+Of course, it still makes sense to set up a redirect (in case the vocabulary moves somewhere else). However, it is now also possible to use the domain that is assigned via GitHub Pages and quickly set up a fully working SKOS vocabulary with resolving concept URIs, which can come in handy for prototyping.
+
To do this, a config.yaml must be created in the repo.
+The respective domain must then be entered under the key custom_domain.
+Example: your GitHub Pages domain is https://myhandle.github.io/skohub-pages/. Then provide https://myhandle.github.io/skohub-pages/ as custom_domain in your config.yaml.
+
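For example, the relevant entry in config.yaml could look like this (the domain is a placeholder; use your own GitHub Pages URL):

# config.yaml
custom_domain: https://myhandle.github.io/skohub-pages/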
The base of your concept scheme could then be something like: https://myhandle.github.io/skohub-pages/myvocab/
+There are lots of reasons why people might not want to use the GitHub infrastructure owned by Microsoft for their SKOS publication workflows. That’s why we will be looking into replacing as much of this workflow as possible with generic git-based tooling for triggering the build. The goal is to support such an easy SKOS publishing workflow on other forges like GitLab or Forgejo. The work on this happens around this issue: https://github.com/skohub-io/skohub-pages/issues/19
+
Let us know if you have some good implementation ideas or more wishes for future development!
+
+
\ No newline at end of file
diff --git a/about/index.html b/about/index.html
new file mode 100644
index 0000000..c3e9545
--- /dev/null
+++ b/about/index.html
@@ -0,0 +1,10 @@
+About | Skohub Blog
SkoHub supports a novel approach for finding content on the web. The general idea is to extend the scope of Knowledge Organization Systems (KOS) so that they also act as communication hubs for publishers and information seekers. In effect, SkoHub allows you to follow specific subjects in order to be notified when new content about them is published.
The approach is realized by putting Knowledge Organization Systems online according to the SKOS standard. Additionally, they are exposed using the (social) networking protocols ActivityPub and Linked Data Notifications. This effectively turns the published vocabularies into hubs that provide structured metadata about and links to web content in real time.
The project to create a production-ready version of SkoHub has been carried out by the North Rhine-Westphalian Library Service Centre (hbz) in cooperation with graphthinking GmbH, with four deliverables. The core is the back-end infrastructure for publishing vocabularies on the web (SkoHub Vocabs) and for receiving and pushing notifications (SkoHub PubSub). Additionally, we provide an editor to describe web resources according to a common metadata schema and to send notifications (SkoHub Editor). The editor can also be used as a browser plugin for Firefox and Chrome (SkoHub Extension).
+
+
\ No newline at end of file
Please use a different name in path "${o.path}".`);const t=decodeURIComponent(n);p[r[1]]=t}else if(t!==n){e=!0;break}}if(!e){n={route:o,params:p,uri:"/"+a.slice(0,h).join("/")};break}}return n||r||null},k=(e,t)=>R([{path:e}],t),O=(e,t)=>{if(C(e,"/"))return e;const[n,r]=e.split("?"),[o]=t.split("?"),a=I(n),i=I(o);if(""===a[0])return A(o,r);if(!C(a[0],".")){const e=i.concat(a).join("/");return A(("/"===o?"":"/")+e,r)}const s=i.concat(a),c=[];for(let u=0,l=s.length;u{const[n,r=""]=e.split("?");let o="/"+I(n).map((e=>{const n=T.exec(e);return n?t[n[1]]:e})).join("/");const{location:{search:a=""}={}}=t,i=a.split("?")[1]||"";return o=A(o,r,i),o},j=(e,t)=>{const n=e=>D(e);return I(e).filter(n).sort().join("/")===I(t).filter(n).sort().join("/")},T=/^:(.+)/,D=e=>T.test(e),N=e=>e&&"*"===e[0],L=(e,t)=>({route:e,score:e.default?0:I(e.path).reduce(((e,t)=>(e+=4,(e=>""===e)(t)?e+=1:D(t)?e+=2:N(t)?e-=5:e+=3,e)),0),index:t}),M=e=>e.map(L).sort(((e,t)=>e.scoret.score?-1:e.index-t.index)),I=e=>e.replace(/(^\/+|\/+$)/g,"").split("/"),A=(e,...t)=>e+((t=t.filter((e=>e&&e.length>0)))&&t.length>0?`?${t.join("&")}`:""),H=["uri","path"],U=(e,t)=>{const n=Object.keys(e);return n.length===Object.keys(t).length&&n.every((n=>t.hasOwnProperty(n)&&e[n]===t[n]))},W=e=>e.replace(/(^\/+|\/+$)/g,""),q=e=>t=>{if(!t)return null;if(t.type===o.Fragment&&t.props.children)return o.Children.map(t.props.children,q(e));if(i(t.props.path||t.props.default||t.type===S,`: Children of must have a \`path\` or \`default\` prop, or be a \`\`. None found on element type \`${t.type}\``),i(!!(t.type!==S||t.props.from&&t.props.to),` requires both "from" and "to" props when inside a .`),i(!(t.type===S&&!j(t.props.from,t.props.to)),` has mismatched dynamic segments, ensure both paths have the exact same dynamic segments.`),t.props.default)return{value:t,default:!0};const n=t.type===S?t.props.from:t.props.path,r="/"===n?e:`${W(e)}/${W(n)}`;return{value:t,default:t.props.default,path:t.props.children?`${W(r)}/*`:r}},F=["innerRef"],B=["to","state","replace","getProps"],Q=["key"];let{forwardRef:J}=r||(r=n.t(o,2));void 0===J&&(J=e=>e);const $=()=>{},Z=J(((e,t)=>{let{innerRef:n}=e,r=c(e,F);const{baseuri:a}=y(),{location:i}=w(),{to:u,state:l,replace:p,getProps:d=$}=r,h=c(r,B),m=O(u,a),g=encodeURI(m),v=i.pathname===g,b=C(i.pathname,g);return o.createElement("a",s({ref:t||n,"aria-current":v?"page":void 0},h,d({isCurrent:v,isPartiallyCurrent:b,href:m,location:i}),{href:m,onClick:e=>{if(h.onClick&&h.onClick(e),(e=>!e.defaultPrevented&&0===e.button&&!(e.metaKey||e.altKey||e.ctrlKey||e.shiftKey))(e)){e.preventDefault();let t=p;if("boolean"!=typeof p&&v){const e=c(s({},i.state),Q);t=U(s({},l),e)}f(m,{state:l,replace:t})}}}))}));Z.displayName="Link",Z.propTypes={to:a.string.isRequired};class G extends o.Component{constructor(...e){super(...e),this.displayName="ReactUseErrorBoundary"}componentDidCatch(...e){this.setState({}),this.props.onError(...e)}render(){return this.props.children}}const K=o.createContext({componentDidCatch:{current:void 0},error:void 0,setError:()=>!1});function z({children:e}){const[t,n]=o.useState(),r=o.useRef(),a=o.useMemo((()=>({componentDidCatch:r,error:t,setError:n})),[t]);return o.createElement(K.Provider,{value:a},o.createElement(G,{error:t,onError:(e,t)=>{n(e),null==r.current||r.current(e,t)}},e))}z.displayName="ReactUseErrorBoundaryContext";const V=function(e){var t,n;function r(t){return o.createElement(z,null,o.createElement(e,s({key:"WrappedComponent"},t)))}return 
r.displayName=`WithErrorBoundary(${null!=(t=null!=(n=e.displayName)?n:e.name)?t:"Component"})`,r}((({history:e=h,children:t})=>{const{location:n}=e,[r,a]=o.useState({location:n}),[i]=function(e){const t=o.useContext(K);t.componentDidCatch.current=void 0;const n=o.useCallback((()=>{t.setError(void 0)}),[]);return[t.error,n]}();if(o.useEffect((()=>{e._onTransitionComplete()}),[r.location]),o.useEffect((()=>{let t=!1;const n=e.listen((({location:e})=>{Promise.resolve().then((()=>{requestAnimationFrame((()=>{t||a({location:e})}))}))}));return()=>{t=!0,n()}}),[]),i){if(!E(i))throw i;f(i.uri,{replace:!0})}return o.createElement(v.Provider,{value:r},"function"==typeof t?t(r):t||null)})),X=({children:e})=>{const t=w();return t?e(t):o.createElement(V,null,e)},Y=({url:e,children:t})=>{const n=e.indexOf("?");let r,a="";return n>-1?(r=e.substring(0,n),a=e.substring(n)):r=e,o.createElement(v.Provider,{value:{location:{pathname:r,search:a,hash:""}}},t)},ee=({path:e,children:t})=>{const{baseuri:n}=y(),{location:r}=w(),o=O(e,n),a=k(o,r.pathname);return t({location:r,match:a?s({},a.params,{uri:a.uri,path:e}):null})},te=["uri","location","component"],ne=["children","style","component","uri","location"],re=e=>{let{uri:t,location:n,component:r}=e,a=c(e,te);return o.createElement(ae,s({},a,{component:r,uri:t,location:n}))};let oe=0;const ae=e=>{let{children:t,style:n,component:r="div",uri:a,location:i}=e,u=c(e,ne);const l=o.useRef(),p=o.useRef(!0),d=o.useRef(a),h=o.useRef(i.pathname),f=o.useRef(!1);o.useEffect((()=>(oe++,m(),()=>{oe--,0===oe&&(p.current=!0)})),[]),o.useEffect((()=>{let e=!1,t=!1;a!==d.current&&(d.current=a,e=!0),i.pathname!==h.current&&(h.current=i.pathname,t=!0),f.current=e||t&&i.pathname===a,f.current&&m()}),[a,i]);const m=o.useCallback((()=>{var e;p.current?p.current=!1:(e=l.current,f.current&&e&&e.focus())}),[]);return o.createElement(r,s({style:s({outline:"none"},n),tabIndex:"-1",ref:l},u),t)},ie=["location","primary","children","basepath","baseuri","component"],se=e=>{const t=y(),n=w();return o.createElement(ce,s({},t,n,e))};function ce(e){const{location:t,primary:n=!0,children:r,basepath:a,component:i="div"}=e,u=c(e,ie),l=o.Children.toArray(r).reduce(((e,t)=>{const n=q(a)(t);return e.concat(n)}),[]),{pathname:p}=t,d=R(l,p);if(d){const{params:e,uri:r,route:c,route:{value:l}}=d,p=c.default?a:c.path.replace(/\*$/,""),h=s({},e,{uri:r,location:t}),f=o.cloneElement(l,h,l.props.children?o.createElement(se,{location:t,primary:n},l.props.children):void 0),m=n?re:i,v=n?s({uri:r,location:t,component:i},u):u;return o.createElement(g.Provider,{value:{baseuri:r,basepath:p}},o.createElement(m,v,f))}return null}const ue=()=>{const e=w();if(!e)throw new Error("useLocation hook was used but a LocationContext.Provider was not found in the parent tree. Make sure this is used in a component that is a child of Router");return e.location},le=()=>{throw new Error("useNavigate is removed. Use import { navigate } from 'gatsby' instead")},pe=()=>{const e=y();if(!e)throw new Error("useParams hook was used but a LocationContext.Provider was not found in the parent tree. Make sure this is used in a component that is a child of Router");const t=ue(),n=k(e.basepath,t.pathname);return n?n.params:null},de=e=>{if(!e)throw new Error("useMatch(path: string) requires an argument of a string to match against");const t=y();if(!t)throw new Error("useMatch hook was used but a LocationContext.Provider was not found in the parent tree. 
Make sure this is used in a component that is a child of Router");const n=ue(),r=O(e,t.baseuri),o=k(r,n.pathname);return o?s({},o.params,{uri:o.uri,path:e}):null}},1562:function(e,t,n){"use strict";n.d(t,{c4:function(){return E},cP:function(){return c},dq:function(){return p},mc:function(){return g},rU:function(){return b}});var r=n(5697),o=n(7294),a=n(7896),i=n(4506);function s(){return s=Object.assign?Object.assign.bind():function(e){for(var t=1;t{if("string"==typeof e)return!(e=>u.test(e))(e)};function p(e,t=""){var n;if(!l(e))return e;if(e.startsWith("./")||e.startsWith("../"))return e;const r=null!=(n=null!=t?t:"")?n:"/";return`${null!=r&&r.endsWith("/")?r.slice(0,-1):r}${e.startsWith("/")?e:`/${e}`}`}const d=e=>null==e?void 0:e.startsWith("/");function h(e,t){const{pathname:n,search:r,hash:o}=c(e);return`${(0,i.H)(n,t)}${r}${o}`}const f=(e,t)=>"number"==typeof e?e:l(e)?d(e)?function(e){const t=p(e),n="always";return h(t,n)}(e):function(e,t){if(d(e))return e;const n="always",r=(0,a.resolve)(e,t);return h(r,n)}(e,t):e,m=["to","getProps","onClick","onMouseEnter","activeClassName","activeStyle","innerRef","partiallyActive","state","replace","_location"];function g(e){return p(e,"")}const v={activeClassName:r.string,activeStyle:r.object,partiallyActive:r.bool};function y(e){return o.createElement(a.Location,null,(({location:t})=>o.createElement(w,s({},e,{_location:t}))))}class w extends o.Component{constructor(e){super(e),this.defaultGetProps=({isPartiallyCurrent:e,isCurrent:t})=>(this.props.partiallyActive?e:t)?{className:[this.props.className,this.props.activeClassName].filter(Boolean).join(" "),style:s({},this.props.style,this.props.activeStyle)}:null;let t=!1;"undefined"!=typeof window&&window.IntersectionObserver&&(t=!0),this.state={IOSupported:t},this.abortPrefetch=null,this.handleRef=this.handleRef.bind(this)}_prefetch(){let e=window.location.pathname+window.location.search;this.props._location&&this.props._location.pathname&&(e=this.props._location.pathname+this.props._location.search);const t=c(f(this.props.to,e)),n=t.pathname+t.search;if(e!==n)return ___loader.enqueue(n)}componentWillUnmount(){if(!this.io)return;const{instance:e,el:t}=this.io;this.abortPrefetch&&this.abortPrefetch.abort(),e.unobserve(t),e.disconnect()}handleRef(e){this.props.innerRef&&Object.prototype.hasOwnProperty.call(this.props.innerRef,"current")?this.props.innerRef.current=e:this.props.innerRef&&this.props.innerRef(e),this.state.IOSupported&&e&&(this.io=((e,t)=>{const n=new window.IntersectionObserver((n=>{n.forEach((n=>{e===n.target&&t(n.isIntersecting||n.intersectionRatio>0)}))}));return n.observe(e),{instance:n,el:e}})(e,(e=>{e?this.abortPrefetch=this._prefetch():this.abortPrefetch&&this.abortPrefetch.abort()})))}render(){const e=this.props,{to:t,getProps:n=this.defaultGetProps,onClick:r,onMouseEnter:i,state:u,replace:p,_location:d}=e,h=function(e,t){if(null==e)return{};var n,r,o={},a=Object.keys(e);for(r=0;r=0||(o[n]=e[n]);return o}(e,m),g=f(t,d.pathname);return l(g)?o.createElement(a.Link,s({to:g,state:u,getProps:n,innerRef:this.handleRef,onMouseEnter:e=>{i&&i(e);const t=c(g);___loader.hovering(t.pathname+t.search)},onClick:e=>{if(r&&r(e),!(0!==e.button||this.props.target||e.defaultPrevented||e.metaKey||e.altKey||e.ctrlKey||e.shiftKey)){e.preventDefault();let t=p;const n=encodeURI(g)===d.pathname;"boolean"!=typeof 
p&&n&&(t=!0),window.___navigate(g,{state:u,replace:t})}return!0}},h)):o.createElement("a",s({href:g},h))}}w.propTypes=s({},v,{onClick:r.func,to:r.string.isRequired,replace:r.bool,state:r.object});const b=o.forwardRef(((e,t)=>o.createElement(y,s({innerRef:t},e)))),E=(e,t)=>{window.___navigate(f(e,window.location.pathname),t)}},3521:function(e,t,n){"use strict";n.r(t),n.d(t,{Script:function(){return f},ScriptStrategy:function(){return u},collectedScriptsByPage:function(){return s},scriptCache:function(){return d},scriptCallbackCache:function(){return h}});var r=n(7294),o=n(7896);function a(){return a=Object.assign?Object.assign.bind():function(e){for(var t=1;ti.get(e)||[],set(e,t){const n=i.get(e)||[];n.push(t),i.set(e,n)},delete(e){i.delete(e)}},c="undefined"!=typeof self&&self.requestIdleCallback&&self.requestIdleCallback.bind(window)||function(e){const t=Date.now();return setTimeout((function(){e({didTimeout:!1,timeRemaining:function(){return Math.max(0,50-(Date.now()-t))}})}),1)};var u,l;(l=u||(u={})).postHydrate="post-hydrate",l.idle="idle",l.offMainThread="off-main-thread";const p=new Set(["src","strategy","dangerouslySetInnerHTML","children","onLoad","onError"]),d=new Set,h=new Map;function f(e){return r.createElement(o.Location,null,(()=>r.createElement(m,e)))}function m(e){const{src:t,strategy:n=u.postHydrate}=e||{},{pathname:i}=(0,o.useLocation)();if((0,r.useEffect)((()=>{let t;switch(n){case u.postHydrate:t=g(e);break;case u.idle:c((()=>{t=g(e)}));break;case u.offMainThread:{const t=y(e);s.set(i,t)}}return()=>{const{script:e,loadCallback:n,errorCallback:r}=t||{};n&&(null==e||e.removeEventListener("load",n)),r&&(null==e||e.removeEventListener("error",r)),null==e||e.remove()}}),[]),n===u.offMainThread){const o=v(e),c=y(e);return"undefined"==typeof window&&s.set(i,c),r.createElement("script",o?a({type:"text/partytown","data-strategy":n,crossOrigin:"anonymous"},c,{dangerouslySetInnerHTML:{__html:v(e)}}):a({type:"text/partytown",src:w(t),"data-strategy":n,crossOrigin:"anonymous"},c))}return null}function g(e){const{id:t,src:n,strategy:r=u.postHydrate,onLoad:o,onError:i}=e||{},s=t||n,c=["load","error"],l={load:o,error:i};if(s){for(const e of c)if(null!=l&&l[e]){var p;const t=h.get(s)||{},{callbacks:n=[]}=(null==t?void 0:t[e])||{};var f,m;n.push(null==l?void 0:l[e]),null!=t&&null!=(p=t[e])&&p.event?null==l||null==(f=l[e])||f.call(l,null==t||null==(m=t[e])?void 0:m.event):h.set(s,a({},t,{[e]:{callbacks:n}}))}if(d.has(s))return null}const g=v(e),w=y(e),E=document.createElement("script");t&&(E.id=t),E.dataset.strategy=r;for(const[a,u]of Object.entries(w))E.setAttribute(a,u);g&&(E.textContent=g),n&&(E.src=n);const _={};if(s){for(const e of c){const t=t=>b(t,s,e);E.addEventListener(e,t),_[`${e}Callback`]=t}d.add(s)}return document.body.appendChild(E),{script:E,loadCallback:_.loadCallback,errorCallback:_.errorCallback}}function v(e){const{dangerouslySetInnerHTML:t,children:n=""}=e||{},{__html:r=""}=t||{};return r||n}function y(e){const t={};for(const[n,r]of Object.entries(e))p.has(n)||(t[n]=r);return t}function w(e){if(e)return`/__third-party-proxy?url=${encodeURIComponent(e)}`}function b(e,t,n){const r=h.get(t)||{};for(const a of(null==r||null==(o=r[n])?void 0:o.callbacks)||[]){var o;a(e)}h.set(t,{[n]:{event:e}})}}},function(e){e.O(0,[774,532],(function(){return t=5824,e(e.s=t);var t}));e.O()}]);
+//# sourceMappingURL=app-fe620b60292e32d725ed.js.map
\ No newline at end of file
AAMrkB,SACZ,GACgB,aAAdqkB,EAAI3S,YACW,YAAd2S,EAAI3S,aAA6B2S,EAAIC,gBAAgBC,SAEtDhS,YAAW,WACT4R,GACF,GAAG,OACE,CACL,MAAM5d,EAAU,WACd8d,EAAIpjB,oBAAoB,mBAAoBsF,GAAS,GACrDrH,OAAO+B,oBAAoB,OAAQsF,GAAS,GAE5C4d,GACF,EAEAE,EAAI1jB,iBAAiB,mBAAoB4F,GAAS,GAClDrH,OAAOyB,iBAAiB,OAAQ4F,GAAS,EAC3C,CAEM,GACN,+EC1SJ,UAlByB,IAAmB,IAAlB,SAAEhG,GAAU,EACpC,MAAMmW,EAAgBlI,EAAAA,GAAAA,aAAoBjO,EAASoC,UACnD,OAAK+T,EAGEna,EAAAA,cAAoBioB,EAAAA,EAAsB,CAC/CjkB,WACAmW,mBACGA,EAAcvE,OALV,IAMP,wBCfkB0D,MAKpB/Q,EAAOxJ,SALaua,EAKW7Q,EAAQ,OALR6Q,EAAEnZ,SAAYmZ,qECA/C,MAAMnQ,EAAM,IAAI7H,QAET,SAAS+f,IACd,MAAM6G,EAAiBzf,EAAQ,KAY/B,MAAO,CAAE5D,OAVM,CAACK,EAAWijB,KACzB,IAAIC,EAAOjf,EAAI5I,IAAI4nB,GACdC,GACHjf,EAAIjI,IAAIinB,EAAKC,EAAOF,EAAeG,WAAWF,IAEhDC,EAAKvjB,OAAOK,EAAU,EAKPwgB,QAFD,CAACxgB,EAAWijB,IAAOD,EAAeI,YAAYH,EAAIjjB,GAGpE,qECZA,MAAMqjB,EAAc,IAAIzd,IAClB0d,EAAwB,IAAI1d,IAU3B,SAAS4B,EAAwBtG,GACtC,IAAIqG,EAAW8b,EAAYhoB,IAAI6F,GAI/B,OAHKqG,IACHA,EAAW+b,EAAsBjoB,IAAI6F,EAAS0Z,gBAEzCrT,CACT,IAdAgc,SAAkBhc,IACZA,EAASic,WACXF,EAAsBtnB,IAAIuL,EAASkc,SAAUlc,GAE7C8b,EAAYrnB,IAAIuL,EAASkc,SAAUlc,EACrC,4DCTwB,WAAxB9J,OAAOqB,SAAS4kB,UACQ,cAAxBjmB,OAAOqB,SAAS6kB,SAEhB1B,QAAQ1O,MAAM,gFAGL,kBAAmB8C,WAC5BA,UAAUuI,cACPgF,SAAyB,UACzBrf,MAAK,SAAUsf,GACdA,EAAI3kB,iBAAiB,eAAe,MAClCiT,EAAAA,EAAAA,GAAU,6BAA8B,CAAEyM,cAAeiF,IAGzD,MAAMC,EAAmBD,EAAIE,WAC7B9B,QAAQ+B,IAAI,mBAAoBF,GAChCA,EAAiB5kB,iBAAiB,eAAe,KAC/C,OAAQ4kB,EAAiBxiB,OACvB,IAAI,YACE+U,UAAUuI,cAActJ,YAK1B7X,OAAO6gB,cAAe,GAEtBnM,EAAAA,EAAAA,GAAU,6BAA8B,CAAEyM,cAAeiF,IAGrDpmB,OAAOwmB,qBACThC,QAAQ+B,IAAI,4CACZvmB,OAAOqB,SAASijB,YAKlBE,QAAQ+B,IAAI,sCAKZ7R,EAAAA,EAAAA,GAAU,2BAA4B,CAAEyM,cAAeiF,KAEzD,MAEF,IAAI,YACF5B,QAAQ1O,MAAM,oDACdpB,EAAAA,EAAAA,GAAU,2BAA4B,CAAEyM,cAAeiF,IACvD,MAEF,IAAI,aACF1R,EAAAA,EAAAA,GAAU,wBAAyB,CAAEyM,cAAeiF,IAC/C,GAET,GAEN,IACCvU,OAAM,SAAUzO,GACfohB,QAAQ1O,MAAM,4CAA6C1S,EAC7D,8JC5DJ,MAAM+J,EAAuB9P,EAAAA,cAAoB,CAAC,GAC5CuQ,EAAgBvQ,EAAAA,cAAoB,CAAC,GACrCkP,EAAmBlP,EAAAA,cAAoB,CAAC,GACxCoP,EAAqBpP,EAAAA,cAAoB,CAAC,qICDhD,MAAMshB,GCgB+BxQ,EDhBkB,cCgBZsY,EDhB2B,CAAC,ECiBjEppB,EAAAA,oBAjBsB,SAAC8Q,EAAMsY,GAajC,YAb6C,IAAZA,IAAAA,EAAe,MAE3CC,WAAWC,mBACdD,WAAWC,iBAAmB,CAAC,GAG5BD,WAAWC,iBAAiBxY,KAC/BuY,WAAWC,iBAAiBxY,GAAQ9Q,EAAAA,oBAClC8Q,EACAsY,IAIGC,WAAWC,iBAAiBxY,EACrC,CAIWyY,CAAoBzY,EAAMsY,GAG5BppB,EAAAA,cAAoBopB,IAL7B,IAAqCtY,EAAMsY,EDd3C,SAASI,EAAwB,GAA2C,IAA3C,gBAAEC,EAAe,KAAExZ,EAAI,MAAEyZ,EAAK,OAAE7kB,GAAQ,EACvE,MAAM8kB,EAAY1Z,EACdA,EAAKA,KACLwZ,EAAgBC,IAAUD,EAAgBC,GAAOzZ,KAErD,OACE,gBAAC,WAAc,KACZ0Z,GAAa9kB,EAAO8kB,IACnBA,GAAa,2BAAK,yBAG1B,CAKA,MAAMC,EAAcxmB,IAClB,MAAM,KAAE6M,EAAI,MAAEyZ,EAAK,OAAE7kB,EAAM,SAAEI,GAAa7B,EAS1C,OACE,gBAACke,EAAmBqF,SAAQ,MACzB8C,GACC,gBAACD,EAAuB,CACtBvZ,KAAMA,EACNyZ,MAAOA,EACP7kB,OAAQA,GAAUI,EAClBwkB,gBAAiBA,KAGO,EAW5BI,EAAiBH,IAAU,IAAD,EAErB1pB,EAAAA,WAUT,MAAM8pB,EAAU9pB,EAAAA,WAAiBshB,GAKjC,GAAIyI,MAAMC,OAAON,IACf,MAAM,IAAIpa,MAAM,8KAIMoa,EAAK,SAI7B,GAAkB,QAAlB,EAAII,EAAQJ,UAAM,OAAd,EAAgBzZ,KAClB,OAAO6Z,EAAQJ,GAAOzZ,KAEtB,MAAM,IAAIX,MACR,uMAIJ,qCElFa,SAAShE,EAAY2e,EAAKC,GACvC,YAD6C,IAANA,IAAAA,EAAM,IACxCA,EAIDD,IAAQC,EACJ,IAGJD,EAAIxe,WAAcye,EAAM,KACnBD,EAAI3qB,MAAM4qB,EAAOloB,QAGnBioB,EAXEA,CAYX,iKCnBO,MAAM9G,EAAgB,WAE1BgH,GAEI,IAJuB,SAC5BnmB,GACD,OAAe,IAAbmmB,IAAAA,EAAgB,CACjBC,kBAAkB,IAElB,MAAMC,EAAU5mB,SAASgE,cAAc,yBACjC6iB,EAAgBD,EAAQxL,aAAa,QACrC0L,EAAeF,EAAQxL,aAAa,qBACpC2L,EAAWH,EAAQxL,aAAa,iBAEtC,GAAIyL,GAAiBC,GAAgBC,EAAU,CAC7C,IAAIxlB,EAAWulB,EAAY,KAAKC,EAAWxmB,EAASoC,SACpD,MAAM,iBACJgkB,GACED,EAECC,IACHplB,GAAShB,EAAS0G,QAGpB1F,GAAShB,EAAST,KAClB8mB,EAAQ1W,aAAa,OAAD,GAAY3O,EAClC,CACF,uGCnBO,MAAMme,EAAgB,SAAuB,EAEjDgH,GAAgB,
IAFiC,SAClDnmB,GACD,CAYD,oCCbA,IAAIymB,EAAUhiB,EAAQ,sCCHtB1J,EAAQ2rB,gBAAkB,CACxBC,SAAU,IACVC,aAAc,GACdC,gBAAiB,QACjBC,sBAAsB,EACtBC,cAAc,EACdC,kBAAkB,EAClBC,UAAU,EACVC,UAAU,EACVC,WAAW,EACXC,QAAS,OACTC,SAAU,QACVC,uBAAuB,EACvBC,gBAAgB,GAElBxsB,EAAQysB,UAAY,mBACpBzsB,EAAQ0sB,WAAa,0BACrB1sB,EAAQ2sB,kBAAoB,4BAC5B3sB,EAAQ4sB,qBAAuB,wEClB/B,IAAIC,EAAWnjB,EAAQ,KACnBiiB,EAAkBkB,EAASlB,gBAC3Be,EAAaG,EAASH,WACtBE,EAAuBC,EAASD,qBAChCD,EAAoBE,EAASF,kBAEjC3sB,EAAQokB,cAAgB,SAAU0I,EAAoB1B,GAkCpD,IAjCA,IAAIzhB,EAAUhI,OAAOqX,OAAO,CAAC,EAAG2S,EAAiBP,GAC7C2B,EAAgBroB,SAAS6c,iBAAiB,IAAMoL,GAGhDK,EAAQ,SAAezZ,GACzB,IAAI0Z,EAAeF,EAAcxZ,GAC7B2Z,EAAoBD,EAAavkB,cAAc,IAAMkkB,GACrDO,EAAeF,EAAavkB,cAAc,IAAMgkB,GAEhDU,EAAc,WAChBF,EAAkB9J,MAAMiK,WAAa,oBACrCF,EAAa/J,MAAMiK,WAAa,eAChCC,GACF,EAEIA,EAAkB,SAASA,IAC7BJ,EAAkB9J,MAAMmK,QAAU,EAClCJ,EAAa/J,MAAMmK,QAAU,EAC7BJ,EAAa/J,MAAMoK,MAAQ,UAC3BL,EAAa/J,MAAMqK,UAAY,2BAA6B9jB,EAAQmiB,gBACpEqB,EAAaxnB,oBAAoB,OAAQynB,GACzCD,EAAaxnB,oBAAoB,QAAS2nB,EAC5C,EAEAH,EAAa/J,MAAMmK,QAAU,EAC7BJ,EAAa9nB,iBAAiB,OAAQ+nB,GACtCD,EAAa9nB,iBAAiB,QAASioB,GAEnCH,EAAaO,UACfJ,GAEJ,EAES/Z,EAAI,EAAGA,EAAIwZ,EAAc9pB,OAAQsQ,IACxCyZ,EAAMzZ,EAEV,iCCGA/J,EAAOxJ,QA5BS,SAAS2tB,EAAWC,EAAQ/e,EAAGsX,EAAG0H,EAAGC,EAAG9mB,EAAG+mB,GAOzD,IAAKJ,EAAW,CACd,IAAIjU,EACJ,QAAejU,IAAXmoB,EACFlU,EAAQ,IAAInJ,MACV,qIAGG,CACL,IAAIrN,EAAO,CAAC2L,EAAGsX,EAAG0H,EAAGC,EAAG9mB,EAAG+mB,GACvBC,EAAW,GACftU,EAAQ,IAAInJ,MACVqd,EAAOvU,QAAQ,OAAO,WAAa,OAAOnW,EAAK8qB,IAAa,MAExDjc,KAAO,qBACf,CAGA,MADA2H,EAAMuU,YAAc,EACdvU,CACR,CACF,yCCrCiBwU,EAAE,EAAQ,MAASC,EAAE,CAACC,QAAO,GAAIpa,EAAE,IAAIjI,IAAIwR,EAAE3J,OAAOya,IAAI,iBAAiBC,EAAE1a,OAAOya,IAAI,cAAcE,EAAE3a,OAAOya,IAAI,uBAAuBG,EAAEN,EAAEO,mDAAmDC,gBAAkF,SAASC,EAAE9f,EAAEsX,EAAE0H,GAAGvqB,KAAKsrB,QAAQ/f,EAAEvL,KAAKurB,OAAO1I,EAAE7iB,KAAKwrB,UAAUjB,CAAC,CACjW,SAASkB,EAAElgB,GAAG,OAAOA,EAAE+f,SAAS,KAAK,EAAE,OAAO/f,EAAEggB,OAAO,KAAK,EAAE,IAAI1I,EAAErf,KAAKC,MAAM8H,EAAEggB,OAAOhgB,EAAEigB,UAAUE,WAAuB,OAAZngB,EAAE+f,QAAQ,EAAS/f,EAAEggB,OAAO1I,EAAE,KAAK,EAAa,IAAI,IAAI0H,GAAnB1H,EAAEtX,EAAEggB,QAAmBI,OAAOnB,EAAE,EAAEA,EAAED,EAAE5qB,OAAO6qB,IAAI,CAAC,IAAI9mB,EAAEgN,EAAExS,IAAIqsB,EAAEC,IAAI,GAAG,OAAO9mB,EAAE,MAAMA,CAAE,CAAyG,OAAxG6mB,EAAEqB,EAAoB/I,EAAEtW,IAAIsW,EAAE,MAAMA,EAAEpU,KAAK8b,EAAE,KAAK1H,EAAEpU,KAAK8b,EAAEhtB,WAAWgtB,EAAEzsB,QAAQysB,EAAEA,EAAE1H,EAAEpU,MAAMlD,EAAE+f,QAAQ,EAAS/f,EAAEggB,OAAO1I,EAAE,KAAK,EAAE,MAAMtX,EAAE,QAAQ,MAAMA,EAAEggB,OAAQ,CAAC,SAASM,IAAoB,OAAOJ,EAAjBK,EAAE9rB,KAAK,GAAc,CAAC,SAAS+rB,EAAExgB,EAAEsX,GAAG,OAAO,IAAIwI,EAAE,EAAExI,EAAEtX,EAAE,CAC3d,SAASygB,EAAEzgB,GAAG,GAAG,OAAOA,EAAE,IAAI,IAAIsX,EAAE,EAAEA,EAAEtX,EAAE5L,OAAOkjB,KAAI,EAAGtX,EAAEsX,KAAK,CAAC,SAASoJ,EAAE1gB,EAAEsX,GAAG,GAAG,IAAItX,EAAE+f,QAAQ,CAAC,IAAIf,EAAEhf,EAAEggB,OAAOhgB,EAAE+f,QAAQ,EAAE/f,EAAEggB,OAAO1I,EAAEmJ,EAAEzB,EAAE,CAAC,CAAC,SAAS2B,EAAE3gB,EAAEsX,GAAGtX,EAAE4gB,QAAQ5a,SAAQ,SAAShG,GAAG0gB,EAAE1gB,EAAEsX,EAAE,GAAE,CAAC,SAASiJ,EAAEvgB,EAAEsX,GAAG,IAAI0H,EAAEhf,EAAE4gB,QAAQ3B,EAAED,EAAErsB,IAAI2kB,GAAqC,OAAlC2H,IAAIA,EAAE,IAAIa,EAAE,EAAE,KAAK9f,GAAGgf,EAAE1rB,IAAIgkB,EAAE2H,IAAWA,CAAC,CAC3B,SAAS4B,EAAE7gB,GAAG2gB,EAAE3gB,EAAE0B,MAAM,sBAAsB,CAC7S,SAASof,EAAE9gB,EAAEsX,GAAG,GAAG,KAAKA,EAAE,CAAC,IAAI0H,EAAE1H,EAAE,GAAG2H,EAAE3H,EAAE9a,QAAQ,IAAI,GAAGrE,EAAE4oB,SAASzJ,EAAEvhB,UAAU,EAAEkpB,GAAG,IAAuB,OAAnBA,EAAE3H,EAAEvhB,UAAUkpB,EAAE,GAAUD,GAAG,IAAK,KAAiBA,GAAb1H,EAAEtX,EAAE4gB,SAAajuB,IAAIwF,IAAI,IAAI6mB,EAAEe,UAAU/f,EAAEgf,EAAEgB,OAAOhB,EAAEe,QAAQ,EAAEf,EAAEgB,OAAOf,EAAEwB,EAAEzgB,IAAIsX,EAAEhkB,IAAI6E,EAAE,IAAI2nB,EAAE,EAAEb,EAAEjf,IAAI,MAAM,IAAK,IAAgBgf,GAAZ1H,EAAEtX,EAAE4gB,SAAYjuB,IAAIwF,GAAG8mB,EAAEhn
B,KAAKC,MAAM+mB,EAAEjf,EAAEmgB,WAAW,IAAIjB,EAAElf,EAAEghB,eAAqC9B,GAAtBD,EAAEC,EAAEA,EAAED,EAAEje,IAAIie,EAAE/b,MAAM+b,GAAMmB,OAAO,IAAI,IAAIa,EAAE,EAAEA,EAAE/B,EAAE9qB,OAAO6sB,IAAI,CAAC,IAAIC,EAAEhC,EAAE+B,GAAG,QAAG,IAAS9b,EAAExS,IAAIuuB,GAAG,CAAC,IAAIC,EAAE,IAAuBD,GAAGxV,EAAEvG,EAAE7R,IAAI6B,KAAKgQ,EAAE+b,EAAE,MAAME,EAAEjc,EAAE7R,IAAI6B,KAAKgQ,EAAE+b,GAAGC,EAAEtlB,KAAK6P,EAAE0V,GAAGjc,EAAE7R,IAAI4tB,EACxfC,EAAE,CAAC,CAACnC,EAAE,IAAIA,EAAEe,UAAU/f,EAAEgf,EAAEgB,OAAOhB,EAAEe,QAAQ,EAAEf,EAAEgB,OAAOf,EAAEwB,EAAEzgB,IAAIsX,EAAEhkB,IAAI6E,EAAE,IAAI2nB,EAAE,EAAEb,EAAEjf,IAAI,MAAM,IAAK,IAAIA,EAAE4gB,QAAQttB,IAAI6E,EAAEqoB,EAAExgB,EAL2G,SAAWA,GAA2C,OAAxC2f,EAAE3f,KAAK2f,EAAE3f,GAAGqf,EAAE1D,oBAAoB3b,EAAE0f,IAAWC,EAAE3f,EAAE,CAK1KqhB,CAAEpC,GAAG9nB,WAAW,MAAM,IAAK,IAAI6nB,EAAE/mB,KAAKC,MAAM+mB,GAAGjf,EAAE4gB,QAAQttB,IAAI6E,EAAEqoB,EAAExgB,EAAE+E,OAAOya,IAAIR,KAAK,MAAM,IAAK,IAAI1H,EAAErf,KAAKC,MAAM+mB,IAAGD,EAAEtd,MAAM4V,EAAE5T,UAAWD,MAAM6T,EAAE7T,OAAmBwb,GAAb3H,EAAEtX,EAAE4gB,SAAajuB,IAAIwF,IAAIuoB,EAAEzB,EAAED,GAAG1H,EAAEhkB,IAAI6E,EAAE,IAAI2nB,EAAE,EAAEd,EAAEhf,IAAI,MAAM,QAAQ,MAAM0B,MAAM,8EAA+E,CAAC,CAC5a,SAAS4f,EAAEthB,GAAG,OAAO,SAASsX,EAAE0H,GAAG,MAAM,iBAAkBA,EAH3D,SAAWhf,EAAEsX,EAAE0H,GAAG,OAAOA,EAAE,IAAI,IAAK,IAAI,MAAG,MAAMA,EAAStQ,EAAK,MAAMsQ,EAAE,IAAI,MAAMA,EAAE,GAAUA,EAAEjpB,UAAU,GAAiDmqB,EAAhBlgB,EAAEugB,EAAEvgB,EAAhC+gB,SAAS/B,EAAEjpB,UAAU,GAAG,MAAyB,IAAK,IAAI,OAAqCiK,EAAEugB,EAAEvgB,EAAhC+gB,SAAS/B,EAAEjpB,UAAU,GAAG,KAAa,CAACwrB,SAAS9B,EAAE+B,SAASxhB,EAAEyhB,MAAMvB,GAAG,OAAOlB,CAAC,CAGjM0C,CAAE1hB,EAAEvL,EAAKuqB,GAAG,iBAAkBA,GAAG,OAAOA,EAAKA,EAAE,KAAKtQ,EAAE,CAAC6S,SAAS7S,EAAEvS,KAAK6iB,EAAE,GAAG/rB,IAAI+rB,EAAE,GAAGjmB,IAAI,KAAKvD,MAAMwpB,EAAE,GAAG2C,OAAO,MAAM3C,EAAKA,CAAC,CAAC,CAAC,SAAS4C,EAAE5hB,GAAG,IAAIsX,EAAE,IAAIuK,YAAiH,OAA3F7hB,EAAE,CAACghB,eAAehhB,EAAE4gB,QAA5B,IAAI1jB,IAAkCiL,SAASmY,EAAEwB,YAAY,GAAGC,eAAezK,IAAK6I,UAAUmB,EAAEthB,GAAUA,CAAC,CAC5U,SAASgiB,EAAEhiB,EAAEsX,GAAqR,SAAS2H,EAAE3H,GAAGqJ,EAAE3gB,EAAEsX,EAAE,CAAC,IAAInf,EAAEmf,EAAE2K,YAAY9pB,EAAEzB,OAAOmF,MAApU,SAASmjB,EAAE1H,GAAG,IAAI2J,EAAE3J,EAAElgB,MAAM,IAAGkgB,EAAE4K,KAAc,CAAC5K,EAAE2J,EAAEA,EAAEjhB,EAAE+hB,eAAe,IAAI,IAAIb,EAAE5J,EAAE9a,QAAQ,KAAK,EAAE0kB,GAAG,CAAC,IAAIhC,EAAElf,EAAE8hB,YAAgBpW,EAAE4L,EAAE6K,SAAS,EAAEjB,GAAGxV,EAAEuV,EAAEmB,OAAO1W,GAAGoV,EAAE9gB,EAAEkf,EAAExT,GAAG1L,EAAE8hB,YAAY,GAAqBZ,GAAlB5J,EAAEA,EAAE6K,SAASjB,EAAE,IAAO1kB,QAAQ,GAAG,CAA8B,OAA7BwD,EAAE8hB,aAAab,EAAEmB,OAAO9K,EAAEgI,GAAUnnB,EAAEzB,OAAOmF,KAAKmjB,EAAEC,EAAE,CAA1O4B,EAAE7gB,EAAyO,GAA0Dif,EAAE,CAPKa,EAAE5sB,UAAU2I,KAAK,SAASmE,GAAG,IAAIvL,KAAKsrB,SAAS,OAAOtrB,KAAKurB,SAASvrB,KAAKurB,OAAO,IAAIvrB,KAAKurB,OAAO3jB,KAAK2D,IAAIA,GAAG,EAQ9c7O,EAAQic,yBAAyB,SAASpN,EAAEsX,GAA+C,OAAP0K,EAArC1K,EAAEsK,EAAEtK,GAAGA,EAAE+K,UAAU/K,EAAE+K,UAAU,MAAUriB,GAAUsX,CAAC,oCCdjG3c,EAAOxJ,QAAU,EAAjB,wBCGFwJ,EAAOxJ,QANP,SAAgC2S,GAC9B,QAAa,IAATA,EACF,MAAM,IAAIC,eAAe,6DAE3B,OAAOD,CACT,EACyCnJ,EAAOxJ,QAAQa,YAAa,EAAM2I,EAAOxJ,QAAiB,QAAIwJ,EAAOxJ,8BCN9G,IAAIgO,EAAiB,EAAQ,MAM7BxE,EAAOxJ,QALP,SAAwBmxB,EAAUC,GAChCD,EAASpvB,UAAYJ,OAAOmJ,OAAOsmB,EAAWrvB,WAC9CovB,EAASpvB,UAAUuN,YAAc6hB,EACjCnjB,EAAemjB,EAAUC,EAC3B,EACiC5nB,EAAOxJ,QAAQa,YAAa,EAAM2I,EAAOxJ,QAAiB,QAAIwJ,EAAOxJ,0BCDtGwJ,EAAOxJ,QALP,SAAgCkB,GAC9B,OAAOA,GAAOA,EAAIL,WAAaK,EAAM,CACnC,QAAWA,EAEf,EACyCsI,EAAOxJ,QAAQa,YAAa,EAAM2I,EAAOxJ,QAAiB,QAAIwJ,EAAOxJ,0BCL9G,SAASqxB,EAAgBtjB,EAAGwP,GAK1B,OAJA/T,EAAOxJ,QAAUqxB,EAAkB1vB,OAAOqM,eAAiBrM,OAAOqM,eAAehK,OAAS,SAAyB+J,EAAGwP,GAEpH,OADAxP,EAAEG,UAAYqP,EACPxP,CACT,EAAGvE,EAAOxJ,QAAQa,YAAa,EAAM2I,EAAOxJ,QAAiB,QAAIwJ,EAAOxJ,QACjEqxB,EAAgBtjB,EAAGwP,EAC5B,CACA/T,EAAOxJ,QAAUqx
B,EAAiB7nB,EAAOxJ,QAAQa,YAAa,EAAM2I,EAAOxJ,QAAiB,QAAIwJ,EAAOxJ,yCCLvG2B,OAAOC,eAAe5B,EAAS,aAA/B,CAA+CiG,OAAO,IA6CtDjG,EAAQsxB,YAVY,iBAWpBtxB,EAAQoJ,iBAHkBmoB,GAvCJ,EAACA,EAAQC,KAC3B,MAAM,QAAEjpB,EAAU,MAAOkpB,GAAmBF,GAAU,CAAC,EACjDG,EAAY5qB,KAAKI,UAAUuqB,GAAgB,CAACvD,EAAGS,KAChC,mBAANA,IACPA,EAAIgD,OAAOhD,IACLjiB,WAAWwhB,EAAI,OACjBS,EAAI,YAAcA,GAGnBA,KAEX,MAAO,CACH,uBACAhtB,OAAO6O,KAAKihB,GAAgBxuB,OAAS,EAC/B,iCAAiCyuB,MACjC,mBACN,kBACAnpB,EAAQtF,OAAS,EAAI,WAAW6D,KAAKI,UAAUqB,MAAc,GAC7D,oCACAipB,GACFnf,KAAK,GAAG,EAmBuBuf,CAAcL,EAzC1B,ogDCHV,SAASM,EAAeV,EAAUC,GAC/CD,EAASpvB,UAAYJ,OAAOmJ,OAAOsmB,EAAWrvB,WAC9CovB,EAASpvB,UAAUuN,YAAc6hB,GACjC,OAAeA,EAAUC,EAC3B,qCCLe,SAASC,EAAgBtjB,EAAGwP,GAKzC,OAJA8T,EAAkB1vB,OAAOqM,eAAiBrM,OAAOqM,eAAehK,OAAS,SAAyB+J,EAAGwP,GAEnG,OADAxP,EAAEG,UAAYqP,EACPxP,CACT,EACOsjB,EAAgBtjB,EAAGwP,EAC5B,ohCCNgG,SAASxP,IAAI,OAAOA,EAAEpM,OAAOqX,OAAOrX,OAAOqX,OAAOhV,OAAO,SAASgD,GAAG,IAAI,IAAIwnB,EAAE,EAAEA,EAAExrB,UAAUC,OAAOurB,IAAI,CAAC,IAAID,EAAEvrB,UAAUwrB,GAAG,IAAI,IAAIxa,KAAKua,EAAE5sB,OAAOI,UAAUC,eAAeC,KAAKssB,EAAEva,KAAKhN,EAAEgN,GAAGua,EAAEva,GAAG,CAAC,OAAOhN,CAAC,EAAE+G,EAAE1K,MAAMC,KAAKN,UAAU,CAAC,SAAS6L,EAAE7H,EAAEwnB,GAAG,GAAG,MAAMxnB,EAAE,MAAM,CAAC,EAAE,IAAIunB,EAAEva,EAAEjG,EAAE,CAAC,EAAEc,EAAElN,OAAO6O,KAAKxJ,GAAG,IAAIgN,EAAE,EAAEA,EAAEnF,EAAE5L,OAAO+Q,IAAIwa,EAAEnjB,QAAQkjB,EAAE1f,EAAEmF,KAAK,IAAIjG,EAAEwgB,GAAGvnB,EAAEunB,IAAI,OAAOxgB,CAAC,CAAC,MAAMgI,EAAE/O,IAAI,MAAM2E,OAAO6iB,EAAEhqB,KAAK+pB,EAAE3hB,KAAKoH,EAAE8d,OAAO/jB,EAAE8b,SAAShb,EAAEkjB,KAAKhc,EAAE+T,SAASvW,EAAEye,KAAKnE,GAAG7mB,EAAE/B,SAAS,IAAIoC,SAAS8mB,GAAGnnB,EAAE/B,SAAS,OAAOkpB,GAAGna,GAAGkc,IAAI/B,EAAE,IAAIxhB,IAAIqH,GAAG3M,UAAU,CAACA,SAASygB,UAAUtiB,UAAU2oB,IAAIxiB,OAAO6iB,EAAEhqB,KAAK+pB,EAAE3hB,KAAKoH,EAAE8d,OAAO/jB,EAAE8b,SAAShb,EAAEkjB,KAAKhc,EAAE+T,SAASvW,EAAEye,KAAKnE,EAAEpmB,MAAMT,EAAE2d,QAAQld,MAAM3F,IAAIkF,EAAE2d,QAAQld,OAAOT,EAAE2d,QAAQld,MAAM3F,KAAK,UAAS,EAAGyR,EAAE,CAACvM,EAAEwnB,KAAK,IAAID,EAAE,GAAGva,EAAE+B,EAAE/O,GAAG6H,GAAE,EAAG0E,EAAE,OAAO,MAAM,CAAKtO,eAAW,OAAO+O,CAAC,EAAMie,oBAAgB,OAAOpjB,CAAC,EAAEqjB,wBAAwBrjB,GAAE,EAAG0E,GAAG,EAAE4e,OAAO3D,GAAGD,EAAErjB,KAAKsjB,GAAG,MAAMzgB,EAAE,KAAKiG,EAAE+B,EAAE/O,GAAGwnB,EAAE,CAACvpB,SAAS+O,EAAE8S,OAAO,OAAM,EAAG,OAAO9f,EAAE3B,iBAAiB,WAAW0I,GAAG,KAAK/G,EAAErB,oBAAoB,WAAWoI,GAAGwgB,EAAEA,EAAEjkB,QAAOtD,GAAGA,IAAIwnB,GAAC,CAAE,EAAEnK,SAASmK,GAAG/mB,MAAMomB,EAAExU,QAAQ6W,GAAE,GAAI,CAAC,GAAG,GAAG,iBAAiB1B,EAAExnB,EAAE2d,QAAQyN,GAAG5D,OAAO,CAACX,EAAE9f,EAAE,CAAC,EAAE8f,EAAE,CAAC/rB,IAAIoZ,KAAKmX,MAAM,KAAK,IAAIxjB,GAAGqhB,EAAElpB,EAAE2d,QAAQC,aAAaiJ,EAAE,KAAKW,GAAGxnB,EAAE2d,QAAQ2N,UAAUzE,EAAE,KAAKW,EAA+C,CAA5C,MAAMD,GAAGvnB,EAAE/B,SAASirB,EAAE,UAAU,UAAU1B,EAAE,CAAC,CAACxa,EAAE+B,EAAE/O,GAAG6H,GAAE,EAAG,MAAMsf,EAAE,IAAIxjB,SAAQ3D,GAAGuM,EAAEvM,IAAG,OAAOunB,EAAE1Z,SAAQ7N,GAAGA,EAAE,CAAC/B,SAAS+O,EAAE8S,OAAO,WAAUqH,CAAC,EAAC,EAAGN,EAAE,CAAC7mB,EAAE,OAAO,MAAMwnB,EAAExnB,EAAEqE,QAAQ,KAAKkjB,EAAE,CAAClnB,SAASmnB,GAAG,EAAExnB,EAAEurB,OAAO,EAAE/D,GAAGxnB,EAAE2E,OAAO6iB,GAAG,EAAExnB,EAAEurB,OAAO/D,GAAG,IAAI,IAAIxa,EAAE,EAAE,MAAMjG,EAAE,CAACwgB,GAAG1f,EAAE,CAAC,MAAM,MAAM,CAAK5J,eAAW,OAAO8I,EAAEiG,EAAE,EAAE3O,iBAAiB2B,EAAEwnB,GAAG,EAAE7oB,oBAAoBqB,EAAEwnB,GAAG,EAAE7J,QAAQ,CAAKvS,cAAU,OAAOrE,CAAC,EAAMsP,YAAQ,OAAOrJ,CAAC,EAAMvM,YAAQ,OAAOoH,EAAEmF,EAAE,EAAEse,UAAUtrB,EAAEwnB,EAAED,GAAG,MAAMxY,EAAExC,EAAE,IAAIgb,EAAE3iB,MAAM,KAAKoI,IAAIjG,EAAE7C,KAAK,CAAC7D,SAAS0O,EAAEpK,OAAO4H,EAAEtQ,OAAO,IAAIsQ,IAAIA,IAAI1E,EAAE3D,KAAKlE,EAAE,EAAE4d,aAAa5d,EAAEwnB,EAAED,GAAG,MAAMxY,EAAExC,EAAE,IAAIgb,EAAE3iB,MAAM,KAAKmC,EAAEiG,GAAG,CAAC3M,SAAS0O,EAAEpK,OAAO4H,GAAG1E,EAAEmF,GAA
GhN,CAAC,EAAEorB,GAAGprB,GAAG,MAAMwnB,EAAExa,EAAEhN,EAAEwnB,EAAE,GAAGA,EAAE3f,EAAE5L,OAAO,IAAI+Q,EAAEwa,EAAE,GAAE,EAAG0B,IAAI,oBAAoBtsB,SAASA,OAAOc,WAAWd,OAAOc,SAASqB,eAAeooB,EAAE5a,EAAE2c,EAAEtsB,OAAOiqB,MAAMxJ,SAAS9G,GAAG4Q,EAAE,SAAS2B,EAAE9oB,EAAEunB,GAAG,OAAO,sBAAsB,EAAEvnB,EAAEunB,EAAE,QAAQjE,WAAWC,mBAAmBD,WAAWC,iBAAiB,CAAC,GAAGD,WAAWC,iBAAiBvjB,KAAKsjB,WAAWC,iBAAiBvjB,GAAG,sBAAsBA,EAAEunB,IAAIjE,WAAWC,iBAAiBvjB,IAApM,CAAyMA,EAAEunB,GAAG,gBAAgBA,EAAE,CAAC,MAAMhU,EAAEuV,EAAE,OAAO,CAACzI,QAAQ,IAAIC,SAAS,MAAMwG,EAAEgC,EAAE,YAAY/B,EAAE,IAAI,aAAaxT,GAAG6U,EAAE,IAAI,aAAatB,GAAG,SAASyC,EAAEvpB,GAAG1D,KAAKkvB,IAAIxrB,CAAC,CAAC,MAAM+oB,EAAE/oB,GAAGA,aAAaupB,EAAE5B,EAAE3nB,IAAI,MAAM,IAAIupB,EAAEvpB,EAAC,EAAG,SAASmf,EAAEqI,GAAG,MAAMlK,GAAGiK,EAAElV,QAAQrF,GAAE,EAAGvM,MAAMsG,EAAE0kB,QAAQ5jB,EAAEwY,QAAQtR,GAAGyY,EAAE,aAAY,KAAK7jB,QAAQC,UAAUF,MAAK,KAAK,MAAM1D,EAAEknB,EAAEK,EAAExY,GAAGwH,EAAEmV,EAAE1rB,EAAEwnB,GAAG,CAACnV,QAAQrF,EAAEvM,MAAMsG,GAAE,GAAE,GAAG,IAAI,MAAMwF,EAAE2a,EAAEK,EAAExY,GAAG,OAAOlH,GAAG8f,EAAE+D,EAAEnf,EAAEib,IAAI,IAAI,CAAC,MAAMmE,EAAEnE,IAAI,MAAMD,EAAEa,KAAK/H,QAAQrT,GAAG+Z,IAAiB,OAAO,gBAAgB5H,EAAEpY,EAAE,CAAC,EAAEwgB,EAAE,CAAClH,QAAQrT,GAAGwa,GAAE,EAAGmE,EAAEvsB,UAAU,CAAC0N,KAAK,SAASwQ,GAAG,qBAAqB,MAAMiL,EAAE,CAACvoB,EAAEwnB,IAAIxnB,EAAEurB,OAAO,EAAE/D,EAAEvrB,UAAUurB,EAAEO,EAAE,CAAC/nB,EAAEwnB,KAAK,IAAID,EAAExgB,EAAE,MAAMc,GAAG2f,EAAE5iB,MAAM,KAAKmK,EAAE6c,EAAE/jB,GAAG0E,EAAE,KAAKwC,EAAE,GAAG8X,EAAEgF,EAAE7rB,GAAG,IAAI,IAAIA,EAAE,EAAE6H,EAAEgf,EAAE5qB,OAAO+D,EAAE6H,EAAE7H,IAAI,CAAC,IAAI6H,GAAE,EAAG,MAAMqhB,EAAErC,EAAE7mB,GAAGsG,MAAM,GAAG4iB,EAAE9uB,QAAQ,CAAC2M,EAAE,CAACT,MAAM4iB,EAAE1iB,OAAO,CAAC,EAAEglB,IAAIhE,GAAG,QAAQ,CAAC,MAAML,EAAEyE,EAAE1C,EAAExkB,MAAM6R,EAAE,CAAC,EAAEuS,EAAEgD,KAAKC,IAAIhd,EAAE9S,OAAOkrB,EAAElrB,QAAQ,IAAIsX,EAAE,EAAE,KAAKA,EAAEuV,EAAEvV,IAAI,CAAC,MAAMvT,EAAEmnB,EAAE5T,GAAGiU,EAAEzY,EAAEwE,GAAG,GAAGyY,EAAEhsB,GAAG,CAACuW,EAAEvW,EAAEzG,MAAM,IAAI,KAAKwV,EAAExV,MAAMga,GAAGnQ,IAAIkC,oBAAoB+F,KAAK,KAAK,KAAK,CAAC,QAAG,IAASmc,EAAE,CAAC3f,GAAE,EAAG,KAAK,CAAC,MAAM0f,EAAE0E,EAAEC,KAAKlsB,GAAG,GAAGunB,IAAIhb,EAAE,CAAC,MAAMvM,GAAG,IAAImsB,EAAE9nB,QAAQkjB,EAAE,IAAI,EAAEvnB,EAAE,6BAA6BunB,EAAE,gEAAgE2B,EAAExkB,UAAU,MAAMqC,EAAEzB,mBAAmBkiB,GAAGjR,EAAEgR,EAAE,IAAIxgB,CAAC,MAAM,GAAG/G,IAAIwnB,EAAE,CAAC3f,GAAE,EAAG,KAAK,CAAC,CAAC,IAAIA,EAAE,CAAC0f,EAAE,CAACjhB,MAAM4iB,EAAE1iB,OAAO+P,EAAEiV,IAAI,IAAIzc,EAAExV,MAAM,EAAEga,GAAGlI,KAAK,MAAM,KAAK,CAAC,CAAC,OAAOkc,GAAGxgB,GAAG,MAAMohB,EAAE,CAACnoB,EAAEwnB,IAAIO,EAAE,CAAC,CAACrjB,KAAK1E,IAAIwnB,GAAGN,EAAE,CAAClnB,EAAEwnB,KAAK,GAAGe,EAAEvoB,EAAE,KAAK,OAAOA,EAAE,MAAMunB,EAAEva,GAAGhN,EAAE4E,MAAM,MAAMmC,GAAGygB,EAAE5iB,MAAM,KAAKiD,EAAE+jB,EAAErE,GAAGxY,EAAE6c,EAAE7kB,GAAG,GAAG,KAAKc,EAAE,GAAG,OAAOukB,EAAErlB,EAAEiG,GAAG,IAAIub,EAAE1gB,EAAE,GAAG,KAAK,CAAC,MAAM7H,EAAE+O,EAAExS,OAAOsL,GAAGwD,KAAK,KAAK,OAAO+gB,GAAG,MAAMrlB,EAAE,GAAG,KAAK/G,EAAEgN,EAAE,CAAC,MAAMT,EAAEwC,EAAExS,OAAOsL,GAAGgf,EAAE,GAAG,IAAI,IAAI7mB,EAAE,EAAEwnB,EAAEjb,EAAEtQ,OAAO+D,EAAEwnB,EAAExnB,IAAI,CAAC,MAAMwnB,EAAEjb,EAAEvM,GAAG,OAAOwnB,EAAEX,EAAE5K,MAAM,MAAMuL,GAAGX,EAAE3iB,KAAKsjB,EAAE,CAAC,OAAO4E,EAAE,IAAIvF,EAAExb,KAAK,KAAK2B,EAAC,EAAG0e,EAAE,CAAC1rB,EAAEwnB,KAAK,MAAMD,EAAEva,EAAE,IAAIhN,EAAE4E,MAAM,KAAK,IAAImC,EAAE,IAAI6kB,EAAErE,GAAGnkB,KAAIpD,IAAI,MAAMunB,EAAE0E,EAAEC,KAAKlsB,GAAG,OAAOunB,EAAEC,EAAED,EAAE,IAAIvnB,KAAIqL,KAAK,KAAK,MAAMpN,UAAU0G,OAAOkD,EAAE,IAAI,CAAC,GAAG2f,EAAEzY,EAAElH,EAAEjD,MAAM,KAAK,IAAI,GAAG,OAAOmC,EAAEqlB,EAAErlB,EAAEiG,EAAE+B,GAAGhI,GAAGslB,EAAE,CAACrsB,EAAEwnB,KAAK,MAAMD,EAAEvnB,GAAGssB,EAAEtsB,GAAG,OAAO4rB,EAAE5rB,GAAGsD,OAAOikB,GAAGgF,O
AAOlhB,KAAK,OAAOugB,EAAEpE,GAAGlkB,OAAOikB,GAAGgF,OAAOlhB,KAAK,IAAG,EAAG4gB,EAAE,SAASK,EAAEtsB,GAAGisB,EAAEhf,KAAKjN,GAAGgsB,EAAEhsB,GAAGA,GAAG,MAAMA,EAAE,GAAGwsB,EAAE,CAACxsB,EAAEwnB,KAAI,CAAElhB,MAAMtG,EAAEysB,MAAMzsB,EAAE5F,QAAQ,EAAEwxB,EAAE5rB,EAAE0E,MAAMnB,QAAO,CAACvD,EAAEwnB,KAAKxnB,GAAG,EAAE,CAACA,GAAG,KAAKA,EAAT,CAAYwnB,GAAGxnB,GAAG,EAAEssB,EAAE9E,GAAGxnB,GAAG,EAAEgsB,EAAExE,GAAGxnB,GAAG,EAAEA,GAAG,EAAEA,IAAG,GAAGqW,MAAMmR,IAAIqE,EAAE7rB,GAAGA,EAAEoD,IAAIopB,GAAGD,MAAK,CAACvsB,EAAEwnB,IAAIxnB,EAAEysB,MAAMjF,EAAEiF,MAAM,EAAEzsB,EAAEysB,MAAMjF,EAAEiF,OAAO,EAAEzsB,EAAEqW,MAAMmR,EAAEnR,QAAOuV,EAAE5rB,GAAGA,EAAEqS,QAAQ,eAAe,IAAIzN,MAAM,KAAKwnB,EAAE,CAACpsB,KAAKwnB,IAAIxnB,IAAIwnB,EAAEA,EAAElkB,QAAOtD,GAAGA,GAAGA,EAAE/D,OAAO,MAAKurB,EAAEvrB,OAAO,EAAE,IAAIurB,EAAEnc,KAAK,OAAO,IAAI8gB,EAAE,CAAC,MAAM,QAAQ3D,EAAE,CAACxoB,EAAEwnB,KAAK,MAAMD,EAAE5sB,OAAO6O,KAAKxJ,GAAG,OAAOunB,EAAEtrB,SAAStB,OAAO6O,KAAKge,GAAGvrB,QAAQsrB,EAAEmF,OAAMnF,GAAGC,EAAExsB,eAAeusB,IAAIvnB,EAAEunB,KAAKC,EAAED,IAAE,EAAGoF,EAAE3sB,GAAGA,EAAEqS,QAAQ,eAAe,IAAIoX,EAAEjC,GAAGD,IAAI,IAAIA,EAAE,OAAO,KAAK,GAAGA,EAAEvjB,OAAO,YAAYujB,EAAElqB,MAAM6B,SAAS,OAAO,eAAeqoB,EAAElqB,MAAM6B,SAASuqB,EAAEjC,IAAI,GAAG,EAAED,EAAElqB,MAAMqH,MAAM6iB,EAAElqB,MAAMjD,SAASmtB,EAAEvjB,OAAO2nB,EAAE,iIAAiIpE,EAAEvjB,UAAU,KAAKujB,EAAEvjB,OAAO2nB,GAAGpE,EAAElqB,MAAMyP,MAAMya,EAAElqB,MAAMigB,IAAI,mBAAmBiK,EAAElqB,MAAMyP,aAAaya,EAAElqB,MAAMigB,qEAAqE,IAAIiK,EAAEvjB,OAAO2nB,IAAIU,EAAE9E,EAAElqB,MAAMyP,KAAKya,EAAElqB,MAAMigB,KAAK,mBAAmBiK,EAAElqB,MAAMyP,YAAYya,EAAElqB,MAAMigB,kGAAkGiK,EAAElqB,MAAMjD,QAAQ,MAAM,CAAC6E,MAAMsoB,EAAEntB,SAAQ,GAAI,MAAM2M,EAAEwgB,EAAEvjB,OAAO2nB,EAAEpE,EAAElqB,MAAMyP,KAAKya,EAAElqB,MAAMqH,KAAKmD,EAAE,MAAMd,EAAEygB,EAAE,GAAGmF,EAAEnF,MAAMmF,EAAE5lB,KAAK,MAAM,CAAC9H,MAAMsoB,EAAEntB,QAAQmtB,EAAElqB,MAAMjD,QAAQsK,KAAK6iB,EAAElqB,MAAM6B,SAAS,GAAGytB,EAAE9kB,OAAOA,EAAC,EAAG+kB,EAAE,CAAC,YAAYtF,EAAE,CAAC,KAAK,QAAQ,UAAU,YAAYuF,EAAE,CAAC,OAAO,IAAIC,WAAW9D,GAAG,qBAAE,IAASA,IAAIA,EAAEhpB,GAAGA,GAAG,MAAM6pB,EAAE,OAAOvB,EAAEU,GAAE,CAACxB,EAAED,KAAK,IAAIwF,SAAS/f,GAAGwa,EAAEzY,EAAElH,EAAE2f,EAAEoF,GAAG,MAAMvM,QAAQ9T,GAAGwa,KAAK9oB,SAAS4oB,GAAGuB,KAAK9K,GAAG4L,EAAEzoB,MAAM0mB,EAAE9U,QAAQyW,EAAEkE,SAASzZ,EAAEsW,GAAG9a,EAAE+X,EAAEjf,EAAEkH,EAAEuY,GAAGiC,EAAErC,EAAEgC,EAAE3c,GAAGwc,EAAEjI,UAAUyI,GAAG5B,EAAEd,EAAExmB,WAAW0oB,EAAE5J,EAAEoJ,EAAE1B,EAAExmB,SAAS0oB,GAAgB,OAAO,gBAAgB,IAAIhiB,EAAE,CAACnG,IAAI2mB,GAAGva,EAAE,eAAe2a,EAAE,YAAO,GAAQb,EAAEvT,EAAE,CAAC0Z,UAAUtF,EAAEuF,mBAAmB/N,EAAEvZ,KAAK2jB,EAAEtrB,SAAS4oB,IAAI,CAACjhB,KAAK2jB,EAAE4D,QAAQntB,IAAI,GAAG8mB,EAAEqG,SAASrG,EAAEqG,QAAQntB,GAAG,CAACA,IAAIA,EAAEotB,kBAAkB,IAAIptB,EAAEqtB,UAAUrtB,EAAEstB,SAASttB,EAAEutB,QAAQvtB,EAAEwtB,SAASxtB,EAAEytB,UAA3E,CAAsFztB,GAAG,CAACA,EAAE0tB,iBAAiB,IAAIlG,EAAEsB,EAAE,GAAG,kBAAkBA,GAAGnB,EAAE,CAAC,MAAM3nB,EAAE6H,EAAEd,EAAE,CAAC,EAAE8f,EAAEpmB,OAAOosB,GAAGrF,EAAEgB,EAAEzhB,EAAE,CAAC,EAAEogB,GAAGnnB,EAAE,CAACuW,EAAEgT,EAAE,CAAC9oB,MAAM0mB,EAAE9U,QAAQmV,GAAG,KAAI,IAAIc,EAAE1sB,YAAY,OAAO0sB,EAAElpB,UAAU,CAACke,GAAG,qBAAqB,MAAMoL,UAAU,YAAYpgB,eAAetI,GAAG2tB,SAAS3tB,GAAG1D,KAAKV,YAAY,uBAAuB,CAACgyB,qBAAqB5tB,GAAG1D,KAAKkjB,SAAS,CAAC,GAAGljB,KAAKe,MAAMwwB,WAAW7tB,EAAE,CAAClB,SAAS,OAAOxC,KAAKe,MAAM6B,QAAQ,EAAE,MAAM4uB,EAAE,gBAAgB,CAACF,kBAAkB,CAAC7sB,aAAQ,GAAQ2R,WAAM,EAAOqb,SAAS,KAAI,IAAK,SAAS5E,GAAGjqB,SAASsoB,IAAI,MAAMD,EAAEva,GAAG,aAAajG,EAAE,WAAWc,EAAE,WAAU,KAAI,CAAE+lB,kBAAkB7mB,EAAE2L,MAAM6U,EAAEwG,SAAS/gB,KAAI,CAACua,IAAiB,OAAO,gBAAgBuG,EAAE9uB,SAAS,CAACC,MAAM4I,GAAgB,gBAAgB6gB,EAAE,CAAChW,MAAM6U,EAAEsG,QAAQ,CAAC7tB,EAAEwnB,KAAKxa,EAAEhN,GAAG,MAAM+G,EAAEhG,SAASgG,
EAAEhG,QAAQf,EAAEwnB,EAAC,GAAIA,GAAG,CAAC2B,EAAEvtB,YAAY,+BAA+B,MAAMysB,EAAE,SAASb,GAAG,IAAID,EAAEva,EAAE,SAASnF,EAAE0f,GAAgB,OAAO,gBAAgB4B,EAAE,KAAkB,gBAAgB3B,EAAEzgB,EAAE,CAACjM,IAAI,oBAAoBysB,IAAI,CAAC,OAAO1f,EAAEjM,YAAY,qBAAqB,OAAO2rB,EAAE,OAAOva,EAAEwa,EAAE5rB,aAAaoR,EAAEwa,EAAEzc,MAAMwc,EAAE,eAAe1f,CAAC,CAAnP,EAAqP,EAAE8V,QAAQ6J,EAAEL,EAAEjoB,SAASqoB,MAAM,MAAMtpB,SAAS+O,GAAGwa,GAAGzgB,EAAEc,GAAG,WAAW,CAAC5J,SAAS+O,KAAK+B,GAAG,SAASyY,GAAG,MAAMD,EAAE,aAAauG,GAAGvG,EAAEqG,kBAAkB7sB,aAAQ,EAAO,MAAMiM,EAAE,eAAc,KAAKua,EAAEwG,cAAS,EAAM,GAAG,IAAI,MAAM,CAACxG,EAAE7U,MAAM1F,EAAE,CAA3I,GAA+I,GAAG,aAAY,KAAKwa,EAAE0D,uBAAsB,GAAG,CAACnkB,EAAE9I,WAAW,aAAY,KAAK,IAAI+B,GAAE,EAAG,MAAMunB,EAAEC,EAAE2D,QAAO,EAAEltB,SAASupB,MAAM7jB,QAAQC,UAAUF,MAAK,KAAK5G,uBAAsB,KAAKkD,GAAG6H,EAAE,CAAC5J,SAASupB,GAAE,GAAE,GAAE,IAAI,MAAM,KAAKxnB,GAAE,EAAGunB,GAAE,CAAC,GAAG,IAAIxY,EAAE,CAAC,IAAIga,EAAEha,GAAG,MAAMA,EAAEwH,EAAExH,EAAEyc,IAAI,CAACnZ,SAAQ,GAAI,CAAc,OAAO,gBAAgByU,EAAE9nB,SAAS,CAACC,MAAM8H,GAAG,mBAAmBwgB,EAAEA,EAAExgB,GAAGwgB,GAAG,KAAI,IAAIoB,EAAE,EAAEzpB,SAASsoB,MAAM,MAAMD,EAAEa,IAAI,OAAOb,EAAEC,EAAED,GAAgB,gBAAgBc,EAAE,KAAKb,EAAC,EAAGyB,EAAE,EAAExb,IAAI+Z,EAAEtoB,SAASqoB,MAAM,MAAMva,EAAEwa,EAAEnjB,QAAQ,KAAK,IAAI0C,EAAEc,EAAE,GAAG,OAAOmF,GAAG,GAAGjG,EAAEygB,EAAE5pB,UAAU,EAAEoP,GAAGnF,EAAE2f,EAAE5pB,UAAUoP,IAAIjG,EAAEygB,EAAe,gBAAgBV,EAAE9nB,SAAS,CAACC,MAAM,CAAChB,SAAS,CAACoC,SAAS0G,EAAEpC,OAAOkD,EAAErK,KAAK,MAAM+pB,EAAC,EAAGyG,GAAE,EAAEtpB,KAAK1E,EAAEd,SAASsoB,MAAM,MAAMnH,QAAQkH,GAAGR,KAAK9oB,SAAS+O,GAAGob,IAAIvgB,EAAEqf,EAAElnB,EAAEunB,GAAGxY,EAAEoZ,EAAEtgB,EAAEmF,EAAE3M,UAAU,OAAOmnB,EAAE,CAACvpB,SAAS+O,EAAEgU,MAAMjS,EAAEhI,EAAE,CAAC,EAAEgI,EAAEvI,OAAO,CAACglB,IAAIzc,EAAEyc,IAAI9mB,KAAK1E,IAAI,MAAK,EAAGiuB,GAAE,CAAC,MAAM,WAAW,aAAaC,GAAE,CAAC,WAAW,QAAQ,YAAY,MAAM,YAAYC,GAAG3G,IAAI,IAAIgE,IAAIjE,EAAEtpB,SAAS+O,EAAEhD,UAAU+E,GAAGyY,EAAEjb,EAAE1E,EAAE2f,EAAEyG,IAAgB,OAAO,gBAAgBG,GAAGrnB,EAAE,CAAC,EAAEwF,EAAE,CAACvC,UAAU+E,EAAEyc,IAAIjE,EAAEtpB,SAAS+O,IAAG,EAAG,IAAI0H,GAAG,EAAE,MAAM0Z,GAAG5G,IAAI,IAAItoB,SAASqoB,EAAEnL,MAAMpP,EAAEhD,UAAU+E,EAAE,MAAMyc,IAAIjf,EAAEtO,SAAS4oB,GAAGW,EAAE0B,EAAErhB,EAAE2f,EAAE0G,IAAG,MAAM/G,EAAE,WAAW5Q,EAAE,UAAS,GAAIuS,EAAE,SAASvc,GAAGgH,EAAE,SAASsT,EAAExmB,UAAUymB,EAAE,UAAS,GAAI,aAAY,KAAKpS,KAAKqS,IAAI,KAAKrS,KAAK,IAAIA,KAAK6B,EAAExV,SAAQ,EAAE,IAAI,IAAI,aAAY,KAAK,IAAIf,GAAE,EAAGwnB,GAAE,EAAGjb,IAAIuc,EAAE/nB,UAAU+nB,EAAE/nB,QAAQwL,EAAEvM,GAAE,GAAI6mB,EAAExmB,WAAWkT,EAAExS,UAAUwS,EAAExS,QAAQ8lB,EAAExmB,SAASmnB,GAAE,GAAIV,EAAE/lB,QAAQf,GAAGwnB,GAAGX,EAAExmB,WAAWkM,EAAEua,EAAE/lB,SAASgmB,GAAE,GAAG,CAACxa,EAAEsa,IAAI,MAAME,EAAE,eAAc,KAAK,IAAI/mB,EAAkCuW,EAAExV,QAAQwV,EAAExV,SAAQ,GAAIf,EAAEmnB,EAAEpmB,QAAQ+lB,EAAE/lB,SAASf,GAAGA,EAAEquB,QAAQ,GAAG,IAAiB,OAAO,gBAAgBtf,EAAEhI,EAAE,CAACqV,MAAMrV,EAAE,CAACunB,QAAQ,QAAQthB,GAAGuhB,SAAS,KAAK3tB,IAAIumB,GAAG+B,GAAG3B,EAAC,EAAGiH,GAAG,CAAC,WAAW,UAAU,WAAW,WAAW,UAAU,aAAaC,GAAGjH,IAAI,MAAMD,EAAER,IAAI/Z,EAAEob,IAAiB,OAAO,gBAAgBsG,GAAG3nB,EAAE,CAAC,EAAEwgB,EAAEva,EAAEwa,GAAE,EAAG,SAASkH,GAAGlH,GAAG,MAAMvpB,SAASspB,EAAEoH,QAAQ3hB,GAAE,EAAG9N,SAAS6P,EAAEuR,SAAS/T,EAAEvC,UAAU6c,EAAE,OAAOW,EAAE0B,EAAErhB,EAAE2f,EAAEgH,IAAIrH,EAAE,mBAAmBpY,GAAGxL,QAAO,CAACvD,EAAEwnB,KAAK,MAAMD,EAAEkC,EAAEld,EAAFkd,CAAKjC,GAAG,OAAOxnB,EAAEzD,OAAOgrB,EAAC,GAAG,KAAKlnB,SAASkW,GAAGgR,EAAEuB,EAAEf,EAAEZ,EAAE5Q,GAAG,GAAGuS,EAAE,CAAC,MAAMtiB,OAAOghB,EAAEgE,IAAI3jB,EAAEvB,MAAMyI,EAAEzI,OAAOrH,MAAMkoB,IAAI2B,EAAEvS,EAAExH,EAAE3U,QAAQmS,EAAEwC,EAAErK,KAAK2N,QAAQ,MAAM,IAAIyU,EAAE/f,EAAE,CAAC,EAAEygB,EAAE,CAACgE,IAAI3jB,EAAE5J,SAASspB,IAAIR,EAAE,eAAeI,EAAEL,EAAEK,EAAE9pB,MAAM6B,SAAsB,gBA
AgBuvB,GAAG,CAACxwB,SAASspB,EAAEoH,QAAQ3hB,GAAGma,EAAE9pB,MAAM6B,eAAU,GAAQkpB,EAAEpb,EAAEmhB,GAAGtH,EAAE0C,EAAEvc,EAAEjG,EAAE,CAACykB,IAAI3jB,EAAE5J,SAASspB,EAAEvd,UAAU6c,GAAGqC,GAAGA,EAAe,OAAO,gBAAgB3V,EAAEvU,SAAS,CAACC,MAAM,CAACohB,QAAQxY,EAAEyY,SAAS/J,IAAiB,gBAAgB6R,EAAEmB,EAAExC,GAAG,CAAC,OAAO,IAAI,CAAC,MAAM6H,GAAG,KAAK,MAAM5uB,EAAEooB,IAAI,IAAIpoB,EAAE,MAAM,IAAIuJ,MAAM,8JAA8J,OAAOvJ,EAAE/B,UAAU4wB,GAAG,KAAK,MAAM,IAAItlB,MAAM,wEAAuE,EAAGulB,GAAG,KAAK,MAAM9uB,EAAE+mB,IAAI,IAAI/mB,EAAE,MAAM,IAAIuJ,MAAM,4JAA4J,MAAMie,EAAEoH,KAAKrH,EAAEY,EAAEnoB,EAAEsgB,SAASkH,EAAEnnB,UAAU,OAAOknB,EAAEA,EAAE/gB,OAAO,MAAMuoB,GAAG/uB,IAAI,IAAIA,EAAE,MAAM,IAAIuJ,MAAM,4EAA4E,MAAMie,EAAET,IAAI,IAAIS,EAAE,MAAM,IAAIje,MAAM,2JAA2J,MAAMge,EAAEqH,KAAK5hB,EAAEka,EAAElnB,EAAEwnB,EAAEnH,SAASxY,EAAEsgB,EAAEnb,EAAEua,EAAElnB,UAAU,OAAOwH,EAAEd,EAAE,CAAC,EAAEc,EAAErB,OAAO,CAACglB,IAAI3jB,EAAE2jB,IAAI9mB,KAAK1E,IAAI,sNCCv+U,SAASuM,IAAI,OAAOA,EAAE5R,OAAOqX,OAAOrX,OAAOqX,OAAOhV,OAAO,SAASwqB,GAAG,IAAI,IAAIxnB,EAAE,EAAEA,EAAEhE,UAAUC,OAAO+D,IAAI,CAAC,IAAIgN,EAAEhR,UAAUgE,GAAG,IAAI,IAAIunB,KAAKva,EAAErS,OAAOI,UAAUC,eAAeC,KAAK+R,EAAEua,KAAKC,EAAED,GAAGva,EAAEua,GAAG,CAAC,OAAOC,CAAC,EAAEjb,EAAElQ,MAAMC,KAAKN,UAAU,CAAC,SAAS6L,EAAE2f,GAAG,IAAIxnB,EAAEwnB,GAAG,IAAIxa,EAAE,GAAGua,EAAE,GAAG,MAAMxgB,EAAE/G,EAAEqE,QAAQ,MAAM,IAAI0C,IAAIwgB,EAAEvnB,EAAEzG,MAAMwN,GAAG/G,EAAEA,EAAEzG,MAAM,EAAEwN,IAAI,MAAMgI,EAAE/O,EAAEqE,QAAQ,KAAK,OAAO,IAAI0K,IAAI/B,EAAEhN,EAAEzG,MAAMwV,GAAG/O,EAAEA,EAAEzG,MAAM,EAAEwV,IAAI,CAAC1O,SAASL,EAAE2E,OAAO,MAAMqI,EAAE,GAAGA,EAAExP,KAAK,MAAM+pB,EAAE,GAAGA,EAAE,CAAC,MAAMV,EAAE,6BAA6BM,EAAEK,IAAI,GAAG,iBAAiBA,EAAE,OAAO,CAACA,GAAGX,EAAE5Z,KAAKua,GAAX,CAAeA,EAAC,EAAwH,SAASsB,EAAEtB,EAAExnB,EAAkG,IAAkB,IAAIgN,EAAE,IAAIma,EAAEK,GAAG,OAAOA,EAAE,GAAGA,EAAE9hB,WAAW,OAAO8hB,EAAE9hB,WAAW,OAAO,OAAO8hB,EAAE,MAAMD,EAAE,OAAOva,EAAE,MAAMhN,EAAEA,EAAtP,IAA6PgN,EAAE,IAAI,MAAM,GAAG,MAAMua,GAAGA,EAAEnuB,SAAS,KAAKmuB,EAAEhuB,MAAM,GAAG,GAAGguB,IAAIC,EAAE9hB,WAAW,KAAK8hB,EAAE,IAAIA,KAAK,CAAC,MAAMT,EAAES,GAAG,MAAMA,OAAE,EAAOA,EAAE9hB,WAAW,KAA4E,SAAS4mB,EAAE9E,EAAExnB,GAAG,MAAMK,SAAS2M,EAAErI,OAAO4iB,EAAE/pB,KAAKuJ,GAAGc,EAAE2f,GAAG,MAAM,IAAG,OAAExa,EAAEhN,KAAKunB,IAAIxgB,GAAG,CAAC,MAAM+f,EAAE,CAACU,EAAExnB,IAAI,iBAAiBwnB,EAAEA,EAAEL,EAAEK,GAAGT,EAAES,GAAG,SAASA,GAAG,MAAMxnB,EAAE8oB,EAAEtB,GAAGxa,EAAlL,SAAwL,OAAgCsf,EAAEtsB,EAAEgN,EAAI,CAAvE,CAAyEwa,GAAG,SAASA,EAAExnB,GAAG,GAAG+mB,EAAES,GAAG,OAAOA,EAAE,MAAMD,EAA1Q,SAAgRxgB,GAAE,aAAEygB,EAAExnB,GAAG,OAAgCssB,EAAEvlB,EAAEwgB,EAAI,CAA5F,CAA8FC,EAAExnB,GAAGwnB,EAAEjU,EAAE,CAAC,KAAK,WAAW,UAAU,eAAe,kBAAkB,cAAc,WAAW,kBAAkB,QAAQ,UAAU,aAAa,SAAS6U,EAAEZ,GAAG,OAAOsB,EAAEtB,EAAv5B,GAA65B,CAAC,MAAMG,EAAE,CAACqH,gBAAgB,SAASC,YAAY,SAASC,gBAAgB,QAAQ,SAAS/P,EAAEqI,GAAgB,OAAO,gBAAgB,WAAE,MAAK,EAAEvpB,SAAS+O,KAAkB,gBAAgB+a,EAAExb,EAAE,CAAC,EAAEib,EAAE,CAAC2H,UAAUniB,MAAK,CAAC,MAAM+a,UAAU,YAAYzf,YAAYkf,GAAGmG,MAAMnG,GAAGlrB,KAAK8yB,gBAAgB,EAAElC,mBAAmB1F,EAAEyF,UAAUjtB,MAAM1D,KAAKe,MAAM6xB,gBAAgB1H,EAAExnB,GAAG,CAACqvB,UAAU,CAAC/yB,KAAKe,MAAMgyB,UAAU/yB,KAAKe,MAAM2xB,iBAAiB1rB,OAAOkE,SAAS6D,KAAK,KAAK+Q,MAAM7P,EAAE,CAAC,EAAEjQ,KAAKe,MAAM+e,MAAM9f,KAAKe,MAAM4xB,cAAc,KAAK,IAAIjvB,GAAE,EAAG,oBAAoBpD,QAAQA,OAAO0yB,uBAAuBtvB,GAAE,GAAI1D,KAAKmE,MAAM,CAAC8uB,YAAYvvB,GAAG1D,KAAKkzB,cAAc,KAAKlzB,KAAKmzB,UAAUnzB,KAAKmzB,UAAUzyB,KAAKV,KAAK,CAACozB,YAAY,IAAIlI,EAAE5qB,OAAOqB,SAASoC,SAASzD,OAAOqB,SAAS0G,OAAOrI,KAAKe,MAAM8xB,WAAW7yB,KAAKe,MAAM8xB,UAAU9uB,WAAWmnB,EAAElrB,KAAKe,MAAM8xB,UAAU9uB,SAAS/D,KAAKe,MAAM8xB,UAAUxqB,QAAQ,MAAM3E,EAAE6H,EAAEif,EAAExqB,KAAKe,MAAMigB,GAAGkK,IAAIxa,EAAEhN,EAAEK,SAASL,EAAE2E,OAAO,GAAG6iB,IAAIxa,E
AAE,OAAO6S,UAAUjL,QAAQ5H,EAAE,CAACtO,uBAAuB,IAAIpC,KAAKqzB,GAAG,OAAO,MAAM7nB,SAAS0f,EAAEpF,GAAGpiB,GAAG1D,KAAKqzB,GAAGrzB,KAAKkzB,eAAelzB,KAAKkzB,cAAczZ,QAAQyR,EAAEoI,UAAU5vB,GAAGwnB,EAAEqI,YAAY,CAACJ,UAAUjI,GAAGlrB,KAAKe,MAAM0vB,UAAUpyB,OAAOI,UAAUC,eAAeC,KAAKqB,KAAKe,MAAM0vB,SAAS,WAAWzwB,KAAKe,MAAM0vB,SAAShsB,QAAQymB,EAAElrB,KAAKe,MAAM0vB,UAAUzwB,KAAKe,MAAM0vB,SAASvF,GAAGlrB,KAAKmE,MAAM8uB,aAAa/H,IAAIlrB,KAAKqzB,GAAG,EAAEnI,EAAExnB,KAAK,MAAMgN,EAAE,IAAIpQ,OAAO0yB,sBAAqBtiB,IAAIA,EAAEa,SAAQb,IAAIwa,IAAIxa,EAAE8iB,QAAQ9vB,EAAEgN,EAAE+iB,gBAAgB/iB,EAAEgjB,kBAAkB,EAAC,GAAE,IAAI,OAAOhjB,EAAEijB,QAAQzI,GAAG,CAAC1f,SAASkF,EAAEoV,GAAGoF,EAAG,EAAtK,CAAwKA,GAAEA,IAAIA,EAAElrB,KAAKkzB,cAAclzB,KAAKozB,YAAYpzB,KAAKkzB,eAAelzB,KAAKkzB,cAAczZ,OAAM,IAAI,CAACjX,SAAS,MAAM0oB,EAAElrB,KAAKe,OAAOigB,GAAGtQ,EAAEggB,SAASzF,EAAEjrB,KAAK8yB,gBAAgBjC,QAAQpe,EAAEmhB,aAAarJ,EAAEpmB,MAAM8V,EAAElE,QAAQyW,EAAEqG,UAAUpI,GAAGS,EAAE0B,EAAE,SAAS1B,EAAExnB,GAAG,GAAG,MAAMwnB,EAAE,MAAM,CAAC,EAAE,IAAIxa,EAAEua,EAAExgB,EAAE,CAAC,EAAEgI,EAAEpU,OAAO6O,KAAKge,GAAG,IAAID,EAAE,EAAEA,EAAExY,EAAE9S,OAAOsrB,IAAIvnB,EAAEqE,QAAQ2I,EAAE+B,EAAEwY,KAAK,IAAIxgB,EAAEiG,GAAGwa,EAAExa,IAAI,OAAOjG,CAAC,CAAjI,CAAmIygB,EAAEjU,GAAgN+Y,EAAExF,EAAE9Z,EAAE+Z,EAAE1mB,UAAU,OAAO8mB,EAAEmF,GAAgB,gBAAgB,OAAE/f,EAAE,CAAC+Q,GAAGgP,EAAE7rB,MAAM8V,EAAEyW,SAASzF,EAAEwF,SAASzwB,KAAKmzB,UAAUS,aAAa1I,IAAIX,GAAGA,EAAEW,GAAG,MAAMxnB,EAAE6H,EAAEykB,GAAGzM,UAAUzI,SAASpX,EAAEK,SAASL,EAAE2E,OAAM,EAAGwoB,QAAQ3F,IAAI,GAAGzY,GAAGA,EAAEyY,KAAK,IAAIA,EAAE6F,QAAQ/wB,KAAKe,MAAMyyB,QAAQtI,EAAE4F,kBAAkB5F,EAAE8F,SAAS9F,EAAE+F,QAAQ/F,EAAEgG,SAAShG,EAAEiG,UAAU,CAACjG,EAAEkG,iBAAiB,IAAI1tB,EAAE8oB,EAAE,MAAM9b,EAAE8T,UAAUwL,KAAKvF,EAAE1mB,SAAS,kBAAkByoB,GAAG9b,IAAIhN,GAAE,GAAIpD,OAAOojB,YAAYsM,EAAE,CAAC7rB,MAAM8V,EAAElE,QAAQrS,GAAG,CAAC,OAAM,CAAC,GAAIkpB,IAAiB,gBAAgB,IAAI3c,EAAE,CAAC3G,KAAK0mB,GAAGpD,GAAG,EAAEnB,EAAE3oB,UAAUmN,EAAE,CAAC,EAAEob,EAAE,CAACwF,QAAQ,OAAO7P,GAAG,oBAAoBjL,QAAQ,OAAO5R,MAAM,WAAW,MAAMirB,EAAE,cAAa,CAAClE,EAAExa,IAAiB,gBAAgBmS,EAAE5S,EAAE,CAACwgB,SAAS/f,GAAGwa,MAAK+B,EAAE,CAAC/B,EAAExnB,KAAKpD,OAAOojB,YAAY8G,EAAEU,EAAE5qB,OAAOqB,SAASoC,UAAUL,EAAC,mQCA90I,SAAS+G,IAAI,OAAOA,EAAEpM,OAAOqX,OAAOrX,OAAOqX,OAAOhV,OAAO,SAASwqB,GAAG,IAAI,IAAIxnB,EAAE,EAAEA,EAAEhE,UAAUC,OAAO+D,IAAI,CAAC,IAAIgN,EAAEhR,UAAUgE,GAAG,IAAI,IAAIunB,KAAKva,EAAErS,OAAOI,UAAUC,eAAeC,KAAK+R,EAAEua,KAAKC,EAAED,GAAGva,EAAEua,GAAG,CAAC,OAAOC,CAAC,EAAEzgB,EAAE1K,MAAMC,KAAKN,UAAU,CAAC,MAAM6L,EAAE,IAAI9C,IAAIoiB,EAAE,CAAC3sB,IAAIgtB,GAAG3f,EAAErN,IAAIgtB,IAAI,GAAGrsB,IAAIqsB,EAAExnB,GAAG,MAAMgN,EAAEnF,EAAErN,IAAIgtB,IAAI,GAAGxa,EAAE9I,KAAKlE,GAAG6H,EAAE1M,IAAIqsB,EAAExa,EAAE,EAAEoE,OAAOoW,GAAG3f,EAAEuJ,OAAOoW,EAAE,GAAGzY,EAAE,oBAAoBpD,MAAMA,KAAK8K,qBAAqB9K,KAAK8K,oBAAoBzZ,KAAKJ,SAAS,SAAS4qB,GAAG,MAAMxnB,EAAEkU,KAAKmX,MAAM,OAAOpb,YAAW,WAAWuX,EAAE,CAAC2I,YAAW,EAAGC,cAAc,WAAW,OAAOtE,KAAKC,IAAI,EAAE,IAAI7X,KAAKmX,MAAMrrB,GAAG,GAAG,GAAE,EAAE,EAAE,IAAI6mB,EAAYW,KAAiFX,IAAIA,EAAE,CAAC,IAAnFwJ,YAAY,eAAe7I,EAAE8I,KAAK,OAAO9I,EAAE+I,cAAc,kBAA8B,MAAMhkB,EAAE,IAAIyE,IAAI,CAAC,MAAM,WAAW,0BAA0B,WAAW,SAAS,YAAYkY,EAAE,IAAIlY,IAAI8V,EAAE,IAAI/hB,IAAI,SAASgiB,EAAE/mB,GAAgB,OAAO,gBAAgB,WAAE,MAAK,IAAiB,gBAAgBuW,EAAEvW,IAAG,CAAC,SAASuW,EAAEvJ,GAAG,MAAMwjB,IAAI3oB,EAAE4oB,SAASlkB,EAAEsa,EAAEwJ,aAAarjB,GAAG,CAAC,GAAG3M,SAAS6oB,IAAG,mBAAI,IAAG,gBAAE,KAAK,IAAI1B,EAAE,OAAOjb,GAAG,KAAKsa,EAAEwJ,YAAY7I,EAAEY,EAAEpb,GAAG,MAAM,KAAK6Z,EAAEyJ,KAAKvhB,GAAE,KAAKyY,EAAEY,EAAEpb,EAAC,IAAI,MAAM,KAAK6Z,EAAE0J,cAAc,CAAC,MAAM/I,EAAErI,EAAEnS,GAAGma,EAAEhsB,IAAI+tB,EAAE1B,EAAE,EAAE,MAAM,KAAK,MAAMlmB,OAAOtB,EAAE0wB,aAAa1jB,EAAE2jB,cAAcpJ,GAAGC,GA
AG,CAAC,EAAExa,IAAI,MAAMhN,GAAGA,EAAErB,oBAAoB,OAAOqO,IAAIua,IAAI,MAAMvnB,GAAGA,EAAErB,oBAAoB,QAAQ4oB,IAAI,MAAMvnB,GAAGA,EAAE4B,QAAO,CAAC,GAAG,IAAI2K,IAAIsa,EAAE0J,cAAc,CAAC,MAAMvwB,EAAEuT,EAAEvG,GAAGua,EAAEpI,EAAEnS,GAAG,MAAM,oBAAoBpQ,QAAQuqB,EAAEhsB,IAAI+tB,EAAE3B,GAAgB,gBAAgB,SAASvnB,EAAE+G,EAAE,CAAC/C,KAAK,iBAAiB,gBAAgBuI,EAAE2K,YAAY,aAAaqQ,EAAE,CAACqJ,wBAAwB,CAACC,OAAOtd,EAAEvG,MAAMjG,EAAE,CAAC/C,KAAK,iBAAiBwsB,IAAI1H,EAAEjhB,GAAG,gBAAgB0E,EAAE2K,YAAY,aAAaqQ,GAAG,CAAC,OAAO,IAAI,CAAC,SAASa,EAAEZ,GAAG,MAAM3e,GAAG7I,EAAEwwB,IAAIxjB,EAAEyjB,SAASlJ,EAAEV,EAAEwJ,YAAYS,OAAOjpB,EAAEgmB,QAAQ1G,GAAGK,GAAG,CAAC,EAAEzY,EAAE/O,GAAGgN,EAAET,EAAE,CAAC,OAAO,SAASwa,EAAE,CAACgK,KAAKlpB,EAAE6K,MAAMyU,GAAG,GAAGpY,EAAE,CAAC,IAAI,MAAMyY,KAAKjb,EAAE,GAAG,MAAMwa,GAAGA,EAAES,GAAG,CAAC,IAAIjR,EAAE,MAAMvW,EAAE8mB,EAAEtsB,IAAIuU,IAAI,CAAC,GAAGiiB,UAAUhkB,EAAE,KAAK,MAAMhN,OAAE,EAAOA,EAAEwnB,KAAK,CAAC,EAAE,IAAIY,EAAEU,EAAE9b,EAAE9I,KAAK,MAAM6iB,OAAE,EAAOA,EAAES,IAAI,MAAMxnB,GAAG,OAAOuW,EAAEvW,EAAEwnB,KAAKjR,EAAEyG,MAAM,MAAM+J,GAAG,OAAOqB,EAAErB,EAAES,KAAKY,EAAEntB,KAAK8rB,EAAE,MAAM/mB,GAAG,OAAO8oB,EAAE9oB,EAAEwnB,SAAI,EAAOsB,EAAE9L,OAAO8J,EAAE3rB,IAAI4T,EAAEhI,EAAE,CAAC,EAAE/G,EAAE,CAAC,CAACwnB,GAAG,CAACwJ,UAAUhkB,KAAK,CAAC,GAAGkc,EAAE3uB,IAAIwU,GAAG,OAAO,IAAI,CAAC,MAAM4Y,EAAEpU,EAAEiU,GAAGN,EAAE/H,EAAEqI,GAAGO,EAAErqB,SAASqB,cAAc,UAAUiB,IAAI+nB,EAAElf,GAAG7I,GAAG+nB,EAAE/lB,QAAQyuB,SAASlJ,EAAE,IAAI,MAAMC,EAAExnB,KAAKrF,OAAOyQ,QAAQ8b,GAAGa,EAAEna,aAAa4Z,EAAExnB,GAAG2nB,IAAII,EAAEpJ,YAAYgJ,GAAG3a,IAAI+a,EAAEyI,IAAIxjB,GAAG,MAAMub,EAAE,CAAC,EAAE,GAAGxZ,EAAE,CAAC,IAAI,MAAMyY,KAAKjb,EAAE,CAAC,MAAMvM,EAAEA,GAAG+oB,EAAE/oB,EAAE+O,EAAEyY,GAAGO,EAAE1pB,iBAAiBmpB,EAAExnB,GAAGuoB,EAAE,GAAGf,aAAaxnB,CAAC,CAACkpB,EAAEpS,IAAI/H,EAAE,CAAC,OAAOrR,SAASuzB,KAAK3uB,YAAYylB,GAAG,CAACzmB,OAAOymB,EAAE2I,aAAanI,EAAEmI,aAAaC,cAAcpI,EAAEoI,cAAc,CAAC,SAASpd,EAAEiU,GAAG,MAAMoJ,wBAAwB5wB,EAAEd,SAAS8N,EAAE,IAAIwa,GAAG,CAAC,GAAGqJ,OAAOtJ,EAAE,IAAIvnB,GAAG,CAAC,EAAE,OAAOunB,GAAGva,CAAC,CAAC,SAASmS,EAAEqI,GAAG,MAAMxnB,EAAE,CAAC,EAAE,IAAI,MAAMgN,EAAEua,KAAK5sB,OAAOyQ,QAAQoc,GAAGjb,EAAEhS,IAAIyS,KAAKhN,EAAEgN,GAAGua,GAAG,OAAOvnB,CAAC,CAAC,SAAS8oB,EAAEtB,GAAG,GAAGA,EAAE,MAAM,4BAA4BniB,mBAAmBmiB,IAAI,CAAC,SAASuB,EAAEvB,EAAExnB,EAAEgN,GAAG,MAAMua,EAAET,EAAEtsB,IAAIwF,IAAI,CAAC,EAAE,IAAI,MAAMA,KAAK,MAAMunB,GAAG,OAAOxgB,EAAEwgB,EAAEva,SAAI,EAAOjG,EAAEiqB,YAAY,GAAG,CAAC,IAAIjqB,EAAE/G,EAAEwnB,EAAE,CAACV,EAAE3rB,IAAI6E,EAAE,CAAC,CAACgN,GAAG,CAACgQ,MAAMwK,IAAI","sources":["webpack://skohub-blog/./node_modules/gatsby-page-utils/dist/apply-trailing-slash-option.js","webpack://skohub-blog/./node_modules/gatsby-react-router-scroll/index.js","webpack://skohub-blog/./node_modules/gatsby-react-router-scroll/scroll-handler.js","webpack://skohub-blog/./node_modules/gatsby-react-router-scroll/session-storage.js","webpack://skohub-blog/./node_modules/gatsby-react-router-scroll/use-scroll-restoration.js","webpack://skohub-blog/./node_modules/gatsby/dist/internal-plugins/partytown/gatsby-browser.js","webpack://skohub-blog/./node_modules/gatsby/dist/internal-plugins/partytown/utils/get-forwards.js","webpack://skohub-blog/./node_modules/gatsby/dist/internal-plugins/partytown/utils/inject-partytown-snippet.js","webpack://skohub-blog/./.cache/_this_is_virtual_fs_path_/$virtual/async-requires.js","webpack://skohub-blog/./.cache/api-runner-browser-plugins.js","webpack://skohub-blog/./.cache/api-runner-browser.js","webpack://skohub-blog/./.cache/create-content-digest-browser-shim.js","webpack://skohub-blog/./.cache/emitter.js","webpack://skohub-blog/./node_modules/mitt/dist/
mitt.es.js","webpack://skohub-blog/./.cache/normalize-page-path.js","webpack://skohub-blog/./.cache/find-path.js","webpack://skohub-blog/./node_modules/@babel/runtime/helpers/esm/getPrototypeOf.js","webpack://skohub-blog/./node_modules/@babel/runtime/helpers/esm/isNativeReflectConstruct.js","webpack://skohub-blog/./node_modules/@babel/runtime/helpers/esm/construct.js","webpack://skohub-blog/./node_modules/@babel/runtime/helpers/esm/wrapNativeSuper.js","webpack://skohub-blog/./node_modules/@babel/runtime/helpers/esm/isNativeFunction.js","webpack://skohub-blog/./.cache/slice/server-slice-renderer.js","webpack://skohub-blog/./.cache/slice/server-slice.js","webpack://skohub-blog/./.cache/slice/inline-slice.js","webpack://skohub-blog/./.cache/slice.js","webpack://skohub-blog/./node_modules/@babel/runtime/helpers/esm/assertThisInitialized.js","webpack://skohub-blog/./.cache/gatsby-browser-entry.js","webpack://skohub-blog/./node_modules/@babel/runtime/helpers/esm/arrayLikeToArray.js","webpack://skohub-blog/./node_modules/@babel/runtime/helpers/esm/toConsumableArray.js","webpack://skohub-blog/./node_modules/@babel/runtime/helpers/esm/arrayWithoutHoles.js","webpack://skohub-blog/./node_modules/@babel/runtime/helpers/esm/iterableToArray.js","webpack://skohub-blog/./node_modules/@babel/runtime/helpers/esm/unsupportedIterableToArray.js","webpack://skohub-blog/./node_modules/@babel/runtime/helpers/esm/nonIterableSpread.js","webpack://skohub-blog/./.cache/prefetch.js","webpack://skohub-blog/./.cache/loader.js","webpack://skohub-blog/./.cache/head/components/fire-callback-in-effect.js","webpack://skohub-blog/./.cache/head/constants.js","webpack://skohub-blog/./.cache/head/utils.js","webpack://skohub-blog/./.cache/head/head-export-handler-for-browser.js","webpack://skohub-blog/./.cache/page-renderer.js","webpack://skohub-blog/./.cache/route-announcer-props.js","webpack://skohub-blog/./.cache/navigation.js","webpack://skohub-blog/./node_modules/shallow-compare/es/index.js","webpack://skohub-blog/./.cache/ensure-resources.js","webpack://skohub-blog/./.cache/production-app.js","webpack://skohub-blog/./.cache/public-page-renderer-prod.js","webpack://skohub-blog/./.cache/public-page-renderer.js","webpack://skohub-blog/./.cache/react-dom-utils.js","webpack://skohub-blog/./.cache/redirect-utils.js","webpack://skohub-blog/./.cache/register-service-worker.js","webpack://skohub-blog/./.cache/slice/context.js","webpack://skohub-blog/./.cache/static-query.js","webpack://skohub-blog/./.cache/context-utils.js","webpack://skohub-blog/./.cache/strip-prefix.js","webpack://skohub-blog/./node_modules/gatsby-plugin-canonical-urls/gatsby-browser.js","webpack://skohub-blog/./node_modules/gatsby-plugin-manifest/gatsby-browser.js","webpack://skohub-blog/./node_modules/gatsby-plugin-manifest/get-manifest-pathname.js","webpack://skohub-blog/./node_modules/gatsby-remark-images/constants.js","webpack://skohub-blog/./node_modules/gatsby-remark-images/gatsby-browser.js","webpack://skohub-blog/./node_modules/invariant/browser.js","webpack://skohub-blog/./node_modules/react-server-dom-webpack/cjs/react-server-dom-webpack.production.min.js","webpack://skohub-blog/./node_modules/react-server-dom-webpack/index.js","webpack://skohub-blog/./node_modules/@babel/runtime/helpers/assertThisInitialized.js","webpack://skohub-blog/./node_modules/@babel/runtime/helpers/inheritsLoose.js","webpack://skohub-blog/./node_modules/@babel/runtime/helpers/interopRequireDefault.js","webpack://skohub-blog/./node_modules/@babel/runtime/helpers/setPrototypeOf.js"
,"webpack://skohub-blog/./node_modules/@builder.io/partytown/integration/index.cjs","webpack://skohub-blog/./node_modules/@babel/runtime/helpers/esm/inheritsLoose.js","webpack://skohub-blog/./node_modules/@babel/runtime/helpers/esm/setPrototypeOf.js","webpack://skohub-blog/./node_modules/@gatsbyjs/reach-router/dist/index.modern.mjs","webpack://skohub-blog/./node_modules/gatsby-link/dist/index.modern.mjs","webpack://skohub-blog/./node_modules/gatsby-script/dist/index.modern.mjs"],"sourcesContent":["\"use strict\";\n\nexports.__esModule = true;\nexports.applyTrailingSlashOption = void 0;\nconst applyTrailingSlashOption = (input, option = `always`) => {\n const hasHtmlSuffix = input.endsWith(`.html`);\n const hasXmlSuffix = input.endsWith(`.xml`);\n const hasPdfSuffix = input.endsWith(`.pdf`);\n if (input === `/`) return input;\n if (hasHtmlSuffix || hasXmlSuffix || hasPdfSuffix) {\n option = `never`;\n }\n if (option === `always`) {\n return input.endsWith(`/`) ? input : `${input}/`;\n }\n if (option === `never`) {\n return input.endsWith(`/`) ? input.slice(0, -1) : input;\n }\n return input;\n};\nexports.applyTrailingSlashOption = applyTrailingSlashOption;","\"use strict\";\n\nexports.__esModule = true;\nexports.useScrollRestoration = exports.ScrollContext = void 0;\nvar _scrollHandler = require(\"./scroll-handler\");\nexports.ScrollContext = _scrollHandler.ScrollHandler;\nvar _useScrollRestoration = require(\"./use-scroll-restoration\");\nexports.useScrollRestoration = _useScrollRestoration.useScrollRestoration;","\"use strict\";\n\nvar _interopRequireDefault = require(\"@babel/runtime/helpers/interopRequireDefault\");\nexports.__esModule = true;\nexports.ScrollHandler = exports.ScrollContext = void 0;\nvar _assertThisInitialized2 = _interopRequireDefault(require(\"@babel/runtime/helpers/assertThisInitialized\"));\nvar _inheritsLoose2 = _interopRequireDefault(require(\"@babel/runtime/helpers/inheritsLoose\"));\nvar React = _interopRequireWildcard(require(\"react\"));\nvar _propTypes = _interopRequireDefault(require(\"prop-types\"));\nvar _sessionStorage = require(\"./session-storage\");\nfunction _getRequireWildcardCache(nodeInterop) { if (typeof WeakMap !== \"function\") return null; var cacheBabelInterop = new WeakMap(); var cacheNodeInterop = new WeakMap(); return (_getRequireWildcardCache = function _getRequireWildcardCache(nodeInterop) { return nodeInterop ? cacheNodeInterop : cacheBabelInterop; })(nodeInterop); }\nfunction _interopRequireWildcard(obj, nodeInterop) { if (!nodeInterop && obj && obj.__esModule) { return obj; } if (obj === null || typeof obj !== \"object\" && typeof obj !== \"function\") { return { default: obj }; } var cache = _getRequireWildcardCache(nodeInterop); if (cache && cache.has(obj)) { return cache.get(obj); } var newObj = {}; var hasPropertyDescriptor = Object.defineProperty && Object.getOwnPropertyDescriptor; for (var key in obj) { if (key !== \"default\" && Object.prototype.hasOwnProperty.call(obj, key)) { var desc = hasPropertyDescriptor ? 
Object.getOwnPropertyDescriptor(obj, key) : null; if (desc && (desc.get || desc.set)) { Object.defineProperty(newObj, key, desc); } else { newObj[key] = obj[key]; } } } newObj.default = obj; if (cache) { cache.set(obj, newObj); } return newObj; }\nvar ScrollContext = /*#__PURE__*/React.createContext(new _sessionStorage.SessionStorage());\nexports.ScrollContext = ScrollContext;\nScrollContext.displayName = \"GatsbyScrollContext\";\nvar ScrollHandler = /*#__PURE__*/function (_React$Component) {\n (0, _inheritsLoose2.default)(ScrollHandler, _React$Component);\n function ScrollHandler() {\n var _this;\n for (var _len = arguments.length, args = new Array(_len), _key = 0; _key < _len; _key++) {\n args[_key] = arguments[_key];\n }\n _this = _React$Component.call.apply(_React$Component, [this].concat(args)) || this;\n _this._stateStorage = new _sessionStorage.SessionStorage();\n _this._isTicking = false;\n _this._latestKnownScrollY = 0;\n _this.scrollListener = function () {\n _this._latestKnownScrollY = window.scrollY;\n if (!_this._isTicking) {\n _this._isTicking = true;\n requestAnimationFrame(_this._saveScroll.bind((0, _assertThisInitialized2.default)(_this)));\n }\n };\n _this.windowScroll = function (position, prevProps) {\n if (_this.shouldUpdateScroll(prevProps, _this.props)) {\n window.scrollTo(0, position);\n }\n };\n _this.scrollToHash = function (hash, prevProps) {\n var node = document.getElementById(hash.substring(1));\n if (node && _this.shouldUpdateScroll(prevProps, _this.props)) {\n node.scrollIntoView();\n }\n };\n _this.shouldUpdateScroll = function (prevRouterProps, routerProps) {\n var shouldUpdateScroll = _this.props.shouldUpdateScroll;\n if (!shouldUpdateScroll) {\n return true;\n }\n\n // Hack to allow accessing this._stateStorage.\n return shouldUpdateScroll.call((0, _assertThisInitialized2.default)(_this), prevRouterProps, routerProps);\n };\n return _this;\n }\n var _proto = ScrollHandler.prototype;\n _proto._saveScroll = function _saveScroll() {\n var key = this.props.location.key || null;\n if (key) {\n this._stateStorage.save(this.props.location, key, this._latestKnownScrollY);\n }\n this._isTicking = false;\n };\n _proto.componentDidMount = function componentDidMount() {\n window.addEventListener(\"scroll\", this.scrollListener);\n var scrollPosition;\n var _this$props$location = this.props.location,\n key = _this$props$location.key,\n hash = _this$props$location.hash;\n if (key) {\n scrollPosition = this._stateStorage.read(this.props.location, key);\n }\n\n /** If a hash is present in the browser url as the component mounts (i.e. the user is navigating\n * from an external website) then scroll to the hash instead of any previously stored scroll\n * position. 
*/\n if (hash) {\n this.scrollToHash(decodeURI(hash), undefined);\n } else if (scrollPosition) {\n this.windowScroll(scrollPosition, undefined);\n }\n };\n _proto.componentWillUnmount = function componentWillUnmount() {\n window.removeEventListener(\"scroll\", this.scrollListener);\n };\n _proto.componentDidUpdate = function componentDidUpdate(prevProps) {\n var _this$props$location2 = this.props.location,\n hash = _this$props$location2.hash,\n key = _this$props$location2.key;\n var scrollPosition;\n if (key) {\n scrollPosition = this._stateStorage.read(this.props.location, key);\n }\n\n /** There are two pieces of state: the browser url and\n * history state which keeps track of scroll position\n * Native behaviour prescribes that we ought to restore scroll position\n * when a user navigates back in their browser (this is the `POP` action)\n * Currently, reach router has a bug that prevents this at https://github.com/reach/router/issues/228\n * So we _always_ stick to the url as a source of truth — if the url\n * contains a hash, we scroll to it\n */\n\n if (hash) {\n this.scrollToHash(decodeURI(hash), prevProps);\n } else {\n this.windowScroll(scrollPosition, prevProps);\n }\n };\n _proto.render = function render() {\n return /*#__PURE__*/React.createElement(ScrollContext.Provider, {\n value: this._stateStorage\n }, this.props.children);\n };\n return ScrollHandler;\n}(React.Component);\nexports.ScrollHandler = ScrollHandler;\nScrollHandler.propTypes = {\n shouldUpdateScroll: _propTypes.default.func,\n children: _propTypes.default.element.isRequired,\n location: _propTypes.default.object.isRequired\n};","\"use strict\";\n\nexports.__esModule = true;\nexports.SessionStorage = void 0;\nvar STATE_KEY_PREFIX = \"@@scroll|\";\nvar GATSBY_ROUTER_SCROLL_STATE = \"___GATSBY_REACT_ROUTER_SCROLL\";\nvar SessionStorage = /*#__PURE__*/function () {\n function SessionStorage() {}\n var _proto = SessionStorage.prototype;\n _proto.read = function read(location, key) {\n var stateKey = this.getStateKey(location, key);\n try {\n var value = window.sessionStorage.getItem(stateKey);\n return value ? JSON.parse(value) : 0;\n } catch (e) {\n if (process.env.NODE_ENV !== \"production\") {\n console.warn(\"[gatsby-react-router-scroll] Unable to access sessionStorage; sessionStorage is not available.\");\n }\n if (window && window[GATSBY_ROUTER_SCROLL_STATE] && window[GATSBY_ROUTER_SCROLL_STATE][stateKey]) {\n return window[GATSBY_ROUTER_SCROLL_STATE][stateKey];\n }\n return 0;\n }\n };\n _proto.save = function save(location, key, value) {\n var stateKey = this.getStateKey(location, key);\n var storedValue = JSON.stringify(value);\n try {\n window.sessionStorage.setItem(stateKey, storedValue);\n } catch (e) {\n if (window && window[GATSBY_ROUTER_SCROLL_STATE]) {\n window[GATSBY_ROUTER_SCROLL_STATE][stateKey] = JSON.parse(storedValue);\n } else {\n window[GATSBY_ROUTER_SCROLL_STATE] = {};\n window[GATSBY_ROUTER_SCROLL_STATE][stateKey] = JSON.parse(storedValue);\n }\n if (process.env.NODE_ENV !== \"production\") {\n console.warn(\"[gatsby-react-router-scroll] Unable to save state in sessionStorage; sessionStorage is not available.\");\n }\n }\n };\n _proto.getStateKey = function getStateKey(location, key) {\n var stateKeyBase = \"\" + STATE_KEY_PREFIX + location.pathname;\n return key === null || typeof key === \"undefined\" ? 
this.state.pageResources.error\n }\n\n throw new Error(message)\n }\n\n return this.props.children(this.state)\n }\n}\n\nexport default EnsureResources\n","import { apiRunner, apiRunnerAsync } from \"./api-runner-browser\"\nimport React from \"react\"\nimport { Router, navigate, Location, BaseContext } from \"@gatsbyjs/reach-router\"\nimport { ScrollContext } from \"gatsby-react-router-scroll\"\nimport { StaticQueryContext } from \"./static-query\"\nimport {\n SlicesMapContext,\n SlicesContext,\n SlicesResultsContext,\n} from \"./slice/context\"\nimport {\n shouldUpdateScroll,\n init as navigationInit,\n RouteUpdates,\n} from \"./navigation\"\nimport emitter from \"./emitter\"\nimport PageRenderer from \"./page-renderer\"\nimport asyncRequires from \"$virtual/async-requires\"\nimport {\n setLoader,\n ProdLoader,\n publicLoader,\n PageResourceStatus,\n getStaticQueryResults,\n getSliceResults,\n} from \"./loader\"\nimport EnsureResources from \"./ensure-resources\"\nimport stripPrefix from \"./strip-prefix\"\n\n// Generated during bootstrap\nimport matchPaths from \"$virtual/match-paths.json\"\nimport { reactDOMUtils } from \"./react-dom-utils\"\n\nconst loader = new ProdLoader(asyncRequires, matchPaths, window.pageData)\nsetLoader(loader)\nloader.setApiRunner(apiRunner)\n\nconst { render, hydrate } = reactDOMUtils()\n\nwindow.asyncRequires = asyncRequires\nwindow.___emitter = emitter\nwindow.___loader = publicLoader\n\nnavigationInit()\n\nconst reloadStorageKey = `gatsby-reload-compilation-hash-match`\n\napiRunnerAsync(`onClientEntry`).then(() => {\n // Let plugins register a service worker. The plugin just needs\n // to return true.\n if (apiRunner(`registerServiceWorker`).filter(Boolean).length > 0) {\n require(`./register-service-worker`)\n }\n\n // In gatsby v2 if Router is used in page using matchPaths\n // paths need to contain full path.\n // For example:\n // - page have `/app/*` matchPath\n // - inside template user needs to use `/app/xyz` as path\n // Resetting `basepath`/`baseuri` keeps current behaviour\n // to not introduce breaking change.\n // Remove this in v3\n const RouteHandler = props => (\n \n \n \n )\n\n const DataContext = React.createContext({})\n\n const slicesContext = {\n renderEnvironment: `browser`,\n }\n\n class GatsbyRoot extends React.Component {\n render() {\n const { children } = this.props\n return (\n \n {({ location }) => (\n \n {({ pageResources, location }) => {\n const staticQueryResults = getStaticQueryResults()\n const sliceResults = getSliceResults()\n\n return (\n \n \n \n \n \n {children}\n \n \n \n \n \n )\n }}\n \n )}\n \n )\n }\n }\n\n class LocationHandler extends React.Component {\n render() {\n return (\n \n {({ pageResources, location }) => (\n \n \n \n \n \n \n \n )}\n \n )\n }\n }\n\n const { pagePath, location: browserLoc } = window\n\n // Explicitly call navigate if the canonical path (window.pagePath)\n // is different to the browser path (window.location.pathname). SSR\n // page paths might include search params, while SSG and DSG won't.\n // If page path include search params we also compare query params.\n // But only if NONE of the following conditions hold:\n //\n // - The url matches a client side route (page.matchPath)\n // - it's a 404 page\n // - it's the offline plugin shell (/offline-plugin-app-shell-fallback/)\n if (\n pagePath &&\n __BASE_PATH__ + pagePath !==\n browserLoc.pathname + (pagePath.includes(`?`) ? 
browserLoc.search : ``) &&\n !(\n loader.findMatchPath(stripPrefix(browserLoc.pathname, __BASE_PATH__)) ||\n pagePath.match(/^\\/(404|500)(\\/?|.html)$/) ||\n pagePath.match(/^\\/offline-plugin-app-shell-fallback\\/?$/)\n )\n ) {\n navigate(\n __BASE_PATH__ +\n pagePath +\n (!pagePath.includes(`?`) ? browserLoc.search : ``) +\n browserLoc.hash,\n {\n replace: true,\n }\n )\n }\n\n // It's possible that sessionStorage can throw an exception if access is not granted, see https://github.com/gatsbyjs/gatsby/issues/34512\n const getSessionStorage = () => {\n try {\n return sessionStorage\n } catch {\n return null\n }\n }\n\n publicLoader.loadPage(browserLoc.pathname + browserLoc.search).then(page => {\n const sessionStorage = getSessionStorage()\n\n if (\n page?.page?.webpackCompilationHash &&\n page.page.webpackCompilationHash !== window.___webpackCompilationHash\n ) {\n // Purge plugin-offline cache\n if (\n `serviceWorker` in navigator &&\n navigator.serviceWorker.controller !== null &&\n navigator.serviceWorker.controller.state === `activated`\n ) {\n navigator.serviceWorker.controller.postMessage({\n gatsbyApi: `clearPathResources`,\n })\n }\n\n // We have not matching html + js (inlined `window.___webpackCompilationHash`)\n // with our data (coming from `app-data.json` file). This can cause issues such as\n // errors trying to load static queries (as list of static queries is inside `page-data`\n // which might not match to currently loaded `.js` scripts).\n // We are making attempt to reload if hashes don't match, but we also have to handle case\n // when reload doesn't fix it (possibly broken deploy) so we don't end up in infinite reload loop\n if (sessionStorage) {\n const isReloaded = sessionStorage.getItem(reloadStorageKey) === `1`\n\n if (!isReloaded) {\n sessionStorage.setItem(reloadStorageKey, `1`)\n window.location.reload(true)\n return\n }\n }\n }\n\n if (sessionStorage) {\n sessionStorage.removeItem(reloadStorageKey)\n }\n\n if (!page || page.status === PageResourceStatus.Error) {\n const message = `page resources for ${browserLoc.pathname} not found. Not rendering React`\n\n // if the chunk throws an error we want to capture the real error\n // This should help with https://github.com/gatsbyjs/gatsby/issues/19618\n if (page && page.error) {\n console.error(message)\n throw page.error\n }\n\n throw new Error(message)\n }\n\n const SiteRoot = apiRunner(\n `wrapRootElement`,\n { element: },\n ,\n ({ result }) => {\n return { element: result }\n }\n ).pop()\n\n const App = function App() {\n const onClientEntryRanRef = React.useRef(false)\n\n React.useEffect(() => {\n if (!onClientEntryRanRef.current) {\n onClientEntryRanRef.current = true\n if (performance.mark) {\n performance.mark(`onInitialClientRender`)\n }\n\n apiRunner(`onInitialClientRender`)\n }\n }, [])\n\n return {SiteRoot}\n }\n\n const focusEl = document.getElementById(`gatsby-focus-wrapper`)\n\n // Client only pages have any empty body so we just do a normal\n // render to avoid React complaining about hydration mis-matches.\n let defaultRenderer = render\n if (focusEl && focusEl.children.length) {\n defaultRenderer = hydrate\n }\n\n const renderer = apiRunner(\n `replaceHydrateFunction`,\n undefined,\n defaultRenderer\n )[0]\n\n function runRender() {\n const rootElement =\n typeof window !== `undefined`\n ? 
document.getElementById(`___gatsby`)\n : null\n\n renderer(, rootElement)\n }\n\n // https://github.com/madrobby/zepto/blob/b5ed8d607f67724788ec9ff492be297f64d47dfc/src/zepto.js#L439-L450\n // TODO remove IE 10 support\n const doc = document\n if (\n doc.readyState === `complete` ||\n (doc.readyState !== `loading` && !doc.documentElement.doScroll)\n ) {\n setTimeout(function () {\n runRender()\n }, 0)\n } else {\n const handler = function () {\n doc.removeEventListener(`DOMContentLoaded`, handler, false)\n window.removeEventListener(`load`, handler, false)\n\n runRender()\n }\n\n doc.addEventListener(`DOMContentLoaded`, handler, false)\n window.addEventListener(`load`, handler, false)\n }\n\n return\n })\n})\n","import React from \"react\"\nimport PropTypes from \"prop-types\"\n\nimport loader from \"./loader\"\nimport InternalPageRenderer from \"./page-renderer\"\n\nconst ProdPageRenderer = ({ location }) => {\n const pageResources = loader.loadPageSync(location.pathname)\n if (!pageResources) {\n return null\n }\n return React.createElement(InternalPageRenderer, {\n location,\n pageResources,\n ...pageResources.json,\n })\n}\n\nProdPageRenderer.propTypes = {\n location: PropTypes.shape({\n pathname: PropTypes.string.isRequired,\n }).isRequired,\n}\n\nexport default ProdPageRenderer\n","const preferDefault = m => (m && m.default) || m\n\nif (process.env.BUILD_STAGE === `develop`) {\n module.exports = preferDefault(require(`./public-page-renderer-dev`))\n} else if (process.env.BUILD_STAGE === `build-javascript`) {\n module.exports = preferDefault(require(`./public-page-renderer-prod`))\n} else {\n module.exports = () => null\n}\n","const map = new WeakMap()\n\nexport function reactDOMUtils() {\n const reactDomClient = require(`react-dom/client`)\n\n const render = (Component, el) => {\n let root = map.get(el)\n if (!root) {\n map.set(el, (root = reactDomClient.createRoot(el)))\n }\n root.render(Component)\n }\n\n const hydrate = (Component, el) => reactDomClient.hydrateRoot(el, Component)\n\n return { render, hydrate }\n}\n","import redirects from \"./redirects.json\"\n\n// Convert to a map for faster lookup in maybeRedirect()\n\nconst redirectMap = new Map()\nconst redirectIgnoreCaseMap = new Map()\n\nredirects.forEach(redirect => {\n if (redirect.ignoreCase) {\n redirectIgnoreCaseMap.set(redirect.fromPath, redirect)\n } else {\n redirectMap.set(redirect.fromPath, redirect)\n }\n})\n\nexport function maybeGetBrowserRedirect(pathname) {\n let redirect = redirectMap.get(pathname)\n if (!redirect) {\n redirect = redirectIgnoreCaseMap.get(pathname.toLowerCase())\n }\n return redirect\n}\n","import { apiRunner } from \"./api-runner-browser\"\n\nif (\n window.location.protocol !== `https:` &&\n window.location.hostname !== `localhost`\n) {\n console.error(\n `Service workers can only be used over HTTPS, or on localhost for development`\n )\n} else if (`serviceWorker` in navigator) {\n navigator.serviceWorker\n .register(`${__BASE_PATH__}/sw.js`)\n .then(function (reg) {\n reg.addEventListener(`updatefound`, () => {\n apiRunner(`onServiceWorkerUpdateFound`, { serviceWorker: reg })\n // The updatefound event implies that reg.installing is set; see\n // https://w3c.github.io/ServiceWorker/#service-worker-registration-updatefound-event\n const installingWorker = reg.installing\n console.log(`installingWorker`, installingWorker)\n installingWorker.addEventListener(`statechange`, () => {\n switch (installingWorker.state) {\n case `installed`:\n if (navigator.serviceWorker.controller) {\n // At this 
point, the old content will have been purged and the fresh content will\n // have been added to the cache.\n\n // We set a flag so Gatsby Link knows to refresh the page on next navigation attempt\n window.___swUpdated = true\n // We call the onServiceWorkerUpdateReady API so users can show update prompts.\n apiRunner(`onServiceWorkerUpdateReady`, { serviceWorker: reg })\n\n // If resources failed for the current page, reload.\n if (window.___failedResources) {\n console.log(`resources failed, SW updated - reloading`)\n window.location.reload()\n }\n } else {\n // At this point, everything has been precached.\n // It's the perfect time to display a \"Content is cached for offline use.\" message.\n console.log(`Content is now available offline!`)\n\n // Post to service worker that install is complete.\n // Delay to allow time for the event listener to be added --\n // otherwise fetch is called too soon and resources aren't cached.\n apiRunner(`onServiceWorkerInstalled`, { serviceWorker: reg })\n }\n break\n\n case `redundant`:\n console.error(`The installing service worker became redundant.`)\n apiRunner(`onServiceWorkerRedundant`, { serviceWorker: reg })\n break\n\n case `activated`:\n apiRunner(`onServiceWorkerActive`, { serviceWorker: reg })\n break\n }\n })\n })\n })\n .catch(function (e) {\n console.error(`Error during service worker registration:`, e)\n })\n}\n","import React from \"react\"\n\nconst SlicesResultsContext = React.createContext({})\nconst SlicesContext = React.createContext({})\nconst SlicesMapContext = React.createContext({})\nconst SlicesPropsContext = React.createContext({})\n\nexport {\n SlicesResultsContext,\n SlicesContext,\n SlicesMapContext,\n SlicesPropsContext,\n}\n","import React from \"react\"\nimport PropTypes from \"prop-types\"\nimport { createServerOrClientContext } from \"./context-utils\"\n\nconst StaticQueryContext = createServerOrClientContext(`StaticQuery`, {})\n\nfunction StaticQueryDataRenderer({ staticQueryData, data, query, render }) {\n const finalData = data\n ? data.data\n : staticQueryData[query] && staticQueryData[query].data\n\n return (\n \n {finalData && render(finalData)}\n {!finalData &&
Loading (StaticQuery)
}\n \n )\n}\n\nlet warnedAboutStaticQuery = false\n\n// TODO(v6): Remove completely\nconst StaticQuery = props => {\n const { data, query, render, children } = props\n\n if (process.env.NODE_ENV === `development` && !warnedAboutStaticQuery) {\n console.warn(\n `The component is deprecated and will be removed in Gatsby v6. Use useStaticQuery instead. Refer to the migration guide for more information: https://gatsby.dev/migrating-4-to-5/#staticquery--is-deprecated`\n )\n warnedAboutStaticQuery = true\n }\n\n return (\n \n {staticQueryData => (\n \n )}\n \n )\n}\n\nStaticQuery.propTypes = {\n data: PropTypes.object,\n query: PropTypes.string.isRequired,\n render: PropTypes.func,\n children: PropTypes.func,\n}\n\nconst useStaticQuery = query => {\n if (\n typeof React.useContext !== `function` &&\n process.env.NODE_ENV === `development`\n ) {\n // TODO(v5): Remove since we require React >= 18\n throw new Error(\n `You're likely using a version of React that doesn't support Hooks\\n` +\n `Please update React and ReactDOM to 16.8.0 or later to use the useStaticQuery hook.`\n )\n }\n\n const context = React.useContext(StaticQueryContext)\n\n // query is a stringified number like `3303882` when wrapped with graphql, If a user forgets\n // to wrap the query in a grqphql, then casting it to a Number results in `NaN` allowing us to\n // catch the misuse of the API and give proper direction\n if (isNaN(Number(query))) {\n throw new Error(`useStaticQuery was called with a string but expects to be called using \\`graphql\\`. Try this:\n\nimport { useStaticQuery, graphql } from 'gatsby';\n\nuseStaticQuery(graphql\\`${query}\\`);\n`)\n }\n\n if (context[query]?.data) {\n return context[query].data\n } else {\n throw new Error(\n `The result of this StaticQuery could not be fetched.\\n\\n` +\n `This is likely a bug in Gatsby and if refreshing the page does not fix it, ` +\n `please open an issue in https://github.com/gatsbyjs/gatsby/issues`\n )\n }\n}\n\nexport { StaticQuery, StaticQueryContext, useStaticQuery }\n","import React from \"react\"\n\n// Ensure serverContext is not created more than once as React will throw when creating it more than once\n// https://github.com/facebook/react/blob/dd2d6522754f52c70d02c51db25eb7cbd5d1c8eb/packages/react/src/ReactServerContext.js#L101\nconst createServerContext = (name, defaultValue = null) => {\n /* eslint-disable no-undef */\n if (!globalThis.__SERVER_CONTEXT) {\n globalThis.__SERVER_CONTEXT = {}\n }\n\n if (!globalThis.__SERVER_CONTEXT[name]) {\n globalThis.__SERVER_CONTEXT[name] = React.createServerContext(\n name,\n defaultValue\n )\n }\n\n return globalThis.__SERVER_CONTEXT[name]\n}\n\nfunction createServerOrClientContext(name, defaultValue) {\n if (React.createServerContext) {\n return createServerContext(name, defaultValue)\n }\n\n return React.createContext(defaultValue)\n}\n\nexport { createServerOrClientContext }\n","/**\n * Remove a prefix from a string. 
Return the input string if the given prefix\n * isn't found.\n */\n\nexport default function stripPrefix(str, prefix = ``) {\n if (!prefix) {\n return str\n }\n\n if (str === prefix) {\n return `/`\n }\n\n if (str.startsWith(`${prefix}/`)) {\n return str.slice(prefix.length)\n }\n\n return str\n}\n","export const onRouteUpdate = ({\n location\n}, pluginOptions = {\n stripQueryString: false\n}) => {\n const domElem = document.querySelector(`link[rel='canonical']`);\n const existingValue = domElem.getAttribute(`href`);\n const baseProtocol = domElem.getAttribute(`data-baseProtocol`);\n const baseHost = domElem.getAttribute(`data-baseHost`);\n\n if (existingValue && baseProtocol && baseHost) {\n let value = `${baseProtocol}//${baseHost}${location.pathname}`;\n const {\n stripQueryString\n } = pluginOptions;\n\n if (!stripQueryString) {\n value += location.search;\n }\n\n value += location.hash;\n domElem.setAttribute(`href`, `${value}`);\n }\n};","/* global __MANIFEST_PLUGIN_HAS_LOCALISATION__ */\nimport { withPrefix } from \"gatsby\";\nimport getManifestForPathname from \"./get-manifest-pathname\"; // when we don't have localisation in our manifest, we tree shake everything away\n\nexport const onRouteUpdate = function onRouteUpdate({\n location\n}, pluginOptions) {\n if (__MANIFEST_PLUGIN_HAS_LOCALISATION__) {\n const {\n localize\n } = pluginOptions;\n const manifestFilename = getManifestForPathname(location.pathname, localize, true);\n const manifestEl = document.head.querySelector(`link[rel=\"manifest\"]`);\n\n if (manifestEl) {\n manifestEl.setAttribute(`href`, withPrefix(manifestFilename));\n }\n }\n};","\"use strict\";\n\nexports.__esModule = true;\nexports.default = void 0;\n\nvar _gatsby = require(\"gatsby\");\n\n/**\n * Get a manifest filename depending on localized pathname\n *\n * @param {string} pathname\n * @param {Array<{start_url: string, lang: string}>} localizedManifests\n * @param {boolean} shouldPrependPathPrefix\n * @return string\n */\nvar _default = (pathname, localizedManifests, shouldPrependPathPrefix = false) => {\n const defaultFilename = `manifest.webmanifest`;\n\n if (!Array.isArray(localizedManifests)) {\n return defaultFilename;\n }\n\n const localizedManifest = localizedManifests.find(app => {\n let startUrl = app.start_url;\n\n if (shouldPrependPathPrefix) {\n startUrl = (0, _gatsby.withPrefix)(startUrl);\n }\n\n return pathname.startsWith(startUrl);\n });\n\n if (!localizedManifest) {\n return defaultFilename;\n }\n\n return `manifest_${localizedManifest.lang}.webmanifest`;\n};\n\nexports.default = _default;","\"use strict\";\n\nexports.DEFAULT_OPTIONS = {\n maxWidth: 650,\n wrapperStyle: \"\",\n backgroundColor: \"white\",\n linkImagesToOriginal: true,\n showCaptions: false,\n markdownCaptions: false,\n withWebp: false,\n withAvif: false,\n tracedSVG: false,\n loading: \"lazy\",\n decoding: \"async\",\n disableBgImageOnAlpha: false,\n disableBgImage: false\n};\nexports.EMPTY_ALT = \"GATSBY_EMPTY_ALT\";\nexports.imageClass = \"gatsby-resp-image-image\";\nexports.imageWrapperClass = \"gatsby-resp-image-wrapper\";\nexports.imageBackgroundClass = \"gatsby-resp-image-background-image\";","\"use strict\";\n\nvar _require = require(\"./constants\"),\n DEFAULT_OPTIONS = _require.DEFAULT_OPTIONS,\n imageClass = _require.imageClass,\n imageBackgroundClass = _require.imageBackgroundClass,\n imageWrapperClass = _require.imageWrapperClass;\n\nexports.onRouteUpdate = function (apiCallbackContext, pluginOptions) {\n var options = Object.assign({}, DEFAULT_OPTIONS, 
pluginOptions);\n var imageWrappers = document.querySelectorAll(\".\" + imageWrapperClass); // https://css-tricks.com/snippets/javascript/loop-queryselectorall-matches/\n // for cross-browser looping through NodeList without polyfills\n\n var _loop = function _loop(i) {\n var imageWrapper = imageWrappers[i];\n var backgroundElement = imageWrapper.querySelector(\".\" + imageBackgroundClass);\n var imageElement = imageWrapper.querySelector(\".\" + imageClass);\n\n var onImageLoad = function onImageLoad() {\n backgroundElement.style.transition = \"opacity 0.5s 0.5s\";\n imageElement.style.transition = \"opacity 0.5s\";\n onImageComplete();\n };\n\n var onImageComplete = function onImageComplete() {\n backgroundElement.style.opacity = 0;\n imageElement.style.opacity = 1;\n imageElement.style.color = \"inherit\";\n imageElement.style.boxShadow = \"inset 0px 0px 0px 400px \" + options.backgroundColor;\n imageElement.removeEventListener(\"load\", onImageLoad);\n imageElement.removeEventListener(\"error\", onImageComplete);\n };\n\n imageElement.style.opacity = 0;\n imageElement.addEventListener(\"load\", onImageLoad);\n imageElement.addEventListener(\"error\", onImageComplete);\n\n if (imageElement.complete) {\n onImageComplete();\n }\n };\n\n for (var i = 0; i < imageWrappers.length; i++) {\n _loop(i);\n }\n};","/**\n * Copyright (c) 2013-present, Facebook, Inc.\n *\n * This source code is licensed under the MIT license found in the\n * LICENSE file in the root directory of this source tree.\n */\n\n'use strict';\n\n/**\n * Use invariant() to assert state which your program assumes to be true.\n *\n * Provide sprintf-style format (only %s is supported) and arguments\n * to provide information about what broke and what you were\n * expecting.\n *\n * The invariant message will be stripped in production, but the invariant\n * will remain to ensure logic does not differ in production.\n */\n\nvar invariant = function(condition, format, a, b, c, d, e, f) {\n if (process.env.NODE_ENV !== 'production') {\n if (format === undefined) {\n throw new Error('invariant requires an error message argument');\n }\n }\n\n if (!condition) {\n var error;\n if (format === undefined) {\n error = new Error(\n 'Minified exception occurred; use the non-minified dev environment ' +\n 'for the full error message and additional helpful warnings.'\n );\n } else {\n var args = [a, b, c, d, e, f];\n var argIndex = 0;\n error = new Error(\n format.replace(/%s/g, function() { return args[argIndex++]; })\n );\n error.name = 'Invariant Violation';\n }\n\n error.framesToPop = 1; // we don't care about invariant's own frame\n throw error;\n }\n};\n\nmodule.exports = invariant;\n","/**\n * @license React\n * react-server-dom-webpack.production.min.js\n *\n * Copyright (c) Facebook, Inc. 
and its affiliates.\n *\n * This source code is licensed under the MIT license found in the\n * LICENSE file in the root directory of this source tree.\n */\n'use strict';var k=require(\"react\"),l={stream:!0},n=new Map,p=Symbol.for(\"react.element\"),q=Symbol.for(\"react.lazy\"),r=Symbol.for(\"react.default_value\"),t=k.__SECRET_INTERNALS_DO_NOT_USE_OR_YOU_WILL_BE_FIRED.ContextRegistry;function u(a){t[a]||(t[a]=k.createServerContext(a,r));return t[a]}function v(a,b,c){this._status=a;this._value=b;this._response=c}v.prototype.then=function(a){0===this._status?(null===this._value&&(this._value=[]),this._value.push(a)):a()};\nfunction w(a){switch(a._status){case 3:return a._value;case 1:var b=JSON.parse(a._value,a._response._fromJSON);a._status=3;return a._value=b;case 2:b=a._value;for(var c=b.chunks,d=0;d {\n const { forward = [], ...filteredConfig } = config || {};\n const configStr = JSON.stringify(filteredConfig, (k, v) => {\n if (typeof v === 'function') {\n v = String(v);\n if (v.startsWith(k + '(')) {\n v = 'function ' + v;\n }\n }\n return v;\n });\n return [\n `!(function(w,p,f,c){`,\n Object.keys(filteredConfig).length > 0\n ? `c=w[p]=Object.assign(w[p]||{},${configStr});`\n : `c=w[p]=w[p]||{};`,\n `c[f]=(c[f]||[])`,\n forward.length > 0 ? `.concat(${JSON.stringify(forward)})` : ``,\n `})(window,'partytown','forward');`,\n snippetCode,\n ].join('');\n};\n\n/**\n * The `type` attribute for Partytown scripts, which does two things:\n *\n * 1. Prevents the `Contact | Skohub Blog
Contact us via e-mail and feel free to open issues in the different repositories of SkoHub Editor, SkoHub Vocabs, SkoHub PubSub and SkoHub Extension, not only for bugs or enhancements, but also for questions about SkoHub usage, or to share your experiences.
We welcome a new team member to the Open Infrastructure team at the Hochschulbibliothekszentrum NRW who will be working on SkoHub, and invite people to a SkoHub planning workshop.
The SkoHub team at hbz has launched a cooperation with the Hamburg-based company effective WEBWORK to work on some pending issues regarding both functionality and design of SkoHub Vocabs.
We facilitated two workshops in November with the goal of introducing participants to the Simple Knowledge Organization System (SKOS) by learning hands-on how to publish a small vocabulary with SkoHub Vocabs.
Short report about the SkoHub presentation at the 2021 Workshop on Classification and Subject Indexing in Library and Information Science (LIS Workshop) organized by the Working Group within the GfKL – Data Science Society.
An introduction to the publish/subscribe approach to content discovery to be implemented with SkoHub.
+
+
\ No newline at end of file
diff --git a/manifest.webmanifest b/manifest.webmanifest
new file mode 100644
index 0000000..e8409f2
--- /dev/null
+++ b/manifest.webmanifest
@@ -0,0 +1 @@
+{"name":"Skohub Blog","short_name":"Skohub Blog","start_url":"/","background_color":"#ffffff","theme_color":"#26c884","display":"minimal-ui","icons":[{"src":"icons/icon-48x48.png?v=3b68e7175ff10cac1dc50da0827ce2e1","sizes":"48x48","type":"image/png"},{"src":"icons/icon-72x72.png?v=3b68e7175ff10cac1dc50da0827ce2e1","sizes":"72x72","type":"image/png"},{"src":"icons/icon-96x96.png?v=3b68e7175ff10cac1dc50da0827ce2e1","sizes":"96x96","type":"image/png"},{"src":"icons/icon-144x144.png?v=3b68e7175ff10cac1dc50da0827ce2e1","sizes":"144x144","type":"image/png"},{"src":"icons/icon-192x192.png?v=3b68e7175ff10cac1dc50da0827ce2e1","sizes":"192x192","type":"image/png"},{"src":"icons/icon-256x256.png?v=3b68e7175ff10cac1dc50da0827ce2e1","sizes":"256x256","type":"image/png"},{"src":"icons/icon-384x384.png?v=3b68e7175ff10cac1dc50da0827ce2e1","sizes":"384x384","type":"image/png"},{"src":"icons/icon-512x512.png?v=3b68e7175ff10cac1dc50da0827ce2e1","sizes":"512x512","type":"image/png"}]}
\ No newline at end of file
diff --git a/page-data/2019-05-17-skohub/page-data.json b/page-data/2019-05-17-skohub/page-data.json
new file mode 100644
index 0000000..7cdd846
--- /dev/null
+++ b/page-data/2019-05-17-skohub/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-templates-blog-post-js","path":"/2019-05-17-skohub/","result":{"data":{"site":{"siteMetadata":{"title":"Skohub Blog"}},"markdownRemark":{"id":"489f8de8-6e66-5dbf-af18-bf10abc74e20","excerpt":"For a long time, openness movements and initiatives with labels like “Open Access”, “Open Educational Resources” (OER) or “Linked Science” have been working on…","html":"
For a long time, openness movements and initiatives with labels like “Open Access”, “Open Educational Resources” (OER) or “Linked Science” have been working on establishing a culture where scientific or educational resources are by default published with an open license on the web to be read, used, remixed and shared by anybody. With a growing supply of resources on the web, the challenge grows to learn about or find resources relevant for your teaching, studies, or research.
\n
In this post, we describe the SkoHub project being carried out in 2019 by the hbz in cooperation with graphthinking GmbH. The project seeks to implement a prototype for a novel approach in syndicating content on the web by combining current web standards for sending notifications and subscribing to feeds with knowledge organization systems (KOS, sometimes also called “controlled vocabularies”).*
\n
Current practices and problems
\n
What are the present approaches to the problem of finding open content on the web, and what are their limitations?
\n
Searching metadata harvested from silos
\n
Current approaches for publishing and finding open content on the web are often focused on repositories as the place to publish content. Those repositories then provide (ideally standardized) interfaces for crawlers to collect and index the metadata in order to offer search solutions on top. An established approach for Open Access (OA) articles goes like this:
\n
\n
Repositories with interfaces for metadata harvesting (OAI-PMH) are set up for scholars to upload their OA publications
\n
Metadata is crawled from those repositories, normalized and loaded into search indexes
\n
Search interfaces are offered to end users
\n
\n\n
With this approach, subject-specific filtering is either already done when crawling the data to create a subject-specific index, or when searching the index.
\n
Maintenance burden
\n
When offering a search interface with this approach, you have to create and maintain a list of sources to harvest:
\n\n
watch out for new relevant sources to be added to your list,
\n
adjust your crawler to changes regarding the services’ harvesting interface,
\n
homogenize data from different sources to get a consistent search index.
\n\n
Furthermore, end users have to know where to find your service to search for relevant content.
\n
Off the web
\n
Besides being error-prone and requiring resources for keeping up with changes in the repositories, this approach also does not take into account how web standards work. As Van de Sompel and Nelson 2015 (both co-editors of the OAI-PMH specification) phrase it:
\n
\n
“Conceptually, we have come to see [OAI-PMH] as repository-centric instead of resource-centric or web-centric. It has its starting point in the repository, which is considered to be the center of the universe. Interoperability is framed in terms of the repository, rather than in terms of the web and its primitives. This kind of repository, although it resides on the web, hinders seamless access to its content because it does not fully embrace the ways of the web.”
\n
\n
In short, the repository metaphor guiding this practice obscures what constitutes the web: resources that are identified by HTTP URIs (Uniform Resource Identifier).
\n
Subject-specific subscription to web resources
\n
So how could a web- or resource-centric approach to resource discovery by subject look like?
\n
Of the web
\n
To truly be part of the web, URIs are the most important part: Every resource (e.g. an OER) needs a URL that locates and identifies it. In order to make use of knowledge organization systems on the web, representing a controlled vocabulary using SKOS vocabulary is the best way to go forward: each subject in the vocabulary is identified by a URI. With these prerequisites, anybody can link their resources to subjects from a controlled vocabulary. This can be done e.g. by embedding LRMI, “Learning Resource Metadata Initiative” metadata as JSON-LD into the resource or its description page.
\n\n
Web-based subscriptions and notifications
\n
So, HTTP URIs for resources and subjects are important to transparently publish and thereafter identify and link educational resources, controlled vocabularies and subjects on the web. With URIs as the basic requirement in place, we can also utilize further web standards for the discovery of OER. For SkoHub, we make use of Social Web Protocols to build an infrastructure where services can send and subscribe to notifications for subjects. The general setup looks as follows:
\n\n
Every element of a controlled vocabulary gets an inbox, identified by a URL.
\n\n
\n2. Systems can send notifications to the inbox, for example “This is a new resource about this subject”.\n\n3. Systems can subscribe to a subject’s inbox and will be notified as soon as a new notification arrives (push approach).\n
\n
This infrastructure allows applications
\n\n
to send a notification to a subject’s inbox containing information about and a link to new content about this subject
\n
to subscribe to the inbox of a subject from a knowledge organization system in order to receive push updates about new content in real time.
\n\n
Here is an example: a teacher is interested in new resources about environmental subjects. She subscribes to the relevant subject via a controlled vocabulary like ISCED-2013 Fields of Education and Training. She then receives updates whenever a colleague publishes a resource that is linked to that subject.
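To make this concrete, here is a minimal sketch of what sending such a notification could look like from JavaScript. The inbox URL and the ActivityStreams-style payload are purely illustrative; the exact message format used by SkoHub PubSub is not specified here.

// Illustrative only: POST a notification about a new resource to a subject's inbox.
const notifySubjectInbox = async () => {
  const inbox = "https://example.org/esc/n0059/inbox" // hypothetical inbox URL of a subject
  const response = await fetch(inbox, {
    method: "POST",
    headers: { "Content-Type": "application/ld+json" },
    body: JSON.stringify({
      "@context": "https://www.w3.org/ns/activitystreams",
      type: "Announce",
      actor: "https://example.org/profile/a-teacher",
      object: {
        id: "https://example.org/oer/intro-to-environmental-protection",
        type: "Document",
        name: "Introduction to Environmental Protection",
      },
    }),
  })
  if (!response.ok) {
    throw new Error(`Inbox rejected the notification: ${response.status}`)
  }
}

notifySubjectInbox()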
\n\n
To be really useful, applications for subscribing to content should enable additional filters, e.g. to subscribe to combinations of subjects (“Environment” & “Building and civil engineering”) or to filter further by educational level, license type etc.
\n
Advantages
\n
This subject-oriented notification/subscription approach to content syndication on the web has many advantages.
\n
Push instead of pull\n \nWith the push approach, you subscribe once and content comes in from different and new sources without the subscriber having to maintain a list of sources. Of course, quality control might become an issue. Thus, instead of whitelisting by administering a subscription list, one would practice blacklisting by filtering out sources that distribute spam or provide low-quality content.
\n
Supporting web-wide publications
\n
Being of the web, SkoHub supports publications residing anywhere on the web. While the repository-centric approach favours content in a repository that provides interfaces for harvesting, with SkoHub any web resource can make use of the notification mechanism. Thus, content producers can choose which tool or platform best fits their publishing needs, be it YouTube, a repository, hackmd.io or something else. The only requirement is for publications to have a stable URL and, voilà, they can syndicate their content via KOS.
\n
Knowledge organization systems are used to their full potential\n \nThis additional layer to the use of Knowledge Organization Systems makes them much more powerful (“KOS on steroids”) and attractive for potential users.
\n
Encouraging creation and use of shared Knowledge Organization Systems across applications\n \nIn the German OER context it is a recurring theme that people wish everybody would use the same controlled vocabularies so that data exchange and aggregation would require less mapping. With a SkoHub infrastructure in place, there are strong additional incentives to move in this direction.
\n
Incentive for content producers to add machine-readable descriptions\n \nWhen subject indexing becomes tantamount to notifying interested parties about one’s resources, content producers gain a huge incentive to describe their resources with structured data that includes subject indexing.
\n
SkoHub project scope
\n
The SkoHub project has four deliverables. While working on the backend infrastructure for receiving and pushing notifications (skohub-pubsub), we also want to provide people with means to publish a controlled vocabulary along with inboxes (skohub-ssg), to link to subjects and send notifications (skohub-editor) and to subscribe to notifications in the browser (skohub-deck).
\n
skohub-pubsub: Inboxes and subscriptions\n \nCode: https://github.com/hbz/skohub-pubsub\n \nThis part provides the SkoHub core infrastructure, setting up basic inboxes for subjects plus the ability to subscribe and receive a push for each new notification.
\n
skohub-ssg: Static site generator for Simple Knowledge Organization Systems\n \nCode: https://github.com/hbz/skohub-ssg\n \nThis part of the project covers the need to easily publish a controlled vocabulary as a SKOS file, with a basic lookup API and a nice HTML view including links to an inbox for each subject.
\n
skohub-editor: Describing & linking learning resources, sending notifications\n \nCode: https://github.com/hbz/skohub-editor\n \nThe editor will run in the browser and enable structured description of educational resources published anywhere on the web. It includes validation of the entered content for each field and lookup of controlled values via the API provided by skohub-ssg.
\n
skohub-deck: Browser-based subscription to subjects\n \nCode: https://github.com/hbz/skohub-deck\n \nThe SkoHub deck is a proof of concept to show that the technologies developed actually work. It enables people to subscribe to notifications for specific subjects in the browser. The incoming notifications will be shown in a Tweetdeck-like interface.
\n
Outlook
\n
The project will be completed by the end of 2019. We intend to provide updates about the process along the way. Next up, we will explain the technical architecture in more detail, expanding on our use of social web protocols, and report on the development status of the project.
\n\n
* Note that while SkoHub has clear similarities with the “Information-Sharing Pipeline” envisioned in Ilik and Koster 2019 regarding the use of social web protocols on authority data, there is also a fundamental difference: While Ilik and Koster are talking about sharing updates of authority entries themselves (e.g. receiving updates for a person profile to be considered for inclusion in one’s own authority file), SkoHub is about sharing new links to an entry in an authority file or other controlled vocabulary.
\n
References
\n
Van de Sompel, Herbert / Nelson, Michael L. (2015): Reminiscing About 15 Years of Interoperability Efforts. D-Lib Magazine 21, no. 11/12. DOI: 10.1045/november2015-vandesompel
","frontmatter":{"title":"SkoHub: Enabling KOS-based content subscription","date":"May 17, 2019","authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Ostrowski","firstname":"Felix"}],"description":"An introduction to the publish/subscribe approach to content discovery to be implemented with SkoHub."}},"previous":null,"next":{"fields":{"slug":"/2019-09-27-skohub-vocabs/"},"frontmatter":{"title":"Presenting the SkoHub Vocabs Prototype"}}},"pageContext":{"id":"489f8de8-6e66-5dbf-af18-bf10abc74e20","previousPostId":null,"nextPostId":"06d8681b-f2a7-504d-abe6-bda80b826cc8"}},"staticQueryHashes":["1878297489","2734362168","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/page-data/2019-09-27-skohub-vocabs/page-data.json b/page-data/2019-09-27-skohub-vocabs/page-data.json
new file mode 100644
index 0000000..90b96c5
--- /dev/null
+++ b/page-data/2019-09-27-skohub-vocabs/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-templates-blog-post-js","path":"/2019-09-27-skohub-vocabs/","result":{"data":{"site":{"siteMetadata":{"title":"Skohub Blog"}},"markdownRemark":{"id":"06d8681b-f2a7-504d-abe6-bda80b826cc8","excerpt":"We are happy to announce that the SkoHub prototype outlined in our post “SkoHub: Enabling KOS-based content subscription” is now finished. In a series of three…","html":"
We are happy to announce that the SkoHub prototype outlined in our post “SkoHub: Enabling KOS-based content subscription” is now finished. In a series of three posts we will report on the outcome by walking through the different components and presenting their features.
\n
SkoHub is all about utilizing the power of Knowledge Organization Systems (KOS) to create a publication/subscription infrastructure for Open Educational Resources (OER). Consequently, publishing these KOS on the web according to the standards was the first area of focus for us. We are well aware that there are already plenty of Open Source tools to publish and edit vocabularies based on SKOS, but these are usually monolithic database applications. Our own workflows often involve managing smaller vocabularies as flat files on GitHub, and others seem to also do so.
\n
We will thus start this series with SkoHub Vocabs (formerly called “skohub-ssg”), a static site generator that provides integration for a GitHub-based workflow to publish an HTML version of SKOS vocabularies. Check out the JAMStack Best Practices for some thoughts about the advantages of this approach. SkoHub Vocabs – like SkoHub Editor that will be presented in a separate post – is a stand-alone module that can already be helpful on its own, when used without any of the other SkoHub modules.
\n
How to publish a SKOS scheme from GitHub with SkoHub Vocabs
\n
Let’s take a look at the editing and publishing workflow step by step. We will use SkoHub Vocabs to publish a subject classification for Open Educational Resources: the “Educational Subject Classification” (ESC), which was created for the OER World Map based on the ISCED Fields of Education and Training 2013.
\n
Step 1: Publish vocab as turtle file(s) on GitHub
\n
Currently, a SKOS vocab has to be published in a GitHub repository as one or more Turtle file(s) in order to be processed by SkoHub Vocabs. ESC is already available on GitHub in one Turtle file, so there is nothing to do in this regard. Note that you can also use the static site generator locally, i.e. without GitHub integration; see below for more about this.
\n
Step 2: Configure webhook
\n
In order to publish a vocabulary from GitHub with SkoHub Vocabs, you have to set up a webhook in GitHub. It goes like this:
\n\n
In the GitHub repo where the vocab resides, go to “Settings” → “Webhooks” and click “Add webhook”
\n\n
\n2. Enter https://test.skohub.io/build as payload URL, choose application/json as content type and enter the secret. (Please contact us for the secret if you want to try it out.)\n
\n
Step 3: Execute build & error handling
\n
For the vocabulary to be built and published on SkoHub, there has to be a new commit in the master branch. So we adjust something in the vocab and push the change to the master branch. Looking again at the webhook page in the repo settings, you can see a notice that the build was triggered:
\n\n
However, looking at the build log, an error is shown and the site did not build:
\n\n
Oops, we forgot to check the vocab for syntax errors before triggering the build and there actually is a syntax error in the turtle file. Fixing the syntax in a new commit will automatically trigger a new build:
As we want the canonical version of ESC to be the one published with SkoHub Vocabs, we need to redirect the namespace URI we defined in the Turtle file to SkoHub. Since we used w3id.org for this, we have to open a pull request in the respective repo.
\n
\n
If everything looks good, w3id.org PRs are merged very quickly; in this case it happened an hour later.
\n
Result: HTML & JSON-LD representation published with SkoHub & basic GitHub editing workflow
\n
As a result, we have published a controlled vocabulary in SKOS under a permanent URI and with a human-readable HTML representation from GitHub with a minimum amount of work. Additionally, the initial Turtle representation is transformed to more developer-friendly JSON-LD. The HTML has a hierarchy view that can be expanded and collapsed at will:
\n
\n
There also is a search field to easily filter the vocabulary:
\n\n
This filter is based on a FlexSearch index that is also built along with the rest of the content. This allows us to implement lookup functionalities without the need for a server-side API. More about this below and in the upcoming post on the SkoHub Editor.
\n
Implementation
\n
To follow along with the more technical aspects, you might want to have SkoHub Vocabs checked out locally:
The static site generator itself is implemented with Gatsby. One reason for this choice was our good previous experience with React. Another nice feature of Gatsby is that all content is sourced into an in-memory database that is available using GraphQL. While there is certainly a learning curve, this makes the experience of creating a static site not that much different from traditional database-based approaches. You can locally build a vocab as follows:
\n
$ cp test/data/systematik.ttl data/\n$ npm run build
\n
This will result in a build in the public/ directory. Currently, the build is optimized to be served by Apache with Multiviews in order to provide content negotiation. Please note that currently only vocabularies that implement the slash namespace pattern are supported. We will add support for hash URIs in the future.
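As a small illustration of what content negotiation buys us, the same concept URI can be requested as JSON-LD by an application or as HTML by a browser; the URI below is just a placeholder.

// Illustrative only: request the JSON-LD representation of a concept.
const conceptUri = "https://example.org/esc/n0059" // placeholder concept URI
fetch(conceptUri, { headers: { Accept: "application/json" } })
  .then(response => response.json())
  .then(concept => console.log(concept.prefLabel))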
\n
In order to trigger the static site generator from GitHub, a small webhook server based on Koa was implemented. (Why not Express? – It wouldn’t have made a difference.) The webhook server listens for and validates POST requests coming from GitHub, retrieves the data from the corresponding repository and then spins up Gatsby to create the static content.
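The following is a hedged sketch of that idea, not the actual SkoHub implementation: a Koa app that checks GitHub's X-Hub-Signature header against the shared secret and, if it matches, acknowledges the push event before kicking off a build.

// Sketch of a GitHub webhook receiver in Koa (details differ from the real service).
const crypto = require("crypto")
const Koa = require("koa")
const bodyParser = require("koa-bodyparser")

const SECRET = process.env.WEBHOOK_SECRET // shared secret configured in the GitHub webhook
const app = new Koa()
app.use(bodyParser())

app.use(async ctx => {
  const signature = ctx.get("X-Hub-Signature")
  const expected =
    "sha1=" +
    crypto.createHmac("sha1", SECRET).update(ctx.request.rawBody).digest("hex")
  if (signature !== expected) {
    ctx.status = 401 // reject requests that are not signed by GitHub
    return
  }
  // The real service would now fetch the repository contents and run the
  // Gatsby build; here we only acknowledge that a build was triggered.
  ctx.status = 202
  ctx.body = { status: "build triggered" }
})

app.listen(3000)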
\n
A final word on the FlexSearch index mentioned above. An important use case for vocabularies is to access them from external applications. Using the FlexSearch library and the index pre-built by SkoHub Vocabs, a lookup of vocabulary terms is easy to implement:
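For example, an external application could load the exported index and query it roughly like this; the index URL is illustrative, as the actual path depends on where the vocabulary is published.

// Illustrative lookup against a pre-built FlexSearch index.
import FlexSearch from "flexsearch"

async function lookup(term) {
  const response = await fetch("https://example.org/esc/index.json") // assumed index location
  const exported = await response.text()
  const index = FlexSearch.create()
  index.import(exported) // load the index built by SkoHub Vocabs
  return index.search(term) // returns the URIs of matching concepts
}

lookup("environment").then(uris => console.log(uris))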
Note that currently the index will only return URIs associated with the search term, not the corresponding labels. This will change in a future update.
","frontmatter":{"title":"Presenting the SkoHub Vocabs Prototype","date":"September 27, 2019","authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Ostrowski","firstname":"Felix"}],"description":"An introducatory post to SkoHub Vocabs."}},"previous":{"fields":{"slug":"/2019-05-17-skohub/"},"frontmatter":{"title":"SkoHub: Enabling KOS-based content subscription"}},"next":{"fields":{"slug":"/2020-01-29-skohub-talk-at-swib19/"},"frontmatter":{"title":"SkoHub talk at SWIB19: KOS-based content syndication with ActivityPub"}}},"pageContext":{"id":"06d8681b-f2a7-504d-abe6-bda80b826cc8","previousPostId":"489f8de8-6e66-5dbf-af18-bf10abc74e20","nextPostId":"c06e00ef-a3aa-51cf-b522-3c4e20079efd"}},"staticQueryHashes":["1878297489","2734362168","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/page-data/2020-01-29-skohub-talk-at-swib19/page-data.json b/page-data/2020-01-29-skohub-talk-at-swib19/page-data.json
new file mode 100644
index 0000000..491d44f
--- /dev/null
+++ b/page-data/2020-01-29-skohub-talk-at-swib19/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-templates-blog-post-js","path":"/2020-01-29-skohub-talk-at-swib19/","result":{"data":{"site":{"siteMetadata":{"title":"Skohub Blog"}},"markdownRemark":{"id":"c06e00ef-a3aa-51cf-b522-3c4e20079efd","excerpt":"On November 27th 2019, Adrian Pohl and Felix Ostrowski (graphthinking) presented SkoHub at the “Semantic Web in Libraries” conference in Hamburg (SWIB19).","html":"
On November 27th 2019, Adrian Pohl and Felix Ostrowski (graphthinking) presented SkoHub at the “Semantic Web in Libraries” conference in Hamburg (SWIB19).
\n
","frontmatter":{"title":"SkoHub talk at SWIB19: KOS-based content syndication with ActivityPub","date":"January 29, 2020","authors":[{"lastname":"Pohl","firstname":"Adrian"}],"description":null}},"previous":{"fields":{"slug":"/2019-09-27-skohub-vocabs/"},"frontmatter":{"title":"Presenting the SkoHub Vocabs Prototype"}},"next":{"fields":{"slug":"/2020-03-31-skohub-editor/"},"frontmatter":{"title":"Presenting the SkoHub Editor"}}},"pageContext":{"id":"c06e00ef-a3aa-51cf-b522-3c4e20079efd","previousPostId":"06d8681b-f2a7-504d-abe6-bda80b826cc8","nextPostId":"3f590c02-dbe3-5472-b3e9-0a71e9ac4b31"}},"staticQueryHashes":["1878297489","2734362168","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/page-data/2020-03-31-skohub-editor/page-data.json b/page-data/2020-03-31-skohub-editor/page-data.json
new file mode 100644
index 0000000..da17f4c
--- /dev/null
+++ b/page-data/2020-03-31-skohub-editor/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-templates-blog-post-js","path":"/2020-03-31-skohub-editor/","result":{"data":{"site":{"siteMetadata":{"title":"Skohub Blog"}},"markdownRemark":{"id":"3f590c02-dbe3-5472-b3e9-0a71e9ac4b31","excerpt":"ⓘ Update, 2022-03-01: Due to lacking resources for maintenance, we decided to shut down the SkoHub Editor demo for an indefinite time. However, the code is…","html":"\n
ⓘ Update, 2022-03-01: Due to a lack of resources for maintenance, we decided to shut down the SkoHub Editor demo indefinitely. However, the code is still there for anybody to set up their own instance.
\n\n
In a previous blog post we presented a first SkoHub module: SkoHub Vocabs. Before talking about another module, here is a short summary of the features SkoHub Vocabs offers. Basically, it provides an editorial workflow to publish a SKOS vocabulary on the web which can then be consumed by humans and applications. It builds on git-based online software development platforms (currently GitHub and GitLab are supported) where you maintain a SKOS vocabulary as a Turtle file. This allows you to use all the associated features such as branches and pull requests for a full-fledged review process. With every new commit in a branch, triggered by a webhook, SkoHub Vocabs will build a static site for the vocab – with HTML for human consumption and JSON-LD for consumption by applications.
\n
In this post, we present SkoHub Editor (demo, code), which is accompanied by a browser extension. In a nutshell, SkoHub Editor enables the automatic generation of a web form based on a JSON schema, along with the possibility to look up terms in a controlled vocabulary that is published with SkoHub Vocabs. Additionally, metadata generated by the editor can be published using SkoHub PubSub, which we will describe in an upcoming post. Let’s take a look at the specifics by configuring an editor that lets you create JSON-LD describing an open educational resource (OER) on the web.
\n
Describing a resource with the browser extension
\n
Let’s start with actually using SkoHub Editor. You will have the most comfortable experience when using the SkoHub browser extension, which wraps the SkoHub Editor and pre-populates some fields in the web form. The browser extension is available both for Firefox and Chrome. Just add the extension to your browser and a little icon will be shown on the right-hand side of your navigation bar:
\n\n
While having any web page open, you can now open the SkoHub editor in your browser to describe that web resource. Let’s use as an example the YouTube video “COVID-19 – 6 Dangerous Coronavirus Myths, Busted by World Health Organization” published recently by the World Economic Forum under a CC-BY license. Open the video in your browser, click on the extension and you will see that several fields are automatically filled out.
\n\n
We can now add additional metadata by selecting a type (VideoObject in this case) and adding a creator, creation date, language etc. As we mentioned, for some fields in the web form you can look up values from a controlled vocabulary. You will experience this when entering content into the fields “Subject”, “License”, “Learning Resource Type”, and “Intended Audience”. For those fields you will get a drop-down with suggestions from a controlled vocabulary, e.g. for “Subject” from a German classification of subjects in higher education that is published with SkoHub Vocabs.
\n\n
Currently, only the fields “URL”, “Type” and “Title” are obligatory, all other fields are optional. When you think you have described the resource sufficiently, you can click on “Show Preview” in the extension, copy & paste the JSON-LD to the clipboard and include it in the HTML of any web page within a <script type=\"application/ld+json\"> tag.
As said above, the SkoHub Extension wraps the SkoHub Editor running at https://skohub.io/editor/. SkoHub Editor is configured with a JSON schema document that is used both to generate appropriate form inputs and to validate the entered content. Thus, the JSON schema is the central, most crucial part when working with SkoHub Editor. Currently, we use a draft schema for OER as the default, which we created using relevant properties and types from schema.org. By providing the link to a JSON schema, we can load the web form you already know from the browser extension in the standalone editor as well. Of course, you can also write your own schema to build a web form for your use case.
\n
Let’s take a short look at the underlying schema, which we tried to keep as straightforward as possible. Generally, with JSON schema you can specify a number of optional or mandatory properties and what type of input each expects. The \"title\" of each property will be used as the label for the field in the web form.
Such lists of allowed values can be considered controlled vocabularies, and ideally they should be shared across many data sources. This is where SkoHub Vocabs comes into play. Instead of embedding the list of allowed values into our schema, we can reference a SKOS vocabulary on the web:
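For illustration, a subject field wired to such a vocabulary might look roughly like the following sketch. The property name, labels and the concept scheme URI are placeholders here and not necessarily the exact keys of the production schema; only the _widget mechanism is described below:
\n
{\n \"about\": {\n  \"title\": \"Subject\",\n  \"type\": \"array\",\n  \"items\": {\n   \"type\": \"object\",\n   \"_widget\": \"SkohubLookup\",\n   \"properties\": {\n    \"id\": { \"type\": \"string\", \"format\": \"uri\" },\n    \"prefLabel\": { \"type\": \"object\" },\n    \"inScheme\": {\n     \"type\": \"object\",\n     \"properties\": {\n      \"id\": { \"type\": \"string\", \"enum\": [\"https://example.org/subjects/scheme\"] }\n     }\n    }\n   }\n  }\n }\n}
\n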
Notice the custom key _widget in the JSON schema. This will configure the editor to use the specified UI element for the given field. In our example, the SkohubLookup widget is used, which works with all controlled vocabularies that are published with SkoHub Vocabs. All custom JSON schema extensions start with an underscore _ and are used to control the look and feel of the editor; see below for an example for how to hide a field on the form.
\n
Finally, to make our data JSON-LD, we also set a mandatory @context property and a default object value for the @context. This makes the editor add it to the document without any user interaction needed.
\n
{\n \"$schema\": \"http://json-schema.org/draft-07/schema#\",\n \"title\": \"OER\",\n \"description\": \"This is a generic JSON schema for describing an Open Educational Resource with schema.org\",\n \"type\": \"object\",\n \"default\": {\n \"@context\": {\n \"id\": \"@id\",\n \"type\": \"@type\",\n \"@vocab\": \"http://schema.org/\",\n \"skos\": \"http://www.w3.org/2004/02/skos/core#\",\n \"prefLabel\": \"skos:prefLabel\",\n \"inScheme\": \"skos:inScheme\",\n \"Concept\": \"skos:Concept\"\n }\n },\n \"properties\": {\n \"@context\": {\n \"type\": \"object\",\n \"additionalProperties\": true,\n \"_display\": {\n \"className\": \"hidden\"\n }\n }\n}
\n
Implementation
\n
Of course you can also poke around the editor while running it locally:
\n
$ git clone https://github.com/hbz/skohub-editor.git\n$ cd skohub-editor\n$ npm install
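Check the repository’s README for the exact commands; assuming the usual npm conventions apply (the script name is an assumption and may differ), starting a local development server would then look something like this:
\n
$ npm start
\n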
\n
As is the case with SkoHub Vocabs, the editor is implemented in React. The form components are located in src/components/JSONSchemaForm. In a nutshell, a Form provides data to the various input components.
Obviously it would be tedious to manually code all the inputs for a given schema. This is where the Builder comes into play. It reads a schema and creates all necessary input components:
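As a rough, simplified sketch of the idea (with made-up stand-in components, not the actual SkoHub Editor code), such a builder can be thought of as a function that walks the schema’s properties and picks an input component for each of them:
\n
import React from \"react\"\n\n// Illustrative stand-ins for the real input components\nconst TextInput = ({ label }) => <label>{label} <input type=\"text\" /></label>\nconst SkohubLookup = ({ label }) => <label>{label} <input type=\"search\" placeholder=\"look up a concept\" /></label>\n\nconst widgets = { SkohubLookup, string: TextInput }\n\n// Walk the schema's properties and render one input per property,\n// using the configured _widget if present, otherwise the JSON type\nconst Builder = ({ schema }) => (\n  <>\n    {Object.entries(schema.properties || {}).map(([name, property]) => {\n      const Widget = widgets[property._widget] || widgets[property.type] || TextInput\n      return <Widget key={name} label={property.title || name} />\n    })}\n  </>\n)\n\nexport default Builder
\n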
The browser extension is essentially a simple wrapper for the editor running at https://skohub.io/editor/, which the extension injects as an iframe into the current page. Additionally, before the iframe is injected, some metadata is scraped from that page. This data is used to pre-populate the editor. This process obviously depends both on the data found in the web page and on the schema the editor is configured to use. YouTube for example uses meta name=\"description\" for data about YouTube itself rather than the actual video, which is described in meta property=\"og:description\". Even if the correct metadata is extracted, there is no guarantee that the schema used to configure the editor even has a description field. In the future, it would be nice to find a way to map page metadata to properties in the schema itself.
\n
Outlook
\n
SkoHub Editor already works very well and can be extremely useful. However, some things are still work in progress and will need some future effort to be improved:
\n
\n
Using schema.org markup for pre-population: This might sound obvious but we have not implemented it yet, see #17.
Furthermore, some work will have to be put into the current default schema and the controlled vocabularies it uses:
\n
\n
Develop JSON Schema: The JSON Schema definitely is not finished yet. For example, it makes sense to include http://schema.org/keywords in the future for adding arbitrary tags to describe a resource. We plan to develop the schema within the common OER metadata group of DINI AG KIM & Jointly with a focus on describing OER in the German-speaking world.
\n
Improve Vocabularies: For “Learning Resource Type” and “Intended Audience” we are using controlled vocabularies that are not nearly finished but are being developed at the LRMI Task Group of the Dublin Core Metadata Initiative (DCMI). Trying out the browser extension, you will for instance see that the educational resource types are missing some options. However, we assume that the combination of SkoHub Editor & SkoHub Vocabs makes a pretty nice environment for the development of these vocabularies in an open and transparent process on GitHub or GitLab.
\n
\n
Get involved
\n
Please try it out and let us know what doesn’t work, which features you are missing, and also what you like about SkoHub. We are happy about every bug report, suggestion and feature request for the production version. Get in contact with us via a hypothes.is annotation, GitHub, Email, Mastodon or IRC.
","frontmatter":{"title":"Presenting the SkoHub Editor","date":"March 31, 2020","authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Ostrowski","firstname":"Felix"}],"description":"An introducatory post to SkoHub Editor."}},"previous":{"fields":{"slug":"/2020-01-29-skohub-talk-at-swib19/"},"frontmatter":{"title":"SkoHub talk at SWIB19: KOS-based content syndication with ActivityPub"}},"next":{"fields":{"slug":"/2020-06-25-skohub-pubsub/"},"frontmatter":{"title":"Presenting SkoHub PubSub"}}},"pageContext":{"id":"3f590c02-dbe3-5472-b3e9-0a71e9ac4b31","previousPostId":"c06e00ef-a3aa-51cf-b522-3c4e20079efd","nextPostId":"8222f844-ed14-527d-aaad-85f396f7f114"}},"staticQueryHashes":["1878297489","2734362168","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/page-data/2020-06-25-skohub-pubsub/page-data.json b/page-data/2020-06-25-skohub-pubsub/page-data.json
new file mode 100644
index 0000000..4256670
--- /dev/null
+++ b/page-data/2020-06-25-skohub-pubsub/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-templates-blog-post-js","path":"/2020-06-25-skohub-pubsub/","result":{"data":{"site":{"siteMetadata":{"title":"Skohub Blog"}},"markdownRemark":{"id":"8222f844-ed14-527d-aaad-85f396f7f114","excerpt":"ⓘ Update, 2022-03-01: Due to lacking resources for maintenance, we decided to shut down the SkoHub PubSub demo server at skohub.io for an indefinite time…","html":"\n
ⓘ Update, 2022-03-01: Due to lacking resources for maintenance, we decided to shut down the SkoHub PubSub demo server at skohub.io for an indefinite time. However, the code is still there for anybody to set up their own instance.
\n\n
In the previous blog posts we have presented SkoHub Vocabs and SkoHub Editor. In the final post of this SkoHub introduction series we will take a deeper look at SkoHub PubSub, the part of SkoHub that brings the novel approach of KOS-based content subscription into the game.
\n
Let’s refresh what SkoHub is about by quoting the gist from the project homepage:
\n
\n
SkoHub supports a novel approach for finding content on the web. The general idea is to extend the scope of Knowledge Organization Systems (KOS) to also act as communication hubs for publishers and information seekers. In effect, SkoHub allows to follow specific subjects in order to be notified when new content about that subject is published.
\n
\n
Before diving into the technical implementation and protocols used, we provide an example of how this subscription, publication and notification process can be carried out in practice. Although SkoHub PubSub constitutes the core of the SkoHub infrastructure, being the module that brings all SkoHub components together, it is not visible to end users by itself but only through applications which send out notifications or subscribe to a specific topic. (This is the great thing about open standards: they invite everybody to develop new clients for specific use cases!)
\n
So, let’s take a look at an example workflow involving SkoHub Editor and the federated microblogging service Mastodon to demonstrate the functionalities.
On the left-hand side, you can see the location of the topic in the classification hierarchy. On the right-hand side, there is some basic information on the subject: It has a URI (https://w3id.org/class/esc/n0322), a notation (0322), a preferred label (Library, information and archival studies) and an inbox. This is what the underlying JSON data (retrievable e.g. by adding the format suffix .json to the URI) looks like:
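Abbreviated and slightly simplified, that JSON looks roughly like this (the inbox and followers URLs are placeholders; the actual values depend on the SkoHub PubSub instance serving the vocabulary):
\n
{\n \"id\": \"https://w3id.org/class/esc/n0322\",\n \"type\": \"Concept\",\n \"notation\": [\"0322\"],\n \"prefLabel\": {\n  \"en\": \"Library, information and archival studies\"\n },\n \"inbox\": \"https://skohub.io/class/esc/n0322/inbox\",\n \"followers\": \"https://skohub.io/class/esc/n0322/followers\"\n}
\n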
Besides the usual SKOS properties, the followers key gives a hint that I can somehow follow this subject. Clicking on the associated URL, I will see a JSON file containing the list of followers of this subject. I am also interested in this topic and want to follow it to receive notifications about new online resources that are published and tagged with this subject. How do I achieve this?
\n
As already noted, what I need is an application that speaks ActivityPub. In this case we will use one of the most popular services in the Fediverse: Mastodon. So, I open up my Mastodon client and put the topic URI into the search box:
\n\n
I click on the follow button and am now following this subject with my Mastodon account and will receive any updates posted by it.
\n
Describing and announcing a resource with SkoHub Editor
\n
Let’s now switch into the role of a scholar, teacher, tutor or generally interested person who has created an instructive online resource and wants to publish it to all people interested in the topic of “Library, information and archival studies”. In this case, I published a blog post about a talk at SWIB19 – Semantic Web in Libraries Conference and want to share it with others. I somehow need to send a message to the topic’s inbox; in this case I am using the SkoHub Editor (but it could be any other ActivityPub client or even the command line interface from which I publish). For the best user experience I download the SkoHub browser extension (Firefox, Chrome).
\n
As the default JSON schema uses another classification, we first have to configure the editor based on a schema that actually makes use of the Educational Subjects Classification. For this, we created a version of the default schema that does so. Now I put it into the extension’s settings:
\n\n
Then, I fire up the extension when visiting the web page I like to share and add data to the input form:
\n\n
I select the topic “Library, information and archival studies” from the suggestions in the “subject” field, add information on licensing etc. and click “Publish”. A pop up lets me know that the resource is published to “Library, information and archival studies”. In the background, the description of the resource is sent to the respective topic (it could be more than one) which distributes the information to all its subscribers. Thus, in the end I as a subscriber of the topic will receive a notification of the resource in my Mastodon timeline:
\n\n
Protocols and implementation
\n
The SkoHub-PubSub server is built in Node.js and implements a subset of ActivityPub, Linked Data Notifications and Webfinger to achieve the behavior described above. On the ActivityPub side, server-to-server Follow and corresponding Undo interactions can be received to handle the subscription mechanism. Non-activity messages are considered Linked Data Notifications and can simply be sent to the inbox of a subject using a POST request with any JSON body. These notifications are considered metadata payload, wrapped in a Create activity and distributed to every follower of the corresponding subject, again using ActivityPub.
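As a minimal sketch, publishing a notification then boils down to a single HTTP POST against a concept’s inbox; the inbox URL and the JSON payload below are illustrative placeholders, not a prescribed format:
\n
$ curl -X POST -H \"Content-Type: application/ld+json\" -d '{\"type\": \"Document\", \"id\": \"https://example.org/some-oer\", \"name\": \"An instructive resource\", \"about\": {\"id\": \"https://w3id.org/class/esc/n0322\"}}' https://skohub.io/class/esc/n0322/inbox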
\n
As for the internals, MongoDB is used to manage followers lists and an Elasticsearch index is used to keep an archive of all payloads that have been distributed. This archive can be used to search and further explore metadata that has been distributed, e.g. by visualizing the distribution of subjects across all payloads.
\n
The most challenging aspects of the implementation were to gain an understanding of Webfinger for user discovery and of the details of message signatures and how to validate them. “How to implement a basic ActivityPub server” was a good guide here!
\n
Outlook
\n
We currently consider PubSub the least mature component of SkoHub. In the future, we would like to validate incoming Linked Data Notifications against a JSON schema that should be specific enough to ensure a consistent experience when viewing them e.g. in Mastodon but flexible enough to support additional use cases. We would also like to support ActivityPub on the publication side and Announce activities in order to enable use cases such as mentioning a SkoHub concept on Mastodon. We would really value your input on this!
","frontmatter":{"title":"Presenting SkoHub PubSub","date":"June 25, 2020","authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Ostrowski","firstname":"Felix"}],"description":null}},"previous":{"fields":{"slug":"/2020-03-31-skohub-editor/"},"frontmatter":{"title":"Presenting the SkoHub Editor"}},"next":{"fields":{"slug":"/2020-10-09-skohub-apconf/"},"frontmatter":{"title":"ActivityPub Conference 2020"}}},"pageContext":{"id":"8222f844-ed14-527d-aaad-85f396f7f114","previousPostId":"3f590c02-dbe3-5472-b3e9-0a71e9ac4b31","nextPostId":"0b55c8af-edf8-5752-a9bd-9a5c99bc636c"}},"staticQueryHashes":["1878297489","2734362168","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/page-data/2020-10-09-skohub-apconf/page-data.json b/page-data/2020-10-09-skohub-apconf/page-data.json
new file mode 100644
index 0000000..0bf9203
--- /dev/null
+++ b/page-data/2020-10-09-skohub-apconf/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-templates-blog-post-js","path":"/2020-10-09-skohub-apconf/","result":{"data":{"site":{"siteMetadata":{"title":"Skohub Blog"}},"markdownRemark":{"id":"0b55c8af-edf8-5752-a9bd-9a5c99bc636c","excerpt":"From 2-5 October the ActivityPub conference happened online where people using the ActivityPub protocol came together to discuss topics all around federated…","html":"
From 2-5 October, the ActivityPub conference took place online, where people using the ActivityPub protocol came together to discuss topics all around federated networks and the respective web standards. For presentations, a flipped classroom approach was chosen where talks would be uploaded before the conference and the live part would be Q&A sessions for each talk. On Sunday, there was an additional round of lightning talks and some birds of a feather (BoF) sessions where – quite similar to a barcamp session – people interested in a topic could propose a session and meet with like-minded people.
On Friday, we had a session about SkoHub. Here is our previously recorded video:
\n
\n
With regard to creating and maintaining controlled vocabularies and assigning topics, (at least some) people who develop applications for the Fediverse have quite some interest in building on experiences and approaches from the library world. What we learned from our Q&A session is to prepare better next time, as there will most certainly be some people who haven’t fully watched the recording or for whom some time has passed since watching it: next time we would have some slides ready to recap the basic concepts and some links pointing to exemplary implementations and further information.
\n
Here are some projects that are dealing with categories, taxonomies in the Fediverse:
\n
\n
CommonsPub – which builds on experiences from MoodleNet – is working on “[f]ederated taxonomies for topic-based search and discovery across instances.”
LearnAwesome.org helps people in managing and finding learning material online in very different formats. It supports following certain topics.
\n
The “Rebooting Indymedia” project wants to use topic-based channels to create moderation workflows for building independent news sites from decentrally published content.
\n
A related effort by Trolli Schmittlauch addresses the problem of how to create a comprehensive hashtag search & subscription in a federated social network. See this paper for details.
\n
\n
In a birds of a feather session about “Topics” and services we can subscribe to, some of the people working on tags, controlled vocabularies etc. came together and had a fruitful exchange, sorting out the different problems and which approaches exist to address them. We are looking forward to working further on SkoHub and discussing common approaches to assigning or following controlled topics in the Fediverse.
","frontmatter":{"title":"ActivityPub Conference 2020","date":"October 09, 2020","authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Ostrowski","firstname":"Felix"}],"description":null}},"previous":{"fields":{"slug":"/2020-06-25-skohub-pubsub/"},"frontmatter":{"title":"Presenting SkoHub PubSub"}},"next":{"fields":{"slug":"/2020-11-25-swib20-workshop/"},"frontmatter":{"title":"SkoHub workshop at SWIB20"}}},"pageContext":{"id":"0b55c8af-edf8-5752-a9bd-9a5c99bc636c","previousPostId":"8222f844-ed14-527d-aaad-85f396f7f114","nextPostId":"deea6c17-69ed-5006-958f-40a429cfd5b3"}},"staticQueryHashes":["1878297489","2734362168","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/page-data/2020-11-25-swib20-workshop/page-data.json b/page-data/2020-11-25-swib20-workshop/page-data.json
new file mode 100644
index 0000000..6cc1e14
--- /dev/null
+++ b/page-data/2020-11-25-swib20-workshop/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-templates-blog-post-js","path":"/2020-11-25-swib20-workshop/","result":{"data":{"site":{"siteMetadata":{"title":"Skohub Blog"}},"markdownRemark":{"id":"deea6c17-69ed-5006-958f-40a429cfd5b3","excerpt":"From 23-27 November the 12th Semantic Web in Libraries Conference took place online. The programme with links to recordings and slides forum can be viewed at…","html":"
Wednesday was workshop day, where Adrian and Steffen offered a SkoHub workshop. They tried out a flipped classroom approach, created several video tutorials and wrote down walkthroughs for the participants to prepare themselves. This did not work out as intended. Lesson learned: always be prepared for a larger number of participants not having prepared.
","frontmatter":{"title":"SkoHub workshop at SWIB20","date":"November 25, 2020","authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Rörtgen","firstname":"Steffen"}],"description":null}},"previous":{"fields":{"slug":"/2020-10-09-skohub-apconf/"},"frontmatter":{"title":"ActivityPub Conference 2020"}},"next":{"fields":{"slug":"/2021-lis-workshop/"},"frontmatter":{"title":"SkoHub Presentation at LIS Workshop 2021"}}},"pageContext":{"id":"deea6c17-69ed-5006-958f-40a429cfd5b3","previousPostId":"0b55c8af-edf8-5752-a9bd-9a5c99bc636c","nextPostId":"39c26922-f4be-583b-8b7f-13c3d81f5ae2"}},"staticQueryHashes":["1878297489","2734362168","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/page-data/2021-12-10-skohub-vocabs-workshops/page-data.json b/page-data/2021-12-10-skohub-vocabs-workshops/page-data.json
new file mode 100644
index 0000000..66ed725
--- /dev/null
+++ b/page-data/2021-12-10-skohub-vocabs-workshops/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-templates-blog-post-js","path":"/2021-12-10-skohub-vocabs-workshops/","result":{"data":{"site":{"siteMetadata":{"title":"Skohub Blog"}},"markdownRemark":{"id":"0bf040bb-65b5-5017-ba79-c0ae92357322","excerpt":"We facilitated two workshops in November with the goal to introduce participants into Simple Knowledge Organization System (SKOS) by hands-on learning how to…","html":"
We facilitated two workshops in November with the goal of introducing participants to the Simple Knowledge Organization System (SKOS) by learning hands-on how to publish a small vocabulary with SkoHub Vocabs. The first one, “Eine Einführung in SKOS mit SkoHub Vocabs” (“An Introduction to SKOS with SkoHub Vocabs”), was held as a cooperation between the Hochschulbibliothekszentrum NRW and the Göttingen eResearch Alliance with 14 German-speaking participants on 2021-11-02; see the workshop pad and the slides.
\n
As the workshop worked quite well, we applied the same approach to our Workshop “An Introduction to SKOS with SkoHub-Vocabs” on 2021-11-30 at SWIB21 (slides) with around 20 participants.
\n
\n
Generally, we had the impression that participants of both workshops were having a good time; at least nobody left the conference room before the end of the workshop. Here are some notes and lessons learned we collected after the SWIB workshop:
\n
\n
We invited participants to share their screen to not be talking to the void. Around 4-5 participants followed our invitation which was ok for us.
\n
Many participants joined without their microphones connected. We forgot to explicitly ask them to turn their mics on.
\n
We introduced ourselves and used the BigBlueButton poll feature to get to know the participants and their previous experience a bit more.
\n
We divided the workshop into two parts: an introduction as a frontal presentation and a hands-on part with discussion and Q&A phases in between. Though this worked quite well, we think it might be nice to switch more often between explanatory parts and hands-on parts.
\n
Participants were a bit shy. We got some good feedback in the chat at the end but only from some participants. Next time, we should prepare another poll for getting feedback at the end of the workshop
\n
\n
With regard to the further development of SkoHub Vocabs, it became clear during the workshops that it would be great to have an automatic test with each commit that lets you know whether a SKOS/Turtle file in a repo is “SkoHub-ready”, i.e. conforms to the pattern that is supported by SkoHub Vocabs. Issue #91 is already addressing this need and should be worked on to accomplish this.
","frontmatter":{"title":"SKOS Introduction workshops with SkoHub Vocabs","date":"December 10, 2021","authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Rörtgen","firstname":"Steffen"}],"description":"We facilitated two workshops in November with the goal to introduce participants into Simple Knowledge Organization System (SKOS) by hands-on learning how to publish a small vocabulary with SkoHub Vocabs."}},"previous":{"fields":{"slug":"/2021-lis-workshop/"},"frontmatter":{"title":"SkoHub Presentation at LIS Workshop 2021"}},"next":{"fields":{"slug":"/2022-05-eww-project-kickoff/"},"frontmatter":{"title":"Collaborating on improving SkoHub Vocabs"}}},"pageContext":{"id":"0bf040bb-65b5-5017-ba79-c0ae92357322","previousPostId":"39c26922-f4be-583b-8b7f-13c3d81f5ae2","nextPostId":"f786e1a6-675a-5f6f-b095-2bd4d1840b4f"}},"staticQueryHashes":["1878297489","2734362168","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/page-data/2021-lis-workshop/page-data.json b/page-data/2021-lis-workshop/page-data.json
new file mode 100644
index 0000000..f86301f
--- /dev/null
+++ b/page-data/2021-lis-workshop/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-templates-blog-post-js","path":"/2021-lis-workshop/","result":{"data":{"site":{"siteMetadata":{"title":"Skohub Blog"}},"markdownRemark":{"id":"39c26922-f4be-583b-8b7f-13c3d81f5ae2","excerpt":"Last Friday afternoon, the 2021 Workshop on Classification and Subject Indexing in Library and Information Science (LIS Workshop) took place organized by the…","html":"
\n
Last Friday afternoon, the 2021 Workshop on Classification and Subject Indexing in Library and Information Science (LIS Workshop) took place organized by the Working Group within the GfKL – Data Science Society. Adrian and Steffen had the chance to present SkoHub in the workshop’s first presentation. The slides can be viewed at https://pad.gwdg.de/p/lis-workshop21-skohub.
\n
Here is an overview of the full programme, which comprised six talks:
\n
\n
Adrian Pohl (hbz, Cologne, Germany) & Steffen Rörtgen (GWDG, Göttingen, Germany): SkoHub: Publishing and using knowledge organization systems on the web
\n
Colin Higgins (University of Cambridge, Cambridge, United Kingdom): Justice, governance, and the thesaurus – the Cambridge experience with ‘illegal aliens’
\n
Gislene Rodrigues da Silva & Célia da Consolação Dias (Universidade Federal de Minas Gerais, Belo Horizonte Brasil): Subjective aspects of indexing photographs from visual communication using a reading model based on the complex method and the primary functions of the image
\n
Heidrun Wiesenmüller (Stuttgart Media University, Stuttgart, Germany): Orientation and exploration – the presentation of subject headings in German catalog
\n
Karin Schmidgall (Deutsches Literaturarchiv Marbach, Marbach, Germany) & Matthias Finck (Effective Webwork, Hamburg, Germany): Glückliche Funde - ein Katalog der Forschende auf neue Ideen und Pfade bringt
\n
Julijana Nadj-Guttandin (Deutsche Nationalbibliothek, Frankfurt, Germany) & Sarah Pielmeier (University and State Library, Münster, Germany): Ein neues und modulares Regelwerk für die verbale Inhaltserschließung / A new and modular standard for subject indexing
\n
\n
The workshop happened as part of the virtual conference “Data Science, Statistics & Visualisation and European Conference on Data Analysis 2021” (DSSV-ECDA 2021). The promotion for the workshop could probably have been better, e.g. the presentations weren’t even listed in the regular DSSV-ECDA programme. In the end, the speakers and moderators were among themselves with little additional audience. However, the talks discussed interesting topics and discussion was lively.
","frontmatter":{"title":"SkoHub Presentation at LIS Workshop 2021","date":"July 12, 2021","authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Rörtgen","firstname":"Steffen"}],"description":"Short report about the SkoHub presentation at the 2021 Workshop on Classification and Subject Indexing in Library and Information Science (LIS Workshop) organized by the Working Group within the GfKL – Data Science Society."}},"previous":{"fields":{"slug":"/2020-11-25-swib20-workshop/"},"frontmatter":{"title":"SkoHub workshop at SWIB20"}},"next":{"fields":{"slug":"/2021-12-10-skohub-vocabs-workshops/"},"frontmatter":{"title":"SKOS Introduction workshops with SkoHub Vocabs"}}},"pageContext":{"id":"39c26922-f4be-583b-8b7f-13c3d81f5ae2","previousPostId":"deea6c17-69ed-5006-958f-40a429cfd5b3","nextPostId":"0bf040bb-65b5-5017-ba79-c0ae92357322"}},"staticQueryHashes":["1878297489","2734362168","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/page-data/2022-05-eww-project-kickoff/page-data.json b/page-data/2022-05-eww-project-kickoff/page-data.json
new file mode 100644
index 0000000..06c4b2a
--- /dev/null
+++ b/page-data/2022-05-eww-project-kickoff/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-templates-blog-post-js","path":"/2022-05-eww-project-kickoff/","result":{"data":{"site":{"siteMetadata":{"title":"Skohub Blog"}},"markdownRemark":{"id":"f786e1a6-675a-5f6f-b095-2bd4d1840b4f","excerpt":"In a kickoff workshop, the SkoHub team at hbz has launched a cooperation with the Hamburg-based company effective WEBWORK to work on some pending issues…","html":"
In a kickoff workshop, the SkoHub team at hbz has launched a cooperation with the Hamburg-based company effective WEBWORK to work on some pending issues regarding both functionality and design of SkoHub Vocabs. The issues and progress of the project can be followed in a Kanban board.
\n
\n
A big part of the project will concern new functionality to support the grouping of concepts by creating SKOS collection pages (Issue #159). Another central goal of the cooperation is a redesign of the static sites that are generated by SkoHub Vocabs. Finally, a logo is to be developed for the SkoHub software suite that will hopefully capture the spirit of the software and the community behind it. We are looking forward to presenting the results within the next months.
\n
The team from effective WEBWORK consists of software developers, a librarian and a designer whose skills overlap with those of the SkoHub community in many ways and especially regarding their enthusiasm for open source solutions. We are looking forward to this collaboration!
","frontmatter":{"title":"Collaborating on improving SkoHub Vocabs","date":"May 19, 2022","authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Christensen","firstname":"Anne"}],"description":"The SkoHub team at hbz has launched a cooperation with the Hamburg-based company effective WEBWORK to work on some pending issues regarding both functionality and design of SkoHub Vocabs."}},"previous":{"fields":{"slug":"/2021-12-10-skohub-vocabs-workshops/"},"frontmatter":{"title":"SKOS Introduction workshops with SkoHub Vocabs"}},"next":{"fields":{"slug":"/2022-11-skohub-workshop/"},"frontmatter":{"title":"Things are moving at SkoHub"}}},"pageContext":{"id":"f786e1a6-675a-5f6f-b095-2bd4d1840b4f","previousPostId":"0bf040bb-65b5-5017-ba79-c0ae92357322","nextPostId":"1cb80f09-aece-5c58-a7d5-e9621e40df98"}},"staticQueryHashes":["1878297489","2734362168","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/page-data/2022-11-skohub-workshop/page-data.json b/page-data/2022-11-skohub-workshop/page-data.json
new file mode 100644
index 0000000..5b444c6
--- /dev/null
+++ b/page-data/2022-11-skohub-workshop/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-templates-blog-post-js","path":"/2022-11-skohub-workshop/","result":{"data":{"site":{"siteMetadata":{"title":"Skohub Blog"}},"markdownRemark":{"id":"1cb80f09-aece-5c58-a7d5-e9621e40df98","excerpt":"In this blog post we want to introduce to you a new member in the Open Infrastructure team at the Hochschulbibliothekszentrum NRW who will be working on SkoHub…","html":"
In this blog post, we want to introduce a new member of the Open Infrastructure team at the Hochschulbibliothekszentrum NRW who will be working on SkoHub, and invite you to a workshop where we will present and discuss our future plans for the SkoHub project.
\n
After some time with not much happening (or even with shutting down some SkoHub services), we are happy to report a lot of movement in this space. We already announced the current project with the effective WEBWORK team to develop a new logo, improve the design and fix some minor issues in SkoHub. We are happy about the improvements made. Watch this space for more details in an upcoming post.
\n
Welcome, Steffen!
\n
We are very happy to welcome Steffen Rörtgen to the Open Infrastructure team at the Hochschulbibliothekszentrum NRW, which he joined this November. In his former projects he already made heavy use of SkoHub Vocabs and contributed to the project. He will now focus on further SkoHub development in the context of the Metadaten.nrw project, which is funded by the Ministry of Culture and Science of North Rhine-Westphalia (MKW).
\n
With the grant for this project, we have resources for further development and would like to discuss with you our plans, especially regarding the use of SkoHub for reconciliation and the ActivityPub-based publish/subscribe approach (SkoHub PubSub). Having already defined some work packages within the Metadaten.nrw project we would like to align these with use cases and ideas the SkoHub community has.
\n
Upcoming workshop
\n
Therefore we are happy to invite you to a small workshop on Thursday, the 17th of November. We will start at 10:00h CET and split the workshop into two parts with each one lasting about two hours:
\n
In the first part we will give an overview of the past and current developments of SkoHub as well as an introduction to the Metadaten.nrw project and its plans for SkoHub. At the end of the first part we want to discuss use cases for the publish/subscribe approach of SkoHub as well as for reconciliation. Our community member Andreas Wagner already developed a prototype for reconciliation with SkoHub, which he will present.
\n
In the second part of the workshop we will then deep dive into the reconciliation topic. We will discuss Andreas’ approach and develop a roadmap for SkoHub to implement the desired reconciliation functionalities.
If you are interested in the workshop, please send an email to skohub@hbz-nrw.de
\nand let us know whether you will join the whole workshop or just the first part.
\n
Please be aware that we won’t give a general introduction to SkoHub; we expect you to be familiar with SKOS and the SkoHub approach.
","frontmatter":{"title":"Things are moving at SkoHub","date":"November 04, 2022","authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Rörtgen","firstname":"Steffen"}],"description":"We welcome a new team member in the Open Infrastructure team at the Hochschulbibliothekszentrum NRW who will be working on SkoHub and invite people to a SkoHub planning workshop."}},"previous":{"fields":{"slug":"/2022-05-eww-project-kickoff/"},"frontmatter":{"title":"Collaborating on improving SkoHub Vocabs"}},"next":{"fields":{"slug":"/2022-12-02-new-look/"},"frontmatter":{"title":"Have U Seen The New Look?"}}},"pageContext":{"id":"1cb80f09-aece-5c58-a7d5-e9621e40df98","previousPostId":"f786e1a6-675a-5f6f-b095-2bd4d1840b4f","nextPostId":"828b5508-2825-568c-8588-621b7f3cb941"}},"staticQueryHashes":["1878297489","2734362168","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/page-data/2022-12-02-new-look/page-data.json b/page-data/2022-12-02-new-look/page-data.json
new file mode 100644
index 0000000..4851a9e
--- /dev/null
+++ b/page-data/2022-12-02-new-look/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-templates-blog-post-js","path":"/2022-12-02-new-look/","result":{"data":{"site":{"siteMetadata":{"title":"Skohub Blog"}},"markdownRemark":{"id":"828b5508-2825-568c-8588-621b7f3cb941","excerpt":"We are happy to announce the new SkoHub logo and design we have deployed right in time for our SWIB22 workshop on Wednesday! In the last months, Kai Mertens and…","html":"
We are happy to announce the new SkoHub logo and design we have deployed right in time for our SWIB22 workshop on Wednesday! In the last months, Kai Mertens and effective WEBWORK helped to work out this new look in the context of the project we have announced earlier. We have now updated the SkoHub website, this blog and the default SkoHub Vocabs setup to incorporate the new logo and design.
\n
Here is an example of how a vocabulary built with SkoHub Vocabs will look now with this default design:
","frontmatter":{"title":"Have U Seen The New Look?","date":"December 02, 2022","authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Rörtgen","firstname":"Steffen"}],"description":"We are happy to launch the new SkoHub logo and design developed with help by Kai Mertens and effective WEBWORK"}},"previous":{"fields":{"slug":"/2022-11-skohub-workshop/"},"frontmatter":{"title":"Things are moving at SkoHub"}},"next":{"fields":{"slug":"/2022-12-19-workshop-summary/"},"frontmatter":{"title":"Notes from the November workshop"}}},"pageContext":{"id":"828b5508-2825-568c-8588-621b7f3cb941","previousPostId":"1cb80f09-aece-5c58-a7d5-e9621e40df98","nextPostId":"c45557b7-0901-5b47-8e00-548ea748b6f9"}},"staticQueryHashes":["1878297489","2734362168","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/page-data/2022-12-19-workshop-summary/page-data.json b/page-data/2022-12-19-workshop-summary/page-data.json
new file mode 100644
index 0000000..0c0e436
--- /dev/null
+++ b/page-data/2022-12-19-workshop-summary/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-templates-blog-post-js","path":"/2022-12-19-workshop-summary/","result":{"data":{"site":{"siteMetadata":{"title":"Skohub Blog"}},"markdownRemark":{"id":"c45557b7-0901-5b47-8e00-548ea748b6f9","excerpt":"Due to new funding for SkoHub development we conducted – as previously announced – a workshop on the 17th of November. About 15 participants from three…","html":"
Collection of requirements from the community regarding PubSub and reconciliation
\n
Presentation of SkoHub Reconcile prototype by Andreas Wagner
\n\n
We split the workshop into two parts. First, we gave a general overview about the current state of SkoHub and the renewed funding through the Metadaten.nrw project. Then we had a general discussion about SkoHub PubSub – the module to connect a SKOS vocab with the Fediverse – as well as Andreas Wagner’s reconciliation prototype. See also the slides for the first part of the workshop.
\n
In the second half Andreas gave us a technical deep dive into his reconciliation prototype, walked us through the code and we discussed the architecture as well as future development and integration into the SkoHub ecosystem.
\n
In the following, we will go deeper into what happened in the different parts.
\n
Current state of SkoHub
\n
Currently, SkoHub Vocabs is by far the most used SkoHub module. It is used by the hbz, the metadata standardization groups around KIM, WirLernenOnline, The Institute for Educational Quality Improvement (IQB), in research projects in the area of digital humanities and by other people and institutes to publish their controlled vocabularies.
\n
The browser plugin SkoHub Editor as well as the PubSub module haven’t been used in production yet and were temporarily shut down in March 2022 due to a lack of resources.
\n
In 2022 the work on SkoHub started again when we partnered with effective WEBWORK (eWW) to redesign the web pages, create a new logo, improve UI configuration and address other issues such as support for skos:Collection. (See the project kanban for an overview.)
\n
Decouple software and services
\n
The general idea is to further decouple the software “SkoHub” from its running instances. Therefore we also wanted eWW to work on UI configuration possibilities, so other institutes or projects can easily brand their SkoHub instance.
\n
In the future, we will move the hosted instance, currently running at skohub.io, to metadaten.nrw, the project which funds the further development of SkoHub.
\n
Metadaten.nrw
\n
At the end of 2021, hbz secured funding from the Ministry of Culture and Science of North Rhine-Westphalia (MKW) for a project called Metadaten.nrw. It consists of two sub-projects, with one called “Infrastructure Initiative Metadata Services” being located in the Open Infrastructure team (OI) at hbz, where SkoHub development will take place.
\n
We got four positions funded, of which two are already filled, among them Steffen for SkoHub development. The goal of the project is to expand the community of users for the existing metadata infrastructure provided by hbz/OI, with a focus on libraries and scholars in North Rhine-Westphalia (NRW), and to establish hbz as a competence center for metadata in NRW.
\n
Accordingly, we plan to develop SkoHub further regarding the following topics:
\n
\n
Fediverse integration: Further development of SkoHub PubSub in the context of a concrete use case
\n
Reconciliation: Bringing the SkoHub reconciliation module into production
\n
Possibly support Annif integration in a later project phase
\n
Offer SkoHub tutorials and workshops
\n
\n
Community, PubSub & Reconciliation
\n
To further encourage contributions like the one from Andreas with the reconciliation prototype, we will set up contributing guidelines to have a clear and transparent definition of the development and deployment processes.
\n
SkoHub PubSub
\n
Afterwards we gave a small (re)introduction to SkoHub PubSub and discussed possible use cases. We developed ideas about SkoHub PubSub serving as a communication hub between researchers for their research fields. Raphaëlle Lapotre came up with a concrete use case. They currently have some pains in the context of the Timel Thesaurus, an indexing thesaurus for huge amounts of digitized pictures of medieval iconography. Currently, there are problems with the task of storing the large amounts of images centrally in a repository. Researchers could hold the files locally in their Nextcloud and publish the image metadata to the inboxes of SKOS concepts. A central service could then listen to the data provided by each concept’s inbox and display the metadata with a link pointing to the image in its storage location. There are actually two possible use cases: one with the digitized illumination pictures of the Ahloma lab (EHESS, sample), the second one with pictures of painted ceilings from all over the Mediterranean area, collected by an association of scholars and retired volunteers. Possibly, the support for ActivityPub in Nextcloud could help with such a project.
\n
Another topic was the idea of community building around concepts. The Open Educational Resource Search Index (OERSI) as well as the WirLernenOnline project already use elaborate vocabularies to index their resources. Interested humans could easily follow these concepts and engage in discussions around them.
\n
This is also applicable to researchers, who will be able to build up a topic-specific database and open discussions about their research in the Fediverse. This also raised practical questions about what happens on the notification side with broader and narrower concepts: if I’m following a concept, do I also want to get notifications about its narrower or broader concepts? These are questions that can be discussed further in our community.
\n
SkoHub Reconciliation
\n
Following the PubSub discussion, Andreas presented his reconciliation prototype. It is based on the Reconciliation API spec developed by the W3C Entity Reconciliation Community Group, so it is interoperable and can be used in any kind of application that acts as a reconciliation client. Andreas’ implementation already worked in OpenRefine as well as in TEI Publisher’s annotation tool. After showing the implementation with some examples, we went into a technical deep dive.
\n
Andreas walked us through the code and we discussed the current implementation as well as the future architecture of the SkoHub modules. His current approach is based on the webhook part of SkoHub Vocabs and borrows code from SkoHub PubSub for the Elasticsearch indexing.
\n
The discussion resulted in the proposal to separate SkoHub Vocabs from the webhook module and thereby further separate the concerns of the respective modules. He added a doreconc query parameter to the webhook, which triggers a script that pushes the vocabulary to the reconcile prototype.
\n
After the workshop we transferred the skohub-reconcile repository from Andreas to the SkoHub organization and are happy to start further developing it in 2023.
\n
Final thoughts
\n
The workshop was a great event for discussing with SkoHub users and those who want to become users. We collected valuable feedback and ideas for development in the upcoming two years.\nEspecially the sudden rise in awareness of the Fediverse opens up interesting use cases for SkoHub PubSub, which we are happy to engage with. The highlight of the workshop was the presentation of Andreas’ reconciliation prototype and its transfer into the SkoHub organization. This is a good example of the benefits of open source and use-case-driven development.
\n
We are looking forward to future community events, more use cases and even more modules to be developed in the SkoHub ecosystem.
","frontmatter":{"title":"Notes from the November workshop","date":"December 19, 2022","authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Rörtgen","firstname":"Steffen"}],"description":null}},"previous":{"fields":{"slug":"/2022-12-02-new-look/"},"frontmatter":{"title":"Have U Seen The New Look?"}},"next":{"fields":{"slug":"/2023-02-09-tests-updated/"},"frontmatter":{"title":"Moving to test-driven development and updating existing tests"}}},"pageContext":{"id":"c45557b7-0901-5b47-8e00-548ea748b6f9","previousPostId":"828b5508-2825-568c-8588-621b7f3cb941","nextPostId":"a9ca1438-2e05-5355-8e06-72fe48377acc"}},"staticQueryHashes":["1878297489","2734362168","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/page-data/2023-02-09-tests-updated/page-data.json b/page-data/2023-02-09-tests-updated/page-data.json
new file mode 100644
index 0000000..5e489fc
--- /dev/null
+++ b/page-data/2023-02-09-tests-updated/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-templates-blog-post-js","path":"/2023-02-09-tests-updated/","result":{"data":{"site":{"siteMetadata":{"title":"Skohub Blog"}},"markdownRemark":{"id":"a9ca1438-2e05-5355-8e06-72fe48377acc","excerpt":"For quite some time we have been adding new features to SkoHub Vocabs like switching languages, display of all relevant properties on the concept page and…","html":"
For quite some time we have been adding new features to SkoHub Vocabs like switching languages, display of all relevant properties on the concept page and support for skos:Collection. Unfortunately there were no tests added to actually test these new functionalities. This led to some surprises now and then, for example when we noticed that at one point language tags did not show up when visiting a Collection page directly.
\n
Originally, SkoHub Vocabs already contained some tests, so, being the maintainer of SkoHub Vocabs, I decided to follow up on that and get a little more familiar with the topic. I quickly stumbled over the topic of Test-Driven Development (TDD) and, though I had heard of it before, I decided to dive a little deeper and check whether that pattern might be appropriate for SkoHub Vocabs and the other SkoHub modules (and maybe my coding approach in general).
\n
The general idea of TDD is as follows (borrowed heavily from the Wikipedia article):
\n
\n
Requirements of a new feature are first translated into test cases
\n
Tests are written
\n
Code is written
\n
\n\n
This leads to the following development cycle:
\n\n
\n
Write tests: This ensures that the developer actually understands the user requirements. Usually this is done with the help of use cases and user stories.
\n
\n
\n
Run tests: The tests should now fail. If not, it might be the case that the actual feature is already present in the code and no further code needs to be written. Maybe documentation has to be updated accordingly.
\n
\n
\n
Write the simplest code that passes the new tests: The code can (and should) later be refactored, so it can be ugly at this point.
\n
\n
\n
All tests should now pass: If the code is still failing, it should be revised until all tests pass.
\n
\n
\n
Refactor as needed: Your tests now verify that the new feature is working as expected. If any tests fail during refactoring, you can and will immediately correct your code.
\n
\n\n
Consequences on SkoHub development
\n
This approach has some consequences for the development of SkoHub modules.\nThese changes will also be reflected in the yet-to-be-published CONTRIBUTING.md.\nIssues for new features should contain use cases and user stories as well as some notes that indicate when the feature is actually correctly implemented.\nOnly when all of this is present can the issue be marked as ready.\nThe use cases and notes can then be used to write the tests and follow the above-mentioned development cycle.
\n
This approach also has some consequences for code review: a review should only be approved if tests were added for the new feature, or if existing tests were adjusted in the case of a bug fix.
\n
Testing Strategies and Technologies in SkoHub Vocabs
\n
At the end of this blog post, I want to give you a short overview of the testing strategies and technologies currently used in SkoHub Vocabs development. We use unit tests, integration tests and end-to-end tests, aiming to write more unit tests than integration tests, and more integration tests than end-to-end tests. The reason for this is that end-to-end tests take a long time and are quite expensive in terms of computing power. Unit and integration tests, on the other hand, are cheap and can also auto-run in the background on every save, thus giving you immediate feedback when something is broken.
\n
For unit and integration tests we use Jest and React-Testing-Library, since Gatsby – with which SkoHub Vocabs is built – uses React.\nSome of the older tests used Enzyme, but after upgrading to React 18 I noticed that Enzyme was no longer working, because the project is no longer maintained.\nAfter some research, I found React-Testing-Library to be the most recommended testing framework and migrated the old Enzyme tests.\nAfter some initial training, writing tests actually became quite handy and fun.
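To give an idea of what such a test looks like, here is a minimal, illustrative Jest test using React-Testing-Library; the component and its label are made up for this example and are not taken from the SkoHub Vocabs code base:
\n
import React from \"react\"\nimport { render, screen } from \"@testing-library/react\"\n\n// Hypothetical component, used only for this example\nconst Greeting = ({ name }) => <p>Hello, {name}!</p>\n\ndescribe(\"Greeting\", () => {\n  it(\"renders the given name\", () => {\n    render(<Greeting name=\"SkoHub\" />)\n    expect(screen.getByText(\"Hello, SkoHub!\")).toBeTruthy()\n  })\n})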
\n
Adding tests is by no means finished at this point, but a lot of the recently added features now have some proper testing. This already required a few changes to the code base now and then, when I noticed things weren’t working as expected.
\n\n
For end-to-end tests I decided to go with Cypress, since it has excellent documentation, is fully open source and runs tests in a real browser.
\n\n
All of the tests are integrated into the SkoHub Vocabs CI pipeline and run before a new Docker image gets built.
","frontmatter":{"title":"Moving to test-driven development and updating existing tests","date":"February 09, 2023","authors":[{"lastname":"Rörtgen","firstname":"Steffen"}],"description":null}},"previous":{"fields":{"slug":"/2022-12-19-workshop-summary/"},"frontmatter":{"title":"Notes from the November workshop"}},"next":{"fields":{"slug":"/2023-11-22-shacl-shape/"},"frontmatter":{"title":"Development of SKOS SHACL shape"}}},"pageContext":{"id":"a9ca1438-2e05-5355-8e06-72fe48377acc","previousPostId":"c45557b7-0901-5b47-8e00-548ea748b6f9","nextPostId":"e19fa987-ceb2-54a5-b888-63169daac7e7"}},"staticQueryHashes":["1878297489","2734362168","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/page-data/2023-11-22-shacl-shape/page-data.json b/page-data/2023-11-22-shacl-shape/page-data.json
new file mode 100644
index 0000000..0ea459c
--- /dev/null
+++ b/page-data/2023-11-22-shacl-shape/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-templates-blog-post-js","path":"/2023-11-22-shacl-shape/","result":{"data":{"site":{"siteMetadata":{"title":"Skohub Blog"}},"markdownRemark":{"id":"e19fa987-ceb2-54a5-b888-63169daac7e7","excerpt":"To improve the error messages thrown by SkoHub Vocabs for invalid RDF Turtle files, we decided to implement a validation step, before the static site of a…","html":"
To improve the error messages thrown by SkoHub Vocabs for invalid RDF Turtle files, we decided to implement a validation step before the static site of a vocabulary gets built with the Gatsby framework.\nThis validation step should provide more meaningful error messages than the currently cryptic ones thrown by Gatsby.\nWhile we could have gone with one of the existing SKOS validator tools like SKOS Play!, Skosify or PoolParty (which is pretty expensive), we decided to go with a more generic approach and define the shape rules not in code, but in data.
\n\n
If you want to validate the shape of an RDF graph, you currently have two options to do that. You can either use Shape Expressions (ShEx) or the Shapes Constraint Language (SHACL).\nWe decided to go with SHACL for the following reasons:
\n
\n
slightly better tooling (e.g. the Zazuko team provides a JS validation library, rdf-validate-shacl)
Unfortunately the SKOS-XL shape did not work with our tooling (Apache Jena SHACL) out of the box.\nTherefore we decided to build a SKOS shape from the ground up based on the SKOS Reference.
\n
SKOS Reference Shape
\n
The goal was to implement every consistency example from the SKOS Reference as a test case for the shape.\nTo accomplish this, we needed on the one hand to formalize the class and property definitions from the spec as well as the integrity conditions.\nOn the other hand, we needed a triple store with reasoning capabilities to apply these rules to the very basic examples in the reference.\nWe used the Apache Jena tooling for this and built jena-docker containers based on the Docker containers of this repo.\nThe SKOS class and property definitions are defined in this file.\nThe workflow for validating the SKOS shape is as follows:
Based on the query result, an error message is emitted.
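Independent of the reasoning workflow, Apache Jena’s command line tooling can be used to run the SHACL validation itself, roughly like this (the file names are placeholders):
\n
$ shacl validate --shapes skos.shacl.ttl --data example.ttl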
\n\n
This way, we managed to validate all valid examples from the SKOS reference up to example 68 as valid and all invalid examples as invalid (entailment vs. non-entailment examples were left out).
\n
SkoHub Shape
\n
In SkoHub Vocabs we are a bit stricter regarding some aspects of the SKOS reference.\nFor example, we want every skos:Concept to have at least one skos:prefLabel.\nTherefore we developed a SkoHub-specific shape, skohub.shacl.ttl.\nIn contrast to the generic SKOS shape, this shape does not contain any SPARQL-based SHACL constraints.\nThough it is possible, and especially useful for more elaborate checks, to use SPARQL in constraints, the available tools (at least rdf-validate-shacl for JavaScript) do not support such queries.
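As a rough sketch of how a validation with rdf-validate-shacl can look in JavaScript (parsing the Turtle files into RDF/JS datasets is assumed to happen elsewhere, e.g. with rdf-ext and a Turtle parser):
\n
import factory from \"rdf-ext\"\nimport SHACLValidator from \"rdf-validate-shacl\"\n\n// shapes and data are RDF/JS datasets, e.g. parsed from skohub.shacl.ttl\n// and from the vocabulary's Turtle file\nexport function validateVocabulary(shapes, data) {\n  const validator = new SHACLValidator(shapes, { factory })\n  const report = validator.validate(data)\n  for (const result of report.results) {\n    // each result carries the message, path, focus node and severity\n    // shown in the output below\n    console.log(result.message, result.path, result.focusNode, result.severity)\n  }\n  return report.conforms\n}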
\n
As a result, validation errors and warnings help SkoHub users to improve the quality of their vocabularies. See, for example, the validation warning for a missing license:
\n
-----------Warning--------------\nMessage: [\n Literal {\n value: 'A provided license increases reusability of a vocabulary. Should be an URI.',\n language: '',\n datatype: NamedNode { value: 'http://www.w3.org/2001/XMLSchema#string' }\n }\n]\nPath: http://purl.org/dc/terms/license\nNode, where the error occured: http://w3id.org/example-cs/\nSeverity of error: http://www.w3.org/ns/shacl#Warning
\n
Or the validation error if the object of skos:hasTopConcept is not a skos:Concept:
\n
-----------Violation--------------\nMessage: [\n Literal {\n value: 'The target class for hasTopConcept should be skos:Concept',\n language: '',\n datatype: NamedNode { value: 'http://www.w3.org/2001/XMLSchema#string' }\n }\n]\nPath: http://www.w3.org/2004/02/skos/core#hasTopConcept\nNode, where the error occured: http://w3id.org/example-cs/\nSeverity of error: http://www.w3.org/ns/shacl#Violation
\n
Community
\n
Thanks to a lightning talk at SWIB23 (slides, recording), the shape got some attention.\nSuggestions made by Jakob Voß, Osma Suominen and Antoine Isaac already greatly improved the shape.\nFurther suggestions and improvements as well as your use cases are highly welcome.
\n
Outlook
\n
We were quite surprised that we did not find any usable existing SKOS SHACL shape.\nHopefully our work will help others validate their SKOS files and improve the overall quality of vocabularies.\nThere is currently still an open ticket for implementing the qSKOS best practice rules.\nAny feedback and collaboration on the shapes is welcome!
","frontmatter":{"title":"Development of SKOS SHACL shape","date":"November 22, 2023","authors":[{"lastname":"Rörtgen","firstname":"Steffen"}],"description":null}},"previous":{"fields":{"slug":"/2023-02-09-tests-updated/"},"frontmatter":{"title":"Moving to test-driven development and updating existing tests"}},"next":{"fields":{"slug":"/2024-01-18-reconcile/"},"frontmatter":{"title":"Supporting the Reconciliation Service API for SKOS vocabularies"}}},"pageContext":{"id":"e19fa987-ceb2-54a5-b888-63169daac7e7","previousPostId":"a9ca1438-2e05-5355-8e06-72fe48377acc","nextPostId":"bcdcbba9-12cb-50f4-9aa2-de3fc49667db"}},"staticQueryHashes":["1878297489","2734362168","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/page-data/2024-01-18-reconcile/page-data.json b/page-data/2024-01-18-reconcile/page-data.json
new file mode 100644
index 0000000..a7297b0
--- /dev/null
+++ b/page-data/2024-01-18-reconcile/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-templates-blog-post-js","path":"/2024-01-18-reconcile/","result":{"data":{"site":{"siteMetadata":{"title":"Skohub Blog"}},"markdownRemark":{"id":"bcdcbba9-12cb-50f4-9aa2-de3fc49667db","excerpt":"Reconciliation is the process of integrating data from sources which do not share common unique identifiers by identifying records which refer to the same…","html":"
Reconciliation is the process of integrating data from sources which do not share common unique identifiers by identifying records which refer to the same entities. This happens mostly by comparing the attributes of the entities. For instance, two entries in a catalogue of persons that share the same date of birth, place of birth, name and date of death will probably be about the same person. Linking these two entries by adding the identifier from the other data source is the act of reconciliation. It also allows you to extend your data by taking over information from the linked record.
\n
Multiple tools exist to facilitate this process, with OpenRefine being the most prominent. To align and standardize the way data is provided to these tools, the Reconciliation Service API is being drafted by the Entity Reconciliation Community Group within the World Wide Web Consortium (W3C). The specification defines endpoints that data services can expose so that applications like OpenRefine can handle their data. A number of other services have already implemented the specification, such as TEI Publisher, Cocoda, or the Alma Refine plugin for the commercial library management system Alma.
\n
Reconciliation and SKOS
\n
Simple Knowledge Organization System (SKOS) is an established standard for modeling controlled vocabularies as Linked Data. SKOS vocabularies are therefore frequent targets of reconciliation efforts, since you can improve your local data by enriching plain strings with identifiers from a controlled vocabulary. So SKOS and the Reconciliation Service API often go hand in hand. However, there was no easy way to set up a reconciliation endpoint for an existing SKOS vocabulary. We decided to change that by developing the new SkoHub component SkoHub-Reconcile.
\n
Andreas Wagner had already built a reconciliation prototype for SKOS vocabularies (see also our workshop blog post). We picked this prototype up, refactored it and moved it into a container-based infrastructure. We also added support for v0.2 of the reconciliation spec.
\n
SkoHub Reconcile Publish
\n
To make it easy to upload vocabularies to the reconciliation service, a front-end was developed which you can try out at https://reconcile-publish.skohub.io/.
\n\n
Every vocabulary that passes the SkoHub SHACL shape (see our blog post) should be suitable for uploading to the reconcile service. The only additional requirement is to provide a vann:preferredNamespaceUri. As you can see in the screenshot, you also have to provide an account and a language. As for the account, you can currently choose whatever you want; just make sure it is unique enough so that your dataset (i.e. your vocabulary) does not get overwritten by someone else. A lang parameter was only introduced in the current draft version of the reconciliation specification and is not yet implemented in SkoHub Reconcile, so the current version of the service requires you to specify the language you want to use for reconciliation. We will improve this in the future along with the development of the specification.
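In the Turtle file this boils down to a statement like the following (with an illustrative namespace):

```turtle
@prefix vann: <http://purl.org/vocab/vann/> .
@prefix skos: <http://www.w3.org/2004/02/skos/core#> .

<https://example.org/subjects/> a skos:ConceptScheme ;
  # the namespace under which the concept URIs of this vocabulary live
  vann:preferredNamespaceUri "https://example.org/subjects/" .
```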
\n
Example: Usage in OpenRefine
\n
Let’s see how we can use the service with OpenRefine.
\n
First, we upload the vocabulary.\nWe will use a classification of subject groups.
\n\n
After a successful upload of the turtle file, we are presented with a URI that leads to the “Service Manifest” of our reconciliation service.
Now that the reconciliation service is set up with our data, let’s see how we can use it in OpenRefine.
\n
For demo purposes we use a small vocabulary of a few discipline names:
\n\n
By clicking on the dropdown button of the column we want to reconcile, we choose “Reconcile” -> “Start reconciling…”.
\n\n
After clicking “Add standard service”, we can enter the URL we were given by the upload service:
\n\n
Then we just have to start the reconciliation by clicking “Start reconciling…” and our reconciliation service will be queried with the terms in our OpenRefine project.\nWe are then presented with the results:
\n\n
This already looks good!\nNow we can choose matches by clicking the checkmark or get additional information by hovering over the proposed entry from the reconcile service.
\n\n
If we want we can also search through our vocabulary by clicking “Search for match”:
\n\n
After selecting the appropriate matches, we have successfully reconciled our data:

Feedback is very much appreciated: via email (skohub@hbz-nrw.de), as an issue or – primarily for German-speaking users – in the newly set up Discourse forum metadaten.community.
\n
Our next step will be integrating the above-mentioned lang parameter, so that all languages of a vocabulary can be served without having to specify one beforehand.
","frontmatter":{"title":"Supporting the Reconciliation Service API for SKOS vocabularies","date":"January 22, 2024","authors":[{"lastname":"Rörtgen","firstname":"Steffen"}],"description":null}},"previous":{"fields":{"slug":"/2023-11-22-shacl-shape/"},"frontmatter":{"title":"Development of SKOS SHACL shape"}},"next":{"fields":{"slug":"/2024-01-24-uris-without-language-tags/"},"frontmatter":{"title":"Re-working SkoHub Vocabs internationalization features"}}},"pageContext":{"id":"bcdcbba9-12cb-50f4-9aa2-de3fc49667db","previousPostId":"e19fa987-ceb2-54a5-b888-63169daac7e7","nextPostId":"5b5b6759-4145-563a-a1f9-359045844318"}},"staticQueryHashes":["1878297489","2734362168","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/page-data/2024-01-24-uris-without-language-tags/page-data.json b/page-data/2024-01-24-uris-without-language-tags/page-data.json
new file mode 100644
index 0000000..c5e51a2
--- /dev/null
+++ b/page-data/2024-01-24-uris-without-language-tags/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-templates-blog-post-js","path":"/2024-01-24-uris-without-language-tags/","result":{"data":{"site":{"siteMetadata":{"title":"Skohub Blog"}},"markdownRemark":{"id":"5b5b6759-4145-563a-a1f9-359045844318","excerpt":"In the past: Internationalization with drawbacks If you have worked with SkoHub Vocabs before, you might have noticed that the URLs in the address bar had a…","html":"
In the past: Internationalization with drawbacks
\n
If you have worked with SkoHub Vocabs before, you might have noticed that the URLs in the address bar had a little special feature that you don’t encounter very often, a language tag before the .html:

https://skohub.io/dini-ag-kim/hochschulfaechersystematik/heads/master/w3id.org/kim/hochschulfaechersystematik/scheme.de.html

Why did we need this?

We wanted internationalization features to be able to navigate between multiple languages. Normally this is done via a subdomain or by adding a language tag after the domain name, like https://w3id.org/kim/hochschulfaechersystematik/en/. But this does not work for SkoHub Vocabs, since we use the URIs from the Turtle files as IDs for the concepts. Changing the URI by adding a language tag somewhere would break the whole concept of SkoHub Vocabs.
\n
So it was decided to add the language at the end of the URL using the Apache MultiViews feature. But this led to some drawbacks:
\n
\n
SkoHub Vocabs needed to be served by an Apache Webserver
\n
The webserver needed special configuration
\n
SkoHub Docker Vocabs, which is served via GitHub Pages, always needed a specific link to an index.{language}.html file, since GitHub Pages only looks for an index.html
\n
The build directory grew quite a bit, since there were dedicated html pages built for every language
\n
\n
Switching to one page for all languages
\n
To overcome these issues, we decided to change this behaviour and build just one HTML page with a language switcher. The displayed language is now chosen as follows:
\n
\n
by using your browser language

if you have switched languages in the application, the chosen language is used

if a language is not available, a default language present in the vocabulary is used
\n
\n
To point users to a specific language, you can use a query parameter lang= like:
\n
https://w3id.org/kim/hcrt/scheme?lang=uk
\n
Since SkoHub Vocabs also used the language tag of the URL internally to determine which language to serve, a lot of changes had to be made in the codebase. But overall this resulted in a much smaller size of the built vocabularies and more flexibility in serving them.
\n
Benefits of the new approach
\n
This new internationalization approach brings lots of improvements:
\n
\n
SkoHub Vocabs is now independent from the underlying webserver
\n
The size of the vocabularies is drastically reduced, especially for vocabularies with lots of languages
\n
SkoHub Docker Vocabs is now simpler to set up, since there are only “normal” index.html files that GitHub Pages knows how to handle
\n
\n
What to do if I’m running my own webhook server?
\n
If you are running your own webhook server, you should upgrade the following way:
\n
\n
Follow the steps outlined in the webhook repository to rebuild vocabularies. This will rebuild all still existing branches you are currently serving.
\n
Set up a redirect in your apache config, so that links that still have ...de.html will be redirected to ...html?lang=de:
\n
\n
```
# Redirect from ...filename.LANGCODE.html to ...filename.html?lang=LANGCODE
RewriteRule ^(.+)\.([a-z]{2})\.html$ $1.html?lang=$2 [L,R=301]
```
\n
\n
After that you should be good!
\n
\n
Anything else?
\n
While developing the script to rebuild all existing vocabularies, I noticed that we are serving a lot of branches that do not exist anymore. SkoHub Webhook currently builds a vocabulary for every branch you set up and push to. But the webhook service does not get notified when a branch is deleted. This way we end up with lots of files for branches that nobody needs anymore. To clean this up a bit, we will soon add a script that tidies the dist directory and removes those no longer needed files.
","frontmatter":{"title":"Re-working SkoHub Vocabs internationalization features","date":"January 31, 2024","authors":[{"lastname":"Rörtgen","firstname":"Steffen"}],"description":null}},"previous":{"fields":{"slug":"/2024-01-18-reconcile/"},"frontmatter":{"title":"Supporting the Reconciliation Service API for SKOS vocabularies"}},"next":{"fields":{"slug":"/2024-03-21-skohub-pages/"},"frontmatter":{"title":"Publishing SKOS the easy way"}}},"pageContext":{"id":"5b5b6759-4145-563a-a1f9-359045844318","previousPostId":"bcdcbba9-12cb-50f4-9aa2-de3fc49667db","nextPostId":"3686d7d6-2bd1-5708-93b9-f9dbe02db287"}},"staticQueryHashes":["1878297489","2734362168","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/page-data/2024-03-21-skohub-pages/page-data.json b/page-data/2024-03-21-skohub-pages/page-data.json
new file mode 100644
index 0000000..103dc0c
--- /dev/null
+++ b/page-data/2024-03-21-skohub-pages/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-templates-blog-post-js","path":"/2024-03-21-skohub-pages/","result":{"data":{"site":{"siteMetadata":{"title":"Skohub Blog"}},"markdownRemark":{"id":"3686d7d6-2bd1-5708-93b9-f9dbe02db287","excerpt":"A simple workflow for publishing your vocabs With SkoHub Pages we now provide a very simple way for publishing your SKOS vocabulary from a GitHub repository. It…","html":"
A simple workflow for publishing your vocabs
\n
With SkoHub Pages we now provide a very simple way for publishing your SKOS vocabulary from a GitHub repository. It only involves 5-6 steps:
\n
1. Fork the skohub-pages repo
\n
Click the “Fork” button in the top-right corner of the SkoHub Pages repo. When creating the fork, uncheck the box ⚠️ so that the gh-pages branch is also forked. You can change the name of your fork to whatever you like, e.g. my-shiny-vocab. See also the GitHub fork documentation.
\n\n
2. Activate GitHub Actions

Go to the “Actions” tab and, if not already activated, activate GitHub Actions.
3. Configure GitHub Pages branch
\n
Go to “Settings”, navigate to the “Pages” setting and select gh-pages as the branch your site is being built from.
\n\n
4. Update pages URL
\n
Go back to the main page of your repo and click the little gear icon in the top right of the “About” section. Check the box at “Use your GitHub Pages website”.
\n
\n
\n
5. Start committing
\n
Now you can add a commit to the main branch, adjusting the example vocabularies or adding a new Turtle file. The changes will automatically be published to your GitHub Pages website, which is now linked at the top right of your GitHub repo (sometimes it takes a little while for the changes to show up; remember to do a hard refresh).
\n
6. Set your GitHub Pages URL as namespace (optional)
\n
See section “Resolving custom domains” below ⬇️
\n
Utilizing GitHub Actions & Pages
\n
Not all projects or individuals involved in the creation of controlled vocabularies are able to run their own infrastructure or have the resources to do so. Thus, we have been pursuing this approach – formerly under the name “skohub-docker-vocabs” – of utilizing Docker and GitHub infrastructure for publishing SKOS vocabularies with SkoHub Vocabs. Specifically, the workflow relies on “GitHub Pages” and “GitHub Actions”. With GitHub Pages it is possible to host websites on the GitHub infrastructure, while GitHub Actions are used for automated tests and deployments.
\n
We have written a GitHub Action that ensures that, after each push to the repository, a process is started which builds the vocabularies with SkoHub Vocabs. The built vocabulary is then pushed to a separate git branch, gh-pages. As seen above, GitHub Pages is configured to deliver HTML pages from this gh-pages branch.
\n
We have been using this approach in various introductory SKOS and SkoHub workshops. However, in the past the workflow required some adjustments to the GitHub Action, so errors could quickly creep in. We are happy to have improved this considerably and made the process much less error-prone! 🎉
\n
The relevant information is now set directly as environment variables and all other customizations can be changed via the GitHub GUI, so the workflow is now much more user-friendly. But that’s not all!
\n
Resolving custom domains
\n
Although the presented approach allowed a custom vocabulary to be published without running your own infrastructure, the domains did not resolve to the GitHub Pages site. This means that a concept scheme using URIs based on the GitHub Pages domain (e.g. https://myhandle.github.io/skohub-pages/) could not be resolved so far. In the past, to mitigate this, we recommended setting up a redirect via w3id or purl.org. Of course, it still makes sense to set up a redirect (in case the vocabulary moves somewhere else). However, it is now also possible to use the domain assigned via GitHub Pages and thus quickly set up a fully working SKOS vocabulary with resolving concept URIs, which can come in handy for prototyping.
\n
To do this, a config.yaml must be created in the repo, and the respective domain entered under the custom_domain key. Example: your GitHub Pages domain is https://myhandle.github.io/skohub-pages/; then provide https://myhandle.github.io/skohub-pages/ as custom_domain in your config.yaml.
\n
The base of your concept scheme could then be something like: https://myhandle.github.io/skohub-pages/myvocab/
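For example, with the GitHub Pages domain from above, the config.yaml and a matching vocabulary could look like this:

```yaml
#config.yaml
---
# [...]
custom_domain: "https://myhandle.github.io/skohub-pages/"
#[...]
```

```turtle
# colors.ttl
@prefix colour: <https://myhandle.github.io/skohub-pages/myColourVocab/> .
@prefix dct: <http://purl.org/dc/terms/> .
@prefix skos: <http://www.w3.org/2004/02/skos/core#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

colour: a skos:ConceptScheme ;
  dct:title "Colour Vocabulary"@en, "Farbvokabular"@de ;
  dct:creator "Hans Dampf"@de ;
  dct:created "2021-11-02"^^xsd:date ;
  dct:license <https://creativecommons.org/publicdomain/zero/1.0/> ;
  skos:hasTopConcept colour:violet, colour:blue .

colour:violet a skos:Concept ;
  skos:prefLabel "Violett"@de, "violet"@en ;
  skos:altLabel "Lila"@de, "purple"@en ;
  skos:topConceptOf colour: .

colour:blue a skos:Concept ;
  skos:prefLabel "Blau"@de, "blue"@en ;
  skos:topConceptOf colour: .
```

Feel free to try out our simplified approach and let us know if something does not work: https://github.com/skohub-io/skohub-pages/issues/new

Outlook: Beyond GitHub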
There are lots of reasons why people might not want to use the GitHub infrastructure owned by Microsoft for their SKOS publication workflows. That’s why we will be looking into replacing as much as possible of this workflow by generic git-based tooling for triggering the build. The goal is to support such an easy SKOS publishing workflow on other forges like GitLab or Forgejo. The work on this happens around this issue: https://github.com/skohub-io/skohub-pages/issues/19
\n
Let us know if you have some good implementation ideas or more wishes for future development!
","frontmatter":{"title":"Publishing SKOS the easy way","date":"March 21, 2024","authors":[{"lastname":"Rörtgen","firstname":"Steffen"},{"lastname":"Pohl","firstname":"Adrian"}],"description":null}},"previous":{"fields":{"slug":"/2024-01-24-uris-without-language-tags/"},"frontmatter":{"title":"Re-working SkoHub Vocabs internationalization features"}},"next":null},"pageContext":{"id":"3686d7d6-2bd1-5708-93b9-f9dbe02db287","previousPostId":"5b5b6759-4145-563a-a1f9-359045844318","nextPostId":null}},"staticQueryHashes":["1878297489","2734362168","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/page-data/404.html/page-data.json b/page-data/404.html/page-data.json
new file mode 100644
index 0000000..d58956a
--- /dev/null
+++ b/page-data/404.html/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-pages-404-js","path":"/404.html","result":{"data":{"site":{"siteMetadata":{"title":"Skohub Blog"}}},"pageContext":{}},"staticQueryHashes":["1878297489","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/page-data/404/page-data.json b/page-data/404/page-data.json
new file mode 100644
index 0000000..638d79a
--- /dev/null
+++ b/page-data/404/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-pages-404-js","path":"/404/","result":{"data":{"site":{"siteMetadata":{"title":"Skohub Blog"}}},"pageContext":{}},"staticQueryHashes":["1878297489","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/page-data/about/page-data.json b/page-data/about/page-data.json
new file mode 100644
index 0000000..b2c7f27
--- /dev/null
+++ b/page-data/about/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-pages-about-js","path":"/about/","result":{"data":{"site":{"siteMetadata":{"title":"Skohub Blog"}}},"pageContext":{}},"staticQueryHashes":["1878297489","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/page-data/app-data.json b/page-data/app-data.json
new file mode 100644
index 0000000..ad4c06b
--- /dev/null
+++ b/page-data/app-data.json
@@ -0,0 +1 @@
+{"webpackCompilationHash":"a671f36031795c0731bf"}
diff --git a/page-data/contact/page-data.json b/page-data/contact/page-data.json
new file mode 100644
index 0000000..6436dd6
--- /dev/null
+++ b/page-data/contact/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-pages-contact-js","path":"/contact/","result":{"data":{"site":{"siteMetadata":{"title":"Skohub Blog","email":"skohub@hbz-nrw.de","social":{"github":"skohub-io"}}}},"pageContext":{}},"staticQueryHashes":["1878297489","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/page-data/index/page-data.json b/page-data/index/page-data.json
new file mode 100644
index 0000000..6057231
--- /dev/null
+++ b/page-data/index/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-pages-index-js","path":"/","result":{"data":{"site":{"siteMetadata":{"title":"Skohub Blog"}},"allMarkdownRemark":{"nodes":[{"excerpt":"A simple workflow for publishing your vocabs With SkoHub Pages we now provide a very simple way for publishing your SKOS vocabulary from a…","fields":{"slug":"/2024-03-21-skohub-pages/"},"frontmatter":{"date":"March 21, 2024","title":"Publishing SKOS the easy way","description":null,"authors":[{"lastname":"Rörtgen","firstname":"Steffen"},{"lastname":"Pohl","firstname":"Adrian"}]}},{"excerpt":"In the past: Internationalization with drawbacks If you have worked with SkoHub Vocabs before, you might have noticed that the URLs in the…","fields":{"slug":"/2024-01-24-uris-without-language-tags/"},"frontmatter":{"date":"January 31, 2024","title":"Re-working SkoHub Vocabs internationalization features","description":null,"authors":[{"lastname":"Rörtgen","firstname":"Steffen"}]}},{"excerpt":"Reconciliation is the process of integrating data from sources which do not share common unique identifiers by identifying records which…","fields":{"slug":"/2024-01-18-reconcile/"},"frontmatter":{"date":"January 22, 2024","title":"Supporting the Reconciliation Service API for SKOS vocabularies","description":null,"authors":[{"lastname":"Rörtgen","firstname":"Steffen"}]}},{"excerpt":"To improve the error messages thrown by SkoHub Vocabs for invalid RDF Turtle files, we decided to implement a validation step, before the…","fields":{"slug":"/2023-11-22-shacl-shape/"},"frontmatter":{"date":"November 22, 2023","title":"Development of SKOS SHACL shape","description":null,"authors":[{"lastname":"Rörtgen","firstname":"Steffen"}]}},{"excerpt":"For quite some time we have been adding new features to SkoHub Vocabs like switching languages, display of all relevant properties on the…","fields":{"slug":"/2023-02-09-tests-updated/"},"frontmatter":{"date":"February 09, 2023","title":"Moving to test-driven development and updating existing tests","description":null,"authors":[{"lastname":"Rörtgen","firstname":"Steffen"}]}},{"excerpt":"Due to new funding for SkoHub development we conducted – as previously announced – a workshop on the 17th of November. About 15 participants…","fields":{"slug":"/2022-12-19-workshop-summary/"},"frontmatter":{"date":"December 19, 2022","title":"Notes from the November workshop","description":null,"authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Rörtgen","firstname":"Steffen"}]}},{"excerpt":"We are happy to announce the new SkoHub logo and design we have deployed right in time for our SWIB22 workshop on Wednesday! 
In the last…","fields":{"slug":"/2022-12-02-new-look/"},"frontmatter":{"date":"December 02, 2022","title":"Have U Seen The New Look?","description":"We are happy to launch the new SkoHub logo and design developed with help by Kai Mertens and effective WEBWORK","authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Rörtgen","firstname":"Steffen"}]}},{"excerpt":"In this blog post we want to introduce to you a new member in the Open Infrastructure team at the Hochschulbibliothekszentrum NRW who will…","fields":{"slug":"/2022-11-skohub-workshop/"},"frontmatter":{"date":"November 04, 2022","title":"Things are moving at SkoHub","description":"We welcome a new team member in the Open Infrastructure team at the Hochschulbibliothekszentrum NRW who will be working on SkoHub and invite people to a SkoHub planning workshop.","authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Rörtgen","firstname":"Steffen"}]}},{"excerpt":"In a kickoff workshop, the SkoHub team at hbz has launched a cooperation with the Hamburg-based company effective WEBWORK to work on some…","fields":{"slug":"/2022-05-eww-project-kickoff/"},"frontmatter":{"date":"May 19, 2022","title":"Collaborating on improving SkoHub Vocabs","description":"The SkoHub team at hbz has launched a cooperation with the Hamburg-based company effective WEBWORK to work on some pending issues regarding both functionality and design of SkoHub Vocabs.","authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Christensen","firstname":"Anne"}]}},{"excerpt":"We facilitated two workshops in November with the goal to introduce participants into Simple Knowledge Organization System (SKOS) by hands…","fields":{"slug":"/2021-12-10-skohub-vocabs-workshops/"},"frontmatter":{"date":"December 10, 2021","title":"SKOS Introduction workshops with SkoHub Vocabs","description":"We facilitated two workshops in November with the goal to introduce participants into Simple Knowledge Organization System (SKOS) by hands-on learning how to publish a small vocabulary with SkoHub Vocabs.","authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Rörtgen","firstname":"Steffen"}]}},{"excerpt":"Last Friday afternoon, the 2021 Workshop on Classification and Subject Indexing in Library and Information Science (LIS Workshop) took place…","fields":{"slug":"/2021-lis-workshop/"},"frontmatter":{"date":"July 12, 2021","title":"SkoHub Presentation at LIS Workshop 2021","description":"Short report about the SkoHub presentation at the 2021 Workshop on Classification and Subject Indexing in Library and Information Science (LIS Workshop) organized by the Working Group within the GfKL – Data Science Society.","authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Rörtgen","firstname":"Steffen"}]}},{"excerpt":"From 23-27 November the 12th Semantic Web in Libraries Conference took place online. 
The programme with links to recordings and slides forum…","fields":{"slug":"/2020-11-25-swib20-workshop/"},"frontmatter":{"date":"November 25, 2020","title":"SkoHub workshop at SWIB20","description":null,"authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Rörtgen","firstname":"Steffen"}]}},{"excerpt":"From 2-5 October the ActivityPub conference happened online where people using the ActivityPub protocol came together to discuss topics all…","fields":{"slug":"/2020-10-09-skohub-apconf/"},"frontmatter":{"date":"October 09, 2020","title":"ActivityPub Conference 2020","description":null,"authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Ostrowski","firstname":"Felix"}]}},{"excerpt":"ⓘ Update, 2022-03-01: Due to lacking resources for maintenance, we decided to shut down the SkoHub PubSub demo server at skohub.io for an…","fields":{"slug":"/2020-06-25-skohub-pubsub/"},"frontmatter":{"date":"June 25, 2020","title":"Presenting SkoHub PubSub","description":null,"authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Ostrowski","firstname":"Felix"}]}},{"excerpt":"ⓘ Update, 2022-03-01: Due to lacking resources for maintenance, we decided to shut down the SkoHub Editor demo for an indefinite time…","fields":{"slug":"/2020-03-31-skohub-editor/"},"frontmatter":{"date":"March 31, 2020","title":"Presenting the SkoHub Editor","description":"An introducatory post to SkoHub Editor.","authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Ostrowski","firstname":"Felix"}]}},{"excerpt":"On November 27th 2019, Adrian Pohl and Felix Ostrowski (graphthinking) presented SkoHub at the “Semantic Web in Libraries” conference in…","fields":{"slug":"/2020-01-29-skohub-talk-at-swib19/"},"frontmatter":{"date":"January 29, 2020","title":"SkoHub talk at SWIB19: KOS-based content syndication with ActivityPub","description":null,"authors":[{"lastname":"Pohl","firstname":"Adrian"}]}},{"excerpt":"We are happy to announce that the SkoHub prototype outlined in our post “SkoHub: Enabling KOS-based content subscription” is now finished…","fields":{"slug":"/2019-09-27-skohub-vocabs/"},"frontmatter":{"date":"September 27, 2019","title":"Presenting the SkoHub Vocabs Prototype","description":"An introducatory post to SkoHub Vocabs.","authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Ostrowski","firstname":"Felix"}]}},{"excerpt":"For a long time, openness movements and initiatives with labels like “Open Access”, “Open Educational Resources” (OER) or “Linked Science…","fields":{"slug":"/2019-05-17-skohub/"},"frontmatter":{"date":"May 17, 2019","title":"SkoHub: Enabling KOS-based content subscription","description":"An introduction to the publish/subscribe approach to content discovery to be implemented with SkoHub.","authors":[{"lastname":"Pohl","firstname":"Adrian"},{"lastname":"Ostrowski","firstname":"Felix"}]}}]}},"pageContext":{}},"staticQueryHashes":["1878297489","2734362168","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/page-data/sq/d/1878297489.json b/page-data/sq/d/1878297489.json
new file mode 100644
index 0000000..bdd8500
--- /dev/null
+++ b/page-data/sq/d/1878297489.json
@@ -0,0 +1 @@
+{"data":{"site":{"siteMetadata":{"social":{"github":"skohub-io","mastodon":"skohub","discourse":"https://metadaten.community/c/software-und-tools/metafacture/8"}}}}}
\ No newline at end of file
diff --git a/page-data/sq/d/2734362168.json b/page-data/sq/d/2734362168.json
new file mode 100644
index 0000000..94e94aa
--- /dev/null
+++ b/page-data/sq/d/2734362168.json
@@ -0,0 +1 @@
+{"data":{"site":{"siteMetadata":{"author":{"name":"the SkoHub Community"},"social":{"mastodon":"skohub","github":"skohub-io"},"description":"A blog for SkoHub."}}}}
\ No newline at end of file
diff --git a/page-data/sq/d/3000541721.json b/page-data/sq/d/3000541721.json
new file mode 100644
index 0000000..41a12c4
--- /dev/null
+++ b/page-data/sq/d/3000541721.json
@@ -0,0 +1 @@
+{"data":{"site":{"siteMetadata":{"title":"Skohub Blog","description":"A blog for SkoHub."}}}}
\ No newline at end of file
diff --git a/page-data/using-typescript/page-data.json b/page-data/using-typescript/page-data.json
new file mode 100644
index 0000000..a5fadfe
--- /dev/null
+++ b/page-data/using-typescript/page-data.json
@@ -0,0 +1 @@
+{"componentChunkName":"component---src-pages-using-typescript-tsx","path":"/using-typescript/","result":{"data":{"site":{"buildTime":"2024-03-27 07:26 am UTC"}},"pageContext":{}},"staticQueryHashes":["1878297489","3000541721"],"slicesMap":{}}
\ No newline at end of file
diff --git a/robots.txt b/robots.txt
new file mode 100644
index 0000000..eb05362
--- /dev/null
+++ b/robots.txt
@@ -0,0 +1,2 @@
+User-agent: *
+Disallow:
diff --git a/rss.xml b/rss.xml
new file mode 100644
index 0000000..1ae0a17
--- /dev/null
+++ b/rss.xml
@@ -0,0 +1,2208 @@
+https://blog.skohub.ioGatsbyJSWed, 27 Mar 2024 07:27:14 GMThttps://blog.skohub.io/2024-03-21-skohub-pages/https://blog.skohub.io/2024-03-21-skohub-pages/Thu, 21 Mar 2024 00:00:00 GMT<h2>A simple workflow for publishing your vocabs</h2>
+<p>With <a href="https://github.com/skohub-io/skohub-pages">SkoHub Pages</a> we now provide a very simple way for publishing your SKOS vocabulary from a GitHub repository. It only involves 5-6 steps:</p>
+<p><strong>1. Fork the skohub-pages repo</strong></p>
+<p>Click the “Fork” button in the top-right corner of the <a href="https://github.com/skohub-io/skohub-pages">SkoHub Pages repo</a>. You can change the name of your fork to whatever you like, e.g. <code class="language-text">my-shiny-vocab</code>. See also the <a href="https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/fork-a-repo#forking-a-repository">GitHub fork documentation</a>.</p>
+<p><figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 630px; "
+ >
+ <a
+ class="gatsby-resp-image-link"
+ href="/static/b5beefdabbe43f74578e4af6474e861c/eb390/create_fork.png"
+ style="display: block"
+ target="_blank"
+ rel="noopener"
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 74.68354430379746%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAAPCAYAAADkmO9VAAAACXBIWXMAAA7EAAAOxAGVKw4bAAACEElEQVQ4y5WT627iMBCF87cQyNWOYzsJiXPjUkgLq+5SVdq32X3/FzgrDwQoolL3xyePb6Px8Rknr3oMh3d06xestgdsX39iOBzBZYHpPMIs4JgFybdxRGbQrl9QNmuYfotmuYPpB4i8RyRrMN0gTgsiSnLEogCzc5EjEvllz65TwpBrsHSBRFW0kagSoSjhyx5xtkYgW8zDFPNQEPbSOI7x7Z7D5AJ60ULoClwukOY1ctMjL1vkVYuQSbg+g+sncP3T88dxjEcoYbsc8Hb8Da4qqjLgGqbdolu9olkOSLWB6/FPVX2FTeoUiw77/TvK7hllsyFdkqZEtu6hlx38TMFTElOPYTKPabyPLZcKrW6yaCDzGvaDZG6gTAtZ1pBVDW1aZHVHUlhpEl2RPLKo6U6aGZpfKtRFi9XuB1VomjV9jBelCGIJ3xJJBLGCzyQCpuBFp3Ub++OZWF4Tcl2RXd6KGkoU8Jj6L9/d/jY9OfA5NlMffBZh4l81ubfEleSMeIjDYoldv0XRD2D5FlX3jHY1kB/tAS+WJMEJgVmkCPfOMiOO3YhVCaENorSkRPZzGHVGRoQ8Q3iOvcQQs1sb2eoice0U21LWf7ad7MXgvDa2l+0ku0ZS+GdG73kck6cYk6cI7ozDoWddnvQ1j4xtk3laQn300B89gkrDeaTDd7AJpy5D0GZY/fmFzd8j2L7CP8NtfrndaYedAAAAAElFTkSuQmCC'); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+ alt="Create fork, uncheck box ⚠️ to also fork the gh-pages branch"
+ title="Create fork, uncheck box ⚠️ to also fork the gh-pages branch"
+ src="/static/b5beefdabbe43f74578e4af6474e861c/f058b/create_fork.png"
+ srcset="/static/b5beefdabbe43f74578e4af6474e861c/c26ae/create_fork.png 158w,
+/static/b5beefdabbe43f74578e4af6474e861c/6bdcf/create_fork.png 315w,
+/static/b5beefdabbe43f74578e4af6474e861c/f058b/create_fork.png 630w,
+/static/b5beefdabbe43f74578e4af6474e861c/eb390/create_fork.png 935w"
+ sizes="(max-width: 630px) 100vw, 630px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </a>
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">Create fork, uncheck box ⚠️ to also fork the gh-pages branch</figcaption>
+ </figure></p>
+<p><strong>2. Activate GitHub Actions</strong></p>
+<p><figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 630px; "
+ >
+ <a
+ class="gatsby-resp-image-link"
+ href="/static/4bae802167f80dcb59a36f4d3ec553ed/ecf19/activate_action.png"
+ style="display: block"
+ target="_blank"
+ rel="noopener"
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 48.734177215189874%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAAKCAIAAAA7N+mxAAAACXBIWXMAAAsTAAALEwEAmpwYAAABa0lEQVQoz3WRW08bQQyF06q0m/XO/WLP7E7Y7JKSAElJgFxUUYVKiAcU+v//TbUEoUCp9MkPto+lc9zLQAE3faalDQxkaygYyoVj0uVMAzcgLAiTFUpo7BdK2XCUsU9f+p+P8h5TyBRy6b9puud6p51xkUn/3N/XF7imriospN/TOxwLhUwivHY0MWEPF97xRlwoBOGeryJwzYcrOVwC1yA9CLcffSwGbrkJPtaWBgaTxcqlCY1uLJY2DAwlFwbC0OGJTlxIr1zpY52a8fX612yxPru8mV5tZovlZDo/ny9nV+vpYrNY3bbjmS+HylX7OHqvYbhy6GPNfNLUxHBC1MTye6ATwjaEEWFjXHKxxqoVJvzjWbivCieFvv1xTPcXaXua7ibV9rS6Gw9+n5XbMUMC3jn/wDNXmCuMyp3/HNW7ebNbtX82HU/r5nF+/HAhYmDi/2lz1f0pyy1kFoqqgyVgKe+7PDNMvn/VX0sXNj+cantmAAAAAElFTkSuQmCC'); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+ alt="Go to "Actions" tab and if not already activated, activate GitHub Actions."
+ title="Go to "Actions" tab and if not already activated, activate GitHub Actions."
+ src="/static/4bae802167f80dcb59a36f4d3ec553ed/f058b/activate_action.png"
+ srcset="/static/4bae802167f80dcb59a36f4d3ec553ed/c26ae/activate_action.png 158w,
+/static/4bae802167f80dcb59a36f4d3ec553ed/6bdcf/activate_action.png 315w,
+/static/4bae802167f80dcb59a36f4d3ec553ed/f058b/activate_action.png 630w,
+/static/4bae802167f80dcb59a36f4d3ec553ed/40601/activate_action.png 945w,
+/static/4bae802167f80dcb59a36f4d3ec553ed/ecf19/activate_action.png 948w"
+ sizes="(max-width: 630px) 100vw, 630px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </a>
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">Go to "Actions" tab and if not already activated, activate GitHub Actions.</figcaption>
+ </figure></p>
+<p><strong>3. Configure GitHub Pages branch</strong></p>
+<p>Go to “Settings”, navigate to the “Pages” setting and select <code class="language-text">gh-pages</code> as the branch your site is being built from.</p>
+<p><figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 630px; "
+ >
+ <a
+ class="gatsby-resp-image-link"
+ href="/static/ffc58a76b23bb9ae2b5cd91d14bfe5e6/302a4/set_gh_pages.png"
+ style="display: block"
+ target="_blank"
+ rel="noopener"
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 66.45569620253164%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAANCAYAAACpUE5eAAAACXBIWXMAAA7EAAAOxAGVKw4bAAACVklEQVQ4y1WTSW/bMBCF3QVxYluiFopaqN2yZDuOl6QL2qI9NdcC7Q/oX+ill/76ryC9JDk8PGIwHA7nvRlNnICJE3J14+J4ETMh7dnyRDB1Q66nnj0bNjCxq2uHKz9mZHJkikoKXr29YeR4Ci9McYMEx49xfMVURPZsOFA5YVxY9iNtY66neCMiProhv6aC7zJGx5qJGzIyF9OyQ2U1M09ZHAsfUc5XLFY75sMddbe2MdePGQcpe1fyZyb4ICST0z1bUKYV4tSlkJnlc+GprLkJGzxVEmcVSVaS6JpU1wS6Zp1XRLoiyRt01R8Llu2KxXpP02+o5is8mV46VLqlHg7Ml1vy7oBf3SOrPUG5Jyx2eOWOoDrgxS0iOHeYVMjEdFC/6M6wJzPCfInKF5bdpCdMW4SqcGWBkAVuVOIEGUYPWzCrFuR1z0RmL4oZjtIKlZYkeUtWtKR5jWvc4EXWFU94NkPzd5WULJ3gODfz3VPCfNjax4xoSdkhdcPYOMKLmT0T79jEqWDRLm3iv/GEr0IeE4PEqll1a4bNA01/Rz1s+StCHqeC10ZtEV2cccbI9RVpM7CvOh59xepkgdkpuV3cUjQ9flzwTWX88EIe/YjrSBPIzM74DDfMGHmyoFgf+Fm2PMQ5Y5OojjBqG2X1+hO/i44v1ZpZf0/YDOhibv0b6+YCP5kzMgqttu+pN/ckzWAHH8S5Vd28qpsVt4fPvOt3lP2B5fYT7WKDNnPVDRNX2u0xML8amVWzaxWm+DK7KHyGUdmY1kk7RDogks56M9b1s3V9wn+nzU0yNd9pDQAAAABJRU5ErkJggg=='); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+ alt="Set gh-pages branch"
+ title="Set gh-pages branch"
+ src="/static/ffc58a76b23bb9ae2b5cd91d14bfe5e6/f058b/set_gh_pages.png"
+ srcset="/static/ffc58a76b23bb9ae2b5cd91d14bfe5e6/c26ae/set_gh_pages.png 158w,
+/static/ffc58a76b23bb9ae2b5cd91d14bfe5e6/6bdcf/set_gh_pages.png 315w,
+/static/ffc58a76b23bb9ae2b5cd91d14bfe5e6/f058b/set_gh_pages.png 630w,
+/static/ffc58a76b23bb9ae2b5cd91d14bfe5e6/40601/set_gh_pages.png 945w,
+/static/ffc58a76b23bb9ae2b5cd91d14bfe5e6/302a4/set_gh_pages.png 1080w"
+ sizes="(max-width: 630px) 100vw, 630px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </a>
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">Set gh-pages branch</figcaption>
+ </figure></p>
+<p><strong>4. Update pages URL</strong></p>
+<p>Go back to the main page of your repo and click the little gear icon in the top right of the “About” section. Check the box at “Use your GitHub Pages website”.</p>
+<p><figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 630px; "
+ >
+ <a
+ class="gatsby-resp-image-link"
+ href="/static/b6a6a99907ce6e939473565adc2a5be3/0098c/click_gear_icon.png"
+ style="display: block"
+ target="_blank"
+ rel="noopener"
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 37.34177215189873%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAAHCAYAAAAIy204AAAACXBIWXMAAA7EAAAOxAGVKw4bAAABfUlEQVQoz12RyY7UMBiE+wBiUOIsdhYv6XbSk8nW6dBDQyM4sAmegofgiHj6DyUDFw6lkn/L9ZerdmFS4O8nxuUtp8sNV/fU3czp8R3T5UY7vsIcWh7GC+PyhuF8ZX8ccL6jmx4Z5ittf6aqe0RasoukZsV6WMVXFlGJCArCoECIcpsFcU6kDLEyGwtpCJThpbI8U5ZQ6u39JiiLiqY7c76+x/sJ921k/PWB5fdH3I+Jwjb000KiCgKRchdLGpHyPRB8DSJ+vrjDlhVxXrFLcrcJmn27faW0R/Khofo04j/PZK8btGs4TjfSaiY2A7GbUXbAu4nOjnzRPVnhEVKzc/XA4f6E9R3W99i6o2kXjseF2s8odUCVFanpSNyJxI7E9kSke0I7EdiJ59WZUDrCJGen9IHM1GTak9u/7Bqsf8D4FlnuyY1HVw26qpG5IZEFUZoTRhIRSaKVk/yplDXI/7FeqPKwZRdnBlXut0j0/n5buBpIstXRU2H/sJbyBx8NuQ6iyohjAAAAAElFTkSuQmCC'); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+ alt="Edit "About" section of the repository"
+ title="Edit "About" section of the repository"
+ src="/static/b6a6a99907ce6e939473565adc2a5be3/f058b/click_gear_icon.png"
+ srcset="/static/b6a6a99907ce6e939473565adc2a5be3/c26ae/click_gear_icon.png 158w,
+/static/b6a6a99907ce6e939473565adc2a5be3/6bdcf/click_gear_icon.png 315w,
+/static/b6a6a99907ce6e939473565adc2a5be3/f058b/click_gear_icon.png 630w,
+/static/b6a6a99907ce6e939473565adc2a5be3/0098c/click_gear_icon.png 775w"
+ sizes="(max-width: 630px) 100vw, 630px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </a>
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">Edit "About" section of the repository</figcaption>
+ </figure>
+<figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 630px; "
+ >
+ <a
+ class="gatsby-resp-image-link"
+ href="/static/34c8419d626f86c2e8a6f15ce75998a4/f0685/use_gh_pages_website.png"
+ style="display: block"
+ target="_blank"
+ rel="noopener"
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 79.11392405063292%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAAQCAYAAAAWGF8bAAAACXBIWXMAAA7EAAAOxAGVKw4bAAACIklEQVQ4y42TS2vbQBSFtWmbWNZzpJE00szIelmWFDtxcF0oJVBaKHTT/oFuuix0199/ylzbgUBiZ/ExGs3VfZyjsUIuoaoV8rJDrjsU5RJcLBBwiTBRMOePJArsSHjk8CwRZSUcP4GVFDXGzQ7Deofpdo/19gN0MyDKNHi+ABflI/Ez0PusRCobKmqFvEA77dAN97iOJN64HPMggxPmB9hrKOAwAduLYblJDVFtwbod7tWIjZoQLm7Byw3hixW8rL/AEkFxAzsQsAKuwZZ3GNMFfvgcLFZwYw2fa3ixxpwVhMPkGQq4cQnbS2C5PMfPfsIvoRDGOVxewMgQxDmtcabBUkn7c4Q8h+1GsK54iX95hX3dQ0/36MctmuEO3bhFO9wiUy14XoGlCizVL2JMJA2DrIXefISsOlTtiLIdUXUTOV2UHQUZ5j6/yMx0GMQC3WpN9lMXiaTfwI8EBThBCidILuKG6WFkU918eOrktKcgllHgJUxRj2WHhEZQo5WuV1gcR276NQJe0Biv6Y4I08PITE2Y3n9B298gkzVdO6EaqnhK+Br9KM6YEqoNqs0DmuVId/LaYcRLyZ529RRK6IseetijrJeQiyV1dm60k5u2F8GeRZi9Zbh+x2BfRZg5xhSXUWb7SQfpsxqZ1YsEhG6R5jWSXQP5bYD8OiD/3GMe84PLp+Bzep3OTMJu2qJq1lDfJ6z/fsLdnwf0v/dwsgT/Acf/mhiejKDMAAAAAElFTkSuQmCC'); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+ alt="Set URL of repository"
+ title="Set URL of repository"
+ src="/static/34c8419d626f86c2e8a6f15ce75998a4/f058b/use_gh_pages_website.png"
+ srcset="/static/34c8419d626f86c2e8a6f15ce75998a4/c26ae/use_gh_pages_website.png 158w,
+/static/34c8419d626f86c2e8a6f15ce75998a4/6bdcf/use_gh_pages_website.png 315w,
+/static/34c8419d626f86c2e8a6f15ce75998a4/f058b/use_gh_pages_website.png 630w,
+/static/34c8419d626f86c2e8a6f15ce75998a4/f0685/use_gh_pages_website.png 835w"
+ sizes="(max-width: 630px) 100vw, 630px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </a>
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">Set URL of repository</figcaption>
+ </figure></p>
+<p><strong>5. Start committing</strong></p>
+<p>Now you can add a commit to the main branch adjusting the example vocabularies or adding a new turtle file. The changes will automatically be published to your GitHub pages website that is now linked at the top-right of your GitHub repo (sometimes it takes a little to see the changes, remember to do some hard refreshing).</p>
+<p><strong>6. Set your GitHub Pages URL as namespace (optional)</strong></p>
+<p>See section “Resolving custom domains” below ⬇️</p>
+<h2>Utilizing GitHub Actions & Pages</h2>
+<p>Not all projects or individuals involved in the creation of controlled vocabularies are able or have the resources to run their own infrastructure. Thus, we have been pursuing this approach – formerly under the name of “skohub-docker-vocabs” – to utilize Docker and GitHub infrastructure for publishing SKOS vocabularies with <a href="https://github.com/skohub-io/skohub-vocabs">SkoHub Vocabs</a>. Specifically, the workflow relies on ”<a href="https://docs.github.com/de/pages/getting-started-with-github-pages">GitHub Pages</a>” and ”<a href="https://docs.github.com/en/actions">GitHub Actions</a>”. With GitHub Pages it is possible to host websites on the GitHub infrastructure, GitHub Actions are used for automated tests and deployments.</p>
+<p>We have written <a href="https://github.com/skohub-io/skohub-pages/blob/main/.github/workflows/main.yml">a GitHub Action</a> that ensures that a process is started after each push to the repository which builds the vocabularies with SkoHub Vocabs.
+The built vocabulary is then pushed to a separate git branch <code class="language-text">gh-pages</code>.
+As seen above, GitHub Pages is configured to deliver HTML pages from this <code class="language-text">gh-pages</code> branch.</p>
+<p>We have been using this approach in various introduction to SKOS and SkoHub workshops.
+However, in the past the workflow required some adjustments in the GitHub action so that errors could quickly creep in. We are happy to having improved this considerably and made the process much less error-prone! 🎉</p>
+<p>The relevant information is now set directly as environment variables and all other customizations can be changed via the GitHub GUI, so the workflow is now much more user-friendly. But that’s not all!</p>
+<h2>Resolving custom domains</h2>
+<p>Although with the presented approach the custom vocabulary could be provided without own infrastructure, the domains did not resolve to the GitHub pages.
+This means that a concept scheme that uses URIs based on the GitHub Pages domain (e.g. <code class="language-text">https://myhandle.github.io/skohub-pages/</code>) could not be resolved so far. In the past, in order to mitigate this we recommended setting up a redirect via <a href="https://w3id.org/">w3id</a> or <a href="https://purl.archive.org/">purl.org</a>.
+Of course, it still makes sense to set up a redirect (in case the vocabulary moves somewhere else). However, it is now also possible to use the domain that is assigned via GitHub Pages and have quickly set up a fully working SKOS vocabulary with resolving concept URIs which can come handy for prototyping.</p>
+<p>To do this, a <a href="https://github.com/skohub-io/skohub-pages/blob/main/config.yaml"><code class="language-text">config.yaml</code></a> must be created in the repo.
+The respective domain must then be entered under the <code class="language-text">custom_domain</code>.
+Example: Your GitHub Pages domain is <a href="https://myhandle.github.io/skohub-pages/">https://myhandle.github.io/skohub-pages/</a>. Then provide <code class="language-text">https://myhandle.github.io/skohub-pages/</code> as <code class="language-text">custom_domain</code> in your config.yaml.</p>
+<p>The base of your concept scheme could then be something like: <code class="language-text">https://myhandle.github.io/skohub-pages/myvocab/</code></p>
+<div class="gatsby-highlight" data-language="yaml"><pre class="language-yaml"><code class="language-yaml"><span class="token comment">#config.yaml</span>
+<span class="token punctuation">---</span>
+<span class="token comment"># [...]</span>
+<span class="token key atrule">custom_domain</span><span class="token punctuation">:</span> <span class="token string">"https://myhandle.github.io/skohub-pages/"</span>
+<span class="token comment">#[...]</span></code></pre></div>
+<div class="gatsby-highlight" data-language="turtle"><pre class="language-turtle"><code class="language-turtle"><span class="token comment"># colors.ttl</span>
+<span class="token keyword">@prefix</span> <span class="token function"><span class="token prefix">colour<span class="token punctuation">:</span></span></span> <span class="token url"><span class="token punctuation"><</span>https://myhandle.github.io/skohub-pages/myColourVocab/<span class="token punctuation">></span></span> <span class="token punctuation">.</span>
+<span class="token keyword">@prefix</span> <span class="token function"><span class="token prefix">dct<span class="token punctuation">:</span></span></span> <span class="token url"><span class="token punctuation"><</span>http://purl.org/dc/terms/<span class="token punctuation">></span></span> <span class="token punctuation">.</span>
+<span class="token keyword">@prefix</span> <span class="token function"><span class="token prefix">skos<span class="token punctuation">:</span></span></span> <span class="token url"><span class="token punctuation"><</span>http://www.w3.org/2004/02/skos/core#<span class="token punctuation">></span></span> <span class="token punctuation">.</span>
+<span class="token keyword">@prefix</span> <span class="token function"><span class="token prefix">xsd<span class="token punctuation">:</span></span></span> <span class="token url"><span class="token punctuation"><</span>http://www.w3.org/2001/XMLSchema#<span class="token punctuation">></span></span> <span class="token punctuation">.</span>
+
+<span class="token function"><span class="token prefix">colour<span class="token punctuation">:</span></span></span> <span class="token keyword">a</span> <span class="token function"><span class="token prefix">skos<span class="token punctuation">:</span></span><span class="token local-name">ConceptScheme</span></span> <span class="token punctuation">;</span>
+ <span class="token function"><span class="token prefix">dct<span class="token punctuation">:</span></span><span class="token local-name">title</span></span> <span class="token string">"Colour Vocabulary"</span><span class="token tag"><span class="token punctuation">@</span>en</span><span class="token punctuation">,</span> <span class="token string">"Farbvokabular"</span><span class="token tag"><span class="token punctuation">@</span>de</span> <span class="token punctuation">;</span>
+ <span class="token function"><span class="token prefix">dct<span class="token punctuation">:</span></span><span class="token local-name">creator</span></span> <span class="token string">"Hans Dampf"</span><span class="token tag"><span class="token punctuation">@</span>de</span> <span class="token punctuation">;</span>
+ <span class="token function"><span class="token prefix">dct<span class="token punctuation">:</span></span><span class="token local-name">created</span></span> <span class="token string">"2021-11-02"</span><span class="token punctuation">^^</span><span class="token function"><span class="token prefix">xsd<span class="token punctuation">:</span></span><span class="token local-name">date</span></span> <span class="token punctuation">;</span>
+ <span class="token function"><span class="token prefix">dct<span class="token punctuation">:</span></span><span class="token local-name">license</span></span> <span class="token url"><span class="token punctuation"><</span>https://creativecommons.org/publicdomain/zero/1.0/<span class="token punctuation">></span></span> <span class="token punctuation">;</span>
+ <span class="token function"><span class="token prefix">skos<span class="token punctuation">:</span></span><span class="token local-name">hasTopConcept</span></span> <span class="token function"><span class="token prefix">colour<span class="token punctuation">:</span></span><span class="token local-name">violet</span></span><span class="token punctuation">,</span> <span class="token function"><span class="token prefix">colour<span class="token punctuation">:</span></span><span class="token local-name">blue</span></span> <span class="token punctuation">.</span>
+
+<span class="token function"><span class="token prefix">colour<span class="token punctuation">:</span></span><span class="token local-name">violet</span></span> <span class="token keyword">a</span> <span class="token function"><span class="token prefix">skos<span class="token punctuation">:</span></span><span class="token local-name">Concept</span></span> <span class="token punctuation">;</span>
+ <span class="token function"><span class="token prefix">skos<span class="token punctuation">:</span></span><span class="token local-name">prefLabel</span></span> <span class="token string">"Violett"</span><span class="token tag"><span class="token punctuation">@</span>de</span><span class="token punctuation">,</span> <span class="token string">"violet"</span><span class="token tag"><span class="token punctuation">@</span>en</span><span class="token punctuation">;</span>
+ <span class="token function"><span class="token prefix">skos<span class="token punctuation">:</span></span><span class="token local-name">altLabel</span></span> <span class="token string">"Lila"</span><span class="token tag"><span class="token punctuation">@</span>de</span><span class="token punctuation">,</span> <span class="token string">"purple"</span><span class="token tag"><span class="token punctuation">@</span>en</span> <span class="token punctuation">;</span>
+ <span class="token function"><span class="token prefix">skos<span class="token punctuation">:</span></span><span class="token local-name">topConceptOf</span></span> <span class="token function"><span class="token prefix">colour<span class="token punctuation">:</span></span></span> <span class="token punctuation">.</span>
+
+<span class="token function"><span class="token prefix">colour<span class="token punctuation">:</span></span><span class="token local-name">blue</span></span> <span class="token keyword">a</span> <span class="token function"><span class="token prefix">skos<span class="token punctuation">:</span></span><span class="token local-name">Concept</span></span> <span class="token punctuation">;</span>
+ <span class="token function"><span class="token prefix">skos<span class="token punctuation">:</span></span><span class="token local-name">prefLabel</span></span> <span class="token string">"Blau"</span><span class="token tag"><span class="token punctuation">@</span>de</span><span class="token punctuation">,</span> <span class="token string">"blue"</span><span class="token tag"><span class="token punctuation">@</span>en</span> <span class="token punctuation">;</span>
+ <span class="token function"><span class="token prefix">skos<span class="token punctuation">:</span></span><span class="token local-name">topConceptOf</span></span> <span class="token function"><span class="token prefix">colour<span class="token punctuation">:</span></span></span> <span class="token punctuation">.</span></code></pre></div>
+<p>Feel free to try out our simplified approach and let us know if something does not work: <a href="https://github.com/skohub-io/skohub-pages/issues/new">https://github.com/skohub-io/skohub-pages/issues/new</a></p>
+<h2>Outlook: Beyond GitHub</h2>
+<p>There are lots of reasons why people might not want to use the GitHub infrastructure owned by Microsoft for their SKOS publication workflows. That’s why we will be looking into replacing as much as possible of this workflow by generic git-based tooling for triggering the build. The goal is to support such an easy SKOS publishing workflow on other forges like GitLab or Forgejo. The work on this happens around this issue: <a href="https://github.com/skohub-io/skohub-pages/issues/19">https://github.com/skohub-io/skohub-pages/issues/19</a></p>
+<p>Let us know if you have some good implementation ideas or more wishes for future development!</p>https://blog.skohub.io/2024-01-24-uris-without-language-tags/https://blog.skohub.io/2024-01-24-uris-without-language-tags/Wed, 31 Jan 2024 00:00:00 GMT<h2>In the past: Internationalization with drawbacks</h2>
+<p>If you have worked with SkoHub Vocabs before, you might have noticed that the URLs in the address bar had a little special feature that you don’t encounter very often, a language tag before the <code class="language-text">.html</code>:</p>
+<p><code class="language-text">https://skohub.io/dini-ag-kim/hochschulfaechersystematik/heads/master/w3id.org/kim/hochschulfaechersystematik/scheme.de.html</code></p>
+<p>Why did we need this?</p>
+<p>We wanted Internationalization features to be able to navigate multiple languages.
+Normally this is done via a subdomain or adding a language tag behind the domain name like <code class="language-text">https://w3id.org/kim/hochschulfaechersystematik/en/</code>.
+But this does not work for SkoHub Vocabs since the we use the URIs from the turtle files as IDs for the concept.
+Changing the URI by adding a language tag somewhere would break the whole concept of SkoHub Vocabs.</p>
+<p>So it was decided to add the language at the end of the URL by using Apache Multiviews features.
+But this lead to some drawbacks:</p>
+<ul>
+<li>SkoHub Vocabs needed to be served by an Apache Webserver</li>
+<li>The webserver needed special configuration</li>
+<li><a href="https://github.com/skohub-io/skohub-docker-vocabs">SkoHub Docker Vocabs</a>, which is served via GitHub Pages, always needed a specific link to an index.{language}.html file, since GitHub Pages only looks for an <code class="language-text">index.html</code></li>
+<li>The build directory grew quite a bit, since there were dedicated html pages built for every language</li>
+</ul>
+<h2>Switching to one page for all languages</h2>
+<p>In order to overcome these issues we decided to change this behaviour and just build one html page with a functionality to switch languages. The shown language is now chosen the following way:</p>
+<ul>
+<li>by using your browser language</li>
+<li>if you switched languages in the application the chosen language is taken</li>
+<li>if a language is not present, a default language present in the vocabulary is used</li>
+</ul>
+<p>To point users to a specific language, you can use a query parameter <code class="language-text">lang=</code> like:</p>
+<p><code class="language-text">https://w3id.org/kim/hcrt/scheme?lang=uk</code></p>
+<p>Since SkoHub Vocabs also used the language tag of the URL internally to determine which language to serve a lot of changes had to be done in the codebase.
+But overall this resulted in a much reduced size of the built vocabularies and more flexibility on serving the vocabularies.</p>
+<h2>Benefits of the new approach</h2>
+<p>This new internationalization approach brings lots of improvements:</p>
+<ul>
+<li>SkoHub Vocabs is now independent from the underlying webserver</li>
+<li>The size of the vocabularies is drastically reduced, especially for vocabularies with lots of languages</li>
+<li><a href="https://github.com/skohub-io/skohub-docker-vocabs">SkoHub Docker Vocabs</a> is now simpler to setup since we only have “normal” <code class="language-text">index.html</code> files that it knows how to handle</li>
+</ul>
+<h2>What to do if I’m running my own webhook server?</h2>
+<p>If you are running your own webhook server, you should upgrade as follows:</p>
+<ul>
+<li>Follow the steps outlined in the webhook repository to <a href="https://github.com/skohub-io/skohub-webhook#rebuilding-vocabularies">rebuild vocabularies</a>. This will rebuild all still existing branches you are currently serving.</li>
+<li>Set up a redirect in your Apache config, so that links that still use <code class="language-text">...de.html</code> are redirected to <code class="language-text">...html?lang=de</code>:</li>
+</ul>
+<div class="gatsby-highlight" data-language="text"><pre class="language-text"><code class="language-text"># Redirect from ...filename.LANGCODE.html to ...filename.html?lang=LANGCODE
+ RewriteRule ^(.+)\.([a-z]{2})\.html$ $1.html?lang=$2 [L,R=301]</code></pre></div>
+<ul>
+<li>After that you should be good!</li>
+</ul>
+<h2>Anything else?</h2>
+<p>While developing the script to rebuild all existing vocabularies, I noticed that we are serving a lot of branches that no longer exist.
+SkoHub Webhook currently builds a vocabulary for every branch you set up and push to.
+However, the webhook service does not get notified when a branch is deleted.
+This way we end up with lots of files for branches that no one needs anymore.
+In order to clean this up a bit, we will soon add a script that tidies up the dist directory and removes those no longer needed files.</p>
+<p><a href="https://blog.skohub.io/2024-01-18-reconcile/">https://blog.skohub.io/2024-01-18-reconcile/</a> (Mon, 22 Jan 2024)</p>
+<p>Reconciliation is the process of integrating data from sources which do not share common unique identifiers by identifying records that refer to the same entities.
+This happens mostly by comparing the attributes of the entities.
+For instance, two entries in a catalogue of persons that share the same date of birth, place of birth, name and date of death will probably be about the same person.
+Linking these two entries by adding the identifier from the other data source is the process of reconciliation. This allows you to extend your data by taking over information from the linked record.</p>
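+<p>As a toy illustration of this attribute-based matching, the following sketch compares two person records and takes over the identifier if enough attributes agree. The attribute names, the records and the threshold are invented for this example:</p>
+<div class="gatsby-highlight" data-language="js"><pre class="language-js"><code class="language-js">// Toy sketch of attribute-based matching; data and threshold are invented.
+function matchScore(a, b) {
+  const attributes = ["name", "dateOfBirth", "placeOfBirth", "dateOfDeath"]
+  const matches = attributes.filter((attr) => a[attr] && a[attr] === b[attr])
+  return matches.length / attributes.length
+}
+
+const localRecord = { name: "Ada Lovelace", dateOfBirth: "1815-12-10", placeOfBirth: "London" }
+const authorityRecord = {
+  id: "https://example.org/persons/123",
+  name: "Ada Lovelace",
+  dateOfBirth: "1815-12-10",
+  placeOfBirth: "London",
+}
+
+if (matchScore(localRecord, authorityRecord) >= 0.5) {
+  // reconciliation: take over the identifier from the matched record
+  localRecord.sameAs = authorityRecord.id
+}</code></pre></div>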
+<p>To facilitate this process, multiple tools exist, with <a href="https://openrefine.org/">OpenRefine</a> being the most prominent.
+To align and standardize the way data is provided to these tools, the <a href="https://reconciliation-api.github.io/specs/draft/">Reconciliation Service API</a> is being drafted by the <a href="https://www.w3.org/community/reconciliation/">Entity Reconciliation Community Group</a> within the World Wide Web Consortium (W3C).
+The specification defines endpoints that data services can expose so that applications like OpenRefine can handle that data.
+A number of other services have already implemented the specification, like <a href="https://teipublisher.com/">TEI Publisher</a>, <a href="https://coli-conc.gbv.de/cocoda/">Cocoda</a> or the <a href="https://developers.exlibrisgroup.com/appcenter/alma-refine/">Alma Refine plugin</a> for the commercial Library Management System Alma.</p>
+<h2>Reconciliation and SKOS</h2>
+<p><a href="https://www.w3.org/TR/skos-reference/">Simple Knowledge Organization System (SKOS)</a> is an established standard for modeling controlled vocabularies as Linked Data. Thus, SKOS vocabularies are often targets of reconciliation efforts, as you can improve your local data by enriching strings with identifiers from a controlled vocabulary. So SKOS and the Reconciliation Service API often go hand in hand. However, there has been no easy way to set up a reconciliation endpoint for an existing SKOS vocabulary. We decided to change that by developing the new SkoHub component <a href="https://github.com/skohub-io/skohub-reconcile">SkoHub-Reconcile</a>.</p>
+<p>Andreas Wagner had already built a reconciliation prototype for SKOS vocabularies (see also our <a href="https://blog.skohub.io/2022-12-19-workshop-summary/">Workshop Blog Post</a>).
+We picked this prototype up, refactored it and moved it into a container-based infrastructure.
+We also added support for <a href="https://www.w3.org/community/reports/reconciliation/CG-FINAL-specs-0.2-20230410/">v0.2 of the reconciliation spec</a>.</p>
+<h2>SkoHub Reconcile Publish</h2>
+<p>To make it easy to upload vocabularies to the reconciliation service, we developed a front-end, which you can try out at <a href="https://reconcile-publish.skohub.io/">https://reconcile-publish.skohub.io/</a>.</p>
+<p><figure class="gatsby-resp-image-figure">
+  <img src="/static/d272074553e0a809343b4bf35fa04f09/f058b/reconcile-publish.png" alt="The reconcile-publish upload UI" title="The reconcile-publish upload UI" />
+  <figcaption class="gatsby-resp-image-figcaption">The reconcile-publish upload UI</figcaption>
+</figure></p>
+<p>Every vocabulary that passes the SkoHub SHACL Shape (see our <a href="https://blog.skohub.io/2023-11-22-shacl-shape/">blog post</a>) should work for uploading to the reconcile service.
+The only additional requirement is to provide a <code class="language-text">vann:preferredNamespaceUri</code>.
+As you can see in the screenshot, you also have to provide an account and a language.
+As for the account, you can currently choose whatever you want; just make sure it is unique enough so that your dataset (i.e. your vocabulary) does not get overwritten by someone else.
+Since a <code class="language-text">lang</code> parameter was only introduced in the <a href="https://reconciliation-api.github.io/specs/draft/#service-manifest">current draft</a> of the reconciliation specification and is not yet implemented in SkoHub Reconcile, the current version of the service requires you to specify the language you want to use for reconciliation. We will improve this in the future along with the development of the specification.</p>
+<h2>Example: Usage in OpenRefine</h2>
+<p>Let’s see how we can use the service with OpenRefine.</p>
+<p>First, we upload the vocabulary.
+We will use a classification of <a href="https://w3id.org/kim/hochschulfaechersystematik/scheme">subject groups</a>.</p>
+<p><figure class="gatsby-resp-image-figure">
+  <img src="/static/e5af98f7ca69f0c23efe94e8e10c44c3/f058b/upload.png" alt="The filled-in upload form with 'test' as account name, 'de' as language and 'systematik.ttl' as the file to be uploaded" title="The filled-in upload form with 'test' as account name, 'de' as language and 'systematik.ttl' as the file to be uploaded" />
+  <figcaption class="gatsby-resp-image-figcaption">The filled-in upload form with "test" as account name, "de" as language and "systematik.ttl" as the file to be uploaded</figcaption>
+</figure></p>
+<p>After a successful upload of the <a href="https://raw.githubusercontent.com/dini-ag-kim/hochschulfaechersystematik/master/hochschulfaechersystematik.ttl">Turtle file</a>, we are presented with a URL that leads to the <a href="https://reconciliation-api.github.io/specs/draft/#service-manifest">“Service Manifest”</a> of our reconciliation service.</p>
+<p><figure class="gatsby-resp-image-figure">
+  <img src="/static/4972457dc61a693775b415a5d0d30d8b/f058b/upload-success.png" alt="The URL of the Service Manifest being returned by the upload UI" title="The URL of the Service Manifest being returned by the upload UI" />
+  <figcaption class="gatsby-resp-image-figcaption">The URL of the Service Manifest being returned by the upload UI</figcaption>
+</figure></p>
+<p>If we follow the URL <a href="https://reconcile.skohub.io/reconcile?language=de&account=test&dataset=https://w3id.org/kim/hochschulfaechersystematik/scheme">https://reconcile.skohub.io/reconcile?language=de&account=test&dataset=https://w3id.org/kim/hochschulfaechersystematik/scheme</a> we see some data that services will use for reconciliation against our vocabulary:</p>
+<div class="gatsby-highlight" data-language="json"><pre class="language-json"><code class="language-json"><span class="token punctuation">{</span>
+ <span class="token property">"versions"</span><span class="token operator">:</span> <span class="token punctuation">[</span>
+ <span class="token string">"0.2"</span><span class="token punctuation">,</span>
+ <span class="token string">"0.3.0-alpha"</span>
+ <span class="token punctuation">]</span><span class="token punctuation">,</span>
+ <span class="token property">"name"</span><span class="token operator">:</span> <span class="token string">"SkoHub reconciliation service for account 'test', dataset 'https://w3id.org/kim/hochschulfaechersystematik/scheme'"</span><span class="token punctuation">,</span>
+ <span class="token property">"identifierSpace"</span><span class="token operator">:</span> <span class="token string">"https://w3id.org/kim/hochschulfaechersystematik/"</span><span class="token punctuation">,</span>
+ <span class="token property">"schemaSpace"</span><span class="token operator">:</span> <span class="token string">"http://www.w3.org/2004/02/skos/core#"</span><span class="token punctuation">,</span>
+ <span class="token property">"defaultTypes"</span><span class="token operator">:</span> <span class="token punctuation">[</span>
+ <span class="token punctuation">{</span>
+ <span class="token property">"id"</span><span class="token operator">:</span> <span class="token string">"ConceptScheme"</span><span class="token punctuation">,</span>
+ <span class="token property">"name"</span><span class="token operator">:</span> <span class="token string">"ConceptScheme"</span>
+ <span class="token punctuation">}</span><span class="token punctuation">,</span>
+ <span class="token punctuation">{</span>
+ <span class="token property">"id"</span><span class="token operator">:</span> <span class="token string">"Concept"</span><span class="token punctuation">,</span>
+ <span class="token property">"name"</span><span class="token operator">:</span> <span class="token string">"Concept"</span>
+ <span class="token punctuation">}</span>
+ <span class="token punctuation">]</span><span class="token punctuation">,</span>
+ <span class="token property">"view"</span><span class="token operator">:</span> <span class="token punctuation">{</span>
+ <span class="token property">"url"</span><span class="token operator">:</span> <span class="token string">"{{id}}"</span>
+ <span class="token punctuation">}</span><span class="token punctuation">,</span>
+ <span class="token property">"preview"</span><span class="token operator">:</span> <span class="token punctuation">{</span>
+ <span class="token property">"url"</span><span class="token operator">:</span> <span class="token string">"https://reconcile.skohub.io/preview?language=de&account=test&dataset=https://w3id.org/kim/hochschulfaechersystematik/scheme&id={{id}}"</span><span class="token punctuation">,</span>
+ <span class="token property">"width"</span><span class="token operator">:</span> <span class="token number">100</span><span class="token punctuation">,</span>
+ <span class="token property">"height"</span><span class="token operator">:</span> <span class="token number">320</span>
+ <span class="token punctuation">}</span><span class="token punctuation">,</span>
+ <span class="token property">"suggest"</span><span class="token operator">:</span> <span class="token punctuation">{</span>
+ <span class="token property">"entity"</span><span class="token operator">:</span> <span class="token punctuation">{</span>
+ <span class="token property">"service_url"</span><span class="token operator">:</span> <span class="token string">"https://reconcile.skohub.io"</span><span class="token punctuation">,</span>
+ <span class="token property">"service_path"</span><span class="token operator">:</span> <span class="token string">"/suggest?language=de&account=test&dataset=https://w3id.org/kim/hochschulfaechersystematik/scheme&service=entity"</span><span class="token punctuation">,</span>
+ <span class="token property">"flyout_service_path"</span><span class="token operator">:</span> <span class="token string">"/suggest/flyout?language=de&account=test&dataset=https://w3id.org/kim/hochschulfaechersystematik/scheme&id=${id}"</span>
+ <span class="token punctuation">}</span><span class="token punctuation">,</span>
+ <span class="token property">"property"</span><span class="token operator">:</span> <span class="token punctuation">{</span>
+ <span class="token property">"service_url"</span><span class="token operator">:</span> <span class="token string">"https://reconcile.skohub.io"</span><span class="token punctuation">,</span>
+ <span class="token property">"service_path"</span><span class="token operator">:</span> <span class="token string">"/suggest?language=de&account=test&dataset=https://w3id.org/kim/hochschulfaechersystematik/scheme&service=property"</span><span class="token punctuation">,</span>
+ <span class="token property">"flyout_service_path"</span><span class="token operator">:</span> <span class="token string">"/suggest/flyout?language=de&account=test&dataset=https://w3id.org/kim/hochschulfaechersystematik/scheme&id=${id}"</span>
+ <span class="token punctuation">}</span><span class="token punctuation">,</span>
+ <span class="token property">"type"</span><span class="token operator">:</span> <span class="token punctuation">{</span>
+ <span class="token property">"service_url"</span><span class="token operator">:</span> <span class="token string">"https://reconcile.skohub.io"</span><span class="token punctuation">,</span>
+ <span class="token property">"service_path"</span><span class="token operator">:</span> <span class="token string">"/suggest?language=de&account=test&dataset=https://w3id.org/kim/hochschulfaechersystematik/scheme&service=property"</span><span class="token punctuation">,</span>
+ <span class="token property">"flyout_service_path"</span><span class="token operator">:</span> <span class="token string">"/suggest/flyout&language=de&account=test&dataset=https://w3id.org/kim/hochschulfaechersystematik/scheme&id=${id}"</span>
+ <span class="token punctuation">}</span>
+ <span class="token punctuation">}</span>
+<span class="token punctuation">}</span></code></pre></div>
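+<p>You do not have to use OpenRefine to talk to this endpoint. The following sketch sends a reconciliation query directly, using the form-encoded <code class="language-text">queries</code> parameter defined by the Reconciliation Service API; the exact shape of the response may differ between specification versions:</p>
+<div class="gatsby-highlight" data-language="js"><pre class="language-js"><code class="language-js">// Sketch: querying the reconciliation endpoint directly (outside OpenRefine).
+// The endpoint is the service manifest URL from above; response handling is simplified.
+const endpoint =
+  "https://reconcile.skohub.io/reconcile" +
+  "?language=de&account=test&dataset=https://w3id.org/kim/hochschulfaechersystematik/scheme"
+
+const queries = { q0: { query: "mathe" } }
+
+fetch(endpoint, {
+  method: "POST",
+  headers: { "Content-Type": "application/x-www-form-urlencoded" },
+  body: new URLSearchParams({ queries: JSON.stringify(queries) }),
+})
+  .then((res) => res.json())
+  .then((data) => {
+    // each candidate should carry an id (the concept URI), a name and a score
+    console.log(data.q0.result)
+  })</code></pre></div>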
+<p>Now that the reconciliation service is set up with our data, let’s see how we can use it in OpenRefine.</p>
+<p>For demo purposes we use a small vocabulary of a few discipline names:</p>
+<p><figure class="gatsby-resp-image-figure">
+  <img src="/static/8ee81921b4fd54654268bbb82e704459/0acb4/or1.png" alt="An OpenRefine table with the values 'mathe', 'bibliothek', 'forst'" title="An OpenRefine table with the values 'mathe', 'bibliothek', 'forst'" />
+  <figcaption class="gatsby-resp-image-figcaption">An OpenRefine table with the values "mathe", "bibliothek", "forst"</figcaption>
+</figure></p>
+<p>By clicking on the dropdown button of the column we want to reconcile, we choose “Reconcile” -> “Start reconciling…”.</p>
+<p><figure class="gatsby-resp-image-figure">
+  <img src="/static/1ba989cdd764d0e20d0886cbee808596/f058b/or2.png" alt="OpenRefine dropdown menu to start a reconciliation process" title="OpenRefine dropdown menu to start a reconciliation process" />
+  <figcaption class="gatsby-resp-image-figcaption">OpenRefine dropdown menu to start a reconciliation process</figcaption>
+</figure></p>
+<p>After clicking “Add standard service”, we can enter the URL we were provided by the upload service:</p>
+<p><figure class="gatsby-resp-image-figure">
+  <img src="/static/83b34368b33ec5c1c18d5da66e80c322/f058b/or3.png" alt="Adding the Service Manifest URL in OpenRefine" title="Adding the Service Manifest URL in OpenRefine" />
+  <figcaption class="gatsby-resp-image-figcaption">Adding the Service Manifest URL in OpenRefine</figcaption>
+</figure></p>
+<p>Then we just have to start the reconciliation by clicking “Start reconciling…” and our reconciliation service will be queried with the terms in our OpenRefine project.
+We are then presented with the results:</p>
+<p><figure class="gatsby-resp-image-figure">
+  <img src="/static/d4191f055b104445a4c283e09a9ebec8/f058b/or4.png" alt="OpenRefine UI with lists of reconciliation candidates" title="OpenRefine UI with lists of reconciliation candidates" />
+  <figcaption class="gatsby-resp-image-figcaption">OpenRefine UI with lists of reconciliation candidates</figcaption>
+</figure></p>
+<p>This already looks good!
+Now we can choose matches by clicking the checkmark or get additional information by hovering over the proposed entry from the reconcile service.</p>
+<p><figure class="gatsby-resp-image-figure">
+  <img src="/static/3400dd15c8f9dd3db28fa4ee867fedff/f058b/or5.png" alt="Interactive OpenRefine pop-up to define a match" title="Interactive OpenRefine pop-up to define a match" />
+  <figcaption class="gatsby-resp-image-figcaption">Interactive OpenRefine pop-up to define a match</figcaption>
+</figure></p>
+<p>If we want we can also search through our vocabulary by clicking “Search for match”:</p>
+<p><figure class="gatsby-resp-image-figure">
+  <img src="/static/cd12abcd7fdd02842fd5225eab7ceaf2/f058b/or6.png" alt="Searching for a vocabulary term in OpenRefine" title="Searching for a vocabulary term in OpenRefine" />
+  <figcaption class="gatsby-resp-image-figcaption">Searching for a vocabulary term in OpenRefine</figcaption>
+</figure></p>
+<p>After selecting the appropriate matches, we have successfully reconciled our data:</p>
+<p><figure class="gatsby-resp-image-figure">
+  <img src="/static/bebfcfbdf5c6b5e0dcff2da7d6a615d4/9f933/or7.png" alt="Matched values in OpenRefine column" title="Matched values in OpenRefine column" />
+  <figcaption class="gatsby-resp-image-figcaption">Matched values in OpenRefine column</figcaption>
+</figure></p>
+<h2>Further reading</h2>
+<p>Christel Annemieke Romein, Andreas Wagner and Joris J. van Zundert published a tutorial about <a href="https://doi.org/10.21825/dlh.85751">building and deploying a classification schema using open standards and technology</a>.
+In this tutorial they make use of SKOS, SkoHub Vocabs and SkoHub Reconcile.
+We recommend having a look at it to see the SkoHub services in action.</p>
+<h2>Next steps</h2>
+<p>The services are currently in an <code class="language-text">alpha</code> phase and ready for testing.
+You can test the service under <a href="https://reconcile-publish.skohub.io/">https://reconcile-publish.skohub.io/</a>.</p>
+<p>Feedback is very much appreciated: via email (<a href="mailto:skohub@hbz-nrw.de">skohub@hbz-nrw.de</a>), as an <a href="https://github.com/skohub-io/skohub-reconcile/issues">issue</a> or – primarily for the German-speaking users – in the newly set up discourse forum <a href="https://metadaten.community">metadaten.community</a>.</p>
+<p>Our next step will be integrating the above-mentioned <code class="language-text">lang</code> parameter to be able to serve all languages of a vocabulary without having to specify one beforehand.</p>
+<h2>Repositories</h2>
+<ul>
+<li><a href="https://github.com/skohub-io/skohub-reconcile/">SkoHub Reconcile</a></li>
+<li><a href="https://github.com/skohub-io/skohub-reconcile-publish/">SkoHub Reconcile Publish</a></li>
+</ul>
+<p><a href="https://blog.skohub.io/2023-11-22-shacl-shape/">https://blog.skohub.io/2023-11-22-shacl-shape/</a> (Wed, 22 Nov 2023)</p>
+<p>To improve the error messages thrown by SkoHub Vocabs for invalid RDF Turtle files, we decided to implement a validation step before the static site of a vocabulary gets built with the <a href="https://www.gatsbyjs.com/">Gatsby framework</a>.
+This validation step should provide more meaningful error messages than the currently cryptic ones thrown by Gatsby.
+While we could have gone with one of the existing SKOS validator tools like <a href="https://skos-play.sparna.fr/play/">SKOS Play!</a>, <a href="http://www.w3.org/2001/sw/wiki/Skosify">SKOSify</a> or <a href="https://www.poolparty.biz/skos-and-skos-xl">Poolparty</a> (which is pretty expensive), we decided to go with a more generic approach and define the shape rules not in code, but in data.</p>
+<figure class="gatsby-resp-image-figure">
+  <img src="/static/3180b8faaa30763f57cf0aeda0f8a2e3/6f3f2/shacl-logo.png" alt="SHACL logo" title="SHACL logo" />
+  <figcaption class="gatsby-resp-image-figcaption">SHACL logo</figcaption>
+</figure>
+<p>If you want to validate the shape of an RDF graph, you currently have two options to do that. You can either use <a href="https://shex.io/shex-primer/index.html">Shape Expressions (ShEx)</a> or the <a href="https://www.w3.org/TR/shacl/">Shapes Constraint Language (SHACL)</a>.
+We decided to go with SHACL for the following reasons:</p>
+<ul>
+<li>slightly better tooling (e.g. the Zazuko team provides a JS validation library, <a href="https://github.com/zazuko/rdf-validate-shacl">rdf-validate-shacl</a>)</li>
+<li>there were some existing SKOS-XL SHACL definitions published by the Publications Office of the European Union: <a href="https://op.europa.eu/en/web/eu-vocabularies/application-profiles">https://op.europa.eu/en/web/eu-vocabularies/application-profiles</a></li>
+</ul>
+<p>Unfortunately the SKOS-XL shape did not work with our tooling (<a href="https://jena.apache.org/documentation/shacl/">Apache Jena SHACL</a>) out of the box.
+Therefore we decided to build a SKOS shape from the ground up based on the <a href="https://www.w3.org/TR/skos-reference/">SKOS Reference</a>.</p>
+<h2>SKOS Reference Shape</h2>
+<p>The goal was to implement every consistency example from the SKOS Reference as a test case for the shape.
+To accomplish this, we needed, on the one hand, to formalize the class and property definitions from the spec as well as the integrity conditions.
+On the other hand, we needed a triple store with reasoning capabilities to apply these rules to the very basic examples in the reference.
+We used the Apache Jena tooling for this and built <a href="https://github.com/skohub-io/jena-docker">jena-docker containers</a> based on the Docker containers of <a href="https://github.com/stain/jena-docker">this repo</a>.
+The SKOS class and property definitions are defined in <a href="https://github.com/skohub-io/shapes/blob/main/skosClassAndPropertyDefinitions.ttl">this file</a>.
+The workflow for validating the SKOS shape is as follows:</p>
+<ol>
+<li>The Fuseki container is started with the <a href="https://github.com/skohub-io/shapes/blob/main/fuseki/config_inference.ttl">inference configuration</a>.</li>
+<li>The class and property definitions are appended to the <code class="language-text">skos.shacl.ttl</code> file, which is mounted in the container.</li>
+<li>The respective example is mounted in the container as a Turtle file (e.g. <a href="https://github.com/skohub-io/shapes/blob/main/tests/valid/skos.shacl.ttl/ex05.ttl">example05</a>).</li>
+<li>The validation result (which is itself an RDF graph) is temporarily stored.</li>
+<li>The validation result is <a href="https://github.com/skohub-io/shapes/blob/main/scripts/checkForViolation.rq">queried with SPARQL for errors</a>.</li>
+<li>Based on the query result, an error message is emitted.</li>
+</ol>
+<p>This way, we managed to validate all valid examples from the SKOS Reference up to <a href="https://www.w3.org/TR/skos-reference/#example-68">example 68</a> as valid and all the invalid examples as invalid (entailment vs. non-entailment examples were left out).</p>
+<h2>SkoHub Shape</h2>
+<p>In SkoHub Vocabs we are a bit stricter regarding some aspects of the SKOS reference.
+For example, we want every <code class="language-text">skos:Concept</code> to have at least one <code class="language-text">skos:prefLabel</code>.
+Therefore, we developed a SkoHub-specific shape, <a href="https://github.com/skohub-io/shapes/blob/main/skohub.shacl.ttl"><code class="language-text">skohub.shacl.ttl</code></a>.
+In contrast to the generic SKOS shape, this shape does <strong>not</strong> contain any <a href="https://www.w3.org/TR/shacl/#sparql-constraints">SPARQL-based SHACL constraints</a>.
+Though it is possible, and especially for more elaborate checks useful, to express constraints in SPARQL, the available tools (at least the JavaScript library <a href="https://github.com/zazuko/rdf-validate-shacl">rdf-validate-shacl</a>) do not support such constraints.</p>
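+<p>For illustration, this is roughly how a vocabulary can be checked against the SkoHub shape with <a href="https://github.com/zazuko/rdf-validate-shacl">rdf-validate-shacl</a>. The file names are placeholders and the snippet follows the library’s documented usage, so details may differ from the code actually used in SkoHub Vocabs:</p>
+<div class="gatsby-highlight" data-language="js"><pre class="language-js"><code class="language-js">// Sketch of validating a vocabulary against the SkoHub shape with rdf-validate-shacl.
+// File names are placeholders; error handling is omitted.
+const fs = require("fs")
+const factory = require("rdf-ext")
+const ParserN3 = require("@rdfjs/parser-n3")
+const SHACLValidator = require("rdf-validate-shacl")
+
+async function loadDataset(filePath) {
+  const parser = new ParserN3({ factory })
+  return factory.dataset().import(parser.import(fs.createReadStream(filePath)))
+}
+
+async function main() {
+  const shapes = await loadDataset("skohub.shacl.ttl") // the SkoHub shape
+  const data = await loadDataset("my-vocabulary.ttl")  // the vocabulary to check
+  const validator = new SHACLValidator(shapes, { factory })
+  const report = await validator.validate(data)
+
+  console.log("conforms:", report.conforms)
+  for (const result of report.results) {
+    // severity is sh:Violation, sh:Warning or sh:Info
+    console.log(result.severity.value, result.message)
+  }
+}
+
+main()</code></pre></div>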
+<p>As a result, validation errors and warnings help SkoHub users improve the quality of their vocabularies. See, for example, the validation warning when no license is provided:</p>
+<div class="gatsby-highlight" data-language="text"><pre class="language-text"><code class="language-text">-----------Warning--------------
+Message: [
+ Literal {
+ value: 'A provided license increases reusability of a vocabulary. Should be an URI.',
+ language: '',
+ datatype: NamedNode { value: 'http://www.w3.org/2001/XMLSchema#string' }
+ }
+]
+Path: http://purl.org/dc/terms/license
+Node, where the error occured: http://w3id.org/example-cs/
+Severity of error: http://www.w3.org/ns/shacl#Warning</code></pre></div>
+<p>Or the validation error if the object of <code class="language-text">skos:hasTopConcept</code> is not a <code class="language-text">skos:Concept</code>:</p>
+<div class="gatsby-highlight" data-language="text"><pre class="language-text"><code class="language-text">-----------Violation--------------
+Message: [
+ Literal {
+ value: 'The target class for hasTopConcept should be skos:Concept',
+ language: '',
+ datatype: NamedNode { value: 'http://www.w3.org/2001/XMLSchema#string' }
+ }
+]
+Path: http://www.w3.org/2004/02/skos/core#hasTopConcept
+Node, where the error occured: http://w3id.org/example-cs/
+Severity of error: http://www.w3.org/ns/shacl#Violation</code></pre></div>
+<h3>Community</h3>
+<p>Thanks to a lightning talk at SWIB23 (<a href="https://pad.gwdg.de/p/0ytHDo597">slides</a>, <a href="https://www.youtube.com/watch?v=jzdU1zHKlNU">recording</a>) we got some attention for the shape.
+Suggestions made by Jakob Voß, Osma Suominen and Antoine Isaac already greatly improved the shape.
+Further suggestions and improvements as well as your use cases are highly welcome.</p>
+<h2>Outlook</h2>
+<p>We were quite a bit surprised that we did not find any usable existing SKOS SHACL shape.
+Hopefully, our work helps others validate their SKOS files and improves the overall quality of vocabularies.
+There is currently still <a href="https://github.com/skohub-io/shapes/issues/9">an open ticket</a> for implementing the <a href="https://github.com/cmader/qSKOS/wiki/Quality-Issues#Ambiguous_Notation_References">qSKOS best practice rules</a>.
+Any feedback and collaboration on the shapes is welcome!</p>
+<p><a href="https://blog.skohub.io/2023-02-09-tests-updated/">https://blog.skohub.io/2023-02-09-tests-updated/</a> (Thu, 09 Feb 2023)</p>
+<p>For quite some time we have been adding new features to SkoHub Vocabs like <a href="https://github.com/skohub-io/skohub-vocabs/issues/79">switching languages</a>, display of all relevant properties on the concept page and <a href="https://github.com/skohub-io/skohub-vocabs/issues/159">support for <code class="language-text">skos:Collection</code></a>. Unfortunately, no tests were added to actually cover these new functionalities. This led to some surprises now and then, for example when we noticed that at one point language tags did not show up when visiting a Collection page directly.</p>
+<p>Originally, SkoHub Vocabs already contained some tests, so being the maintainer of SkoHub Vocabs I decided to follow up on that and get a little more familiar with the topic. I quickly stumbled upon <a href="https://en.wikipedia.org/wiki/Test-driven_development">Test-Driven Development</a> (TDD), and though I had heard of it before, I decided to dive a little deeper and check whether that pattern might be appropriate for SkoHub Vocabs and the other SkoHub modules (and maybe my coding approach in general).</p>
+<p>The general idea of TDD is as follows (borrowed heavily from the Wikipedia article):</p>
+<ul>
+<li>Requirements of a new feature are first translated into test cases</li>
+<li>Tests are written</li>
+<li>Code is written</li>
+</ul>
+<p><figure class="gatsby-resp-image-figure">
+  <img src="/static/bb776e8e9127b9f8b7db84ec085e8e24/f058b/TDD_Global_Lifecycle.png" alt="Test-Driven-Development Cycle" title="Test-Driven-Development Cycle" />
+  <figcaption class="gatsby-resp-image-figcaption">Test-Driven-Development Cycle</figcaption>
+</figure></p>
+<p>This leads to the following development cycle:</p>
+<ol>
+<li>
+<p><strong>Write tests</strong>: This ensures that the developer actually understands the user requirements. Usually this is done with the help of use cases and user stories.</p>
+</li>
+<li>
+<p><strong>Run tests</strong>: The tests should now fail. If not, it might be the case that the actual feature is already present in the code and no further code needs to be written. Maybe documentation has to be updated accordingly.</p>
+</li>
+<li>
+<p><strong>Write the simplest code that passes the new tests</strong>: The code can (and should) later be refactored, so it can be ugly at this point.</p>
+</li>
+<li>
+<p><strong>All tests should now pass</strong>: If the code is still failing, it should be revised until all tests pass.</p>
+</li>
+<li>
+<p><strong>Refactor as needed</strong>: Your tests now verify that the new feature is working as expected. If any tests fail during refactoring, you can and will immediately correct your code.</p>
+</li>
+</ol>
+<h2>Consequences on SkoHub development</h2>
+<p>This approach has some consequences for the development of SkoHub modules.
+These changes will also be reflected in the <a href="https://github.com/skohub-io/skohub-vocabs/issues/242">yet-to-be-published CONTRIBUTING.md</a>.
+Issues for new features should contain use cases and user stories as well as some notes indicating when the feature counts as correctly implemented.
+Only when all of this is present can the issue be marked as ready.
+The use cases and notes can then be used to write the tests and follow the above-mentioned development cycle.</p>
+<p>Regarding code review, this approach also has some consequences: a change should only be approved if tests were added for the new feature or adjusted in the case of a bug fix.</p>
+<h2>Testing Strategies and Technologies in SkoHub Vocabs</h2>
+<p>At the end of this blog post, I want to give you a short overview of the testing strategies and technologies currently used in SkoHub Vocabs development. We use <strong>unit tests</strong>, <strong>integration tests</strong> and <strong>end-to-end tests</strong>, whereby we try to write more unit tests than integration tests, and more integration tests than end-to-end tests. The reason for this is that end-to-end tests take long to run and are quite expensive in terms of computing power and time. Unit and integration tests, on the other hand, are cheap and can auto-run in the background on every save, giving you immediate feedback when something is broken.</p>
+<p>For unit and integration tests we use <a href="https://jestjs.io/">Jest</a> and the <a href="https://testing-library.com/docs/react-testing-library/intro">React-Testing-Library</a> since Gatsby – with which SkoHub Vocabs is built – uses React.
+Some of the older tests used <a href="https://enzymejs.github.io/enzyme/">Enzyme</a>, but after upgrading to React 18 I noticed that Enzyme was no longer working, because <a href="https://dev.to/wojtekmaj/enzyme-is-dead-now-what-ekl">the project is dead</a>.
+After some research, I found React Testing Library to be the most recommended testing framework and migrated the old Enzyme tests.
+After some initial practice, writing tests actually became quite handy and fun.</p>
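+<p>To give a flavour of what such a test looks like, here is a small Jest example for a hypothetical label-picking helper; the helper and its behaviour are invented for illustration and are not part of the SkoHub Vocabs code base:</p>
+<div class="gatsby-highlight" data-language="js"><pre class="language-js"><code class="language-js">// Illustrative Jest unit test; "getLabel" is a hypothetical helper, not SkoHub Vocabs code.
+const getLabel = (prefLabel, language, fallback) =>
+  prefLabel[language] || prefLabel[fallback] || ""
+
+describe("getLabel", () => {
+  it("returns the label in the requested language", () => {
+    expect(getLabel({ de: "Mathematik", en: "Mathematics" }, "en", "de")).toBe("Mathematics")
+  })
+
+  it("falls back to the default language if the requested one is missing", () => {
+    expect(getLabel({ de: "Mathematik" }, "en", "de")).toBe("Mathematik")
+  })
+})</code></pre></div>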
+<p>Adding tests is by no means finished at this point, but a lot of the recently added features now have some proper testing. This already required a few changes to the code base now and then, when I noticed things weren’t working as expected.</p>
+<p><figure class="gatsby-resp-image-figure">
+  <img src="/static/a0ad657099e495b1977b78ba0e9879f4/f058b/test_coverage.png" alt="Test Coverage in SkoHub Vocabs" title="Current test coverage in SkoHub Vocabs" />
+  <figcaption class="gatsby-resp-image-figcaption">Current test coverage in SkoHub Vocabs</figcaption>
+</figure></p>
+<p>For end-to-end tests I decided to go with <a href="https://www.cypress.io/">Cypress</a>, since it has excellent documentation, is fully open source and runs the tests in a real browser.</p>
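+<p>As an illustration, here is a minimal sketch of what a Cypress end-to-end test for a published vocabulary could look like. The visited URL and the expected page content are made up for this example and depend on the vocabulary you are actually building; this is not a test taken from the SkoHub Vocabs repository.</p>
+<div class="gatsby-highlight" data-language="js"><pre class="language-js"><code class="language-js">// cypress/e2e/concept.cy.js – a hypothetical sketch, not an actual SkoHub Vocabs test
+describe("Concept page", () => {
+  it("shows the preferred label and links to the JSON representation", () => {
+    // Assumes a local build of an example vocabulary is served on port 8000
+    cy.visit("http://localhost:8000/w3id.org/class/esc/n0322.html")
+    // The preferred label should be rendered as the page heading
+    cy.contains("h1", "Library, information and archival studies")
+    // There should be a link to the machine-readable JSON representation
+    cy.get("a[href$='.json']").should("exist")
+  })
+})</code></pre></div>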
+<p><figure>
+ <img src="/static/e91ebe3f4fb4ad6ce6b58ae82f8e00b7/f058b/cypress.png" alt="Test in Cypress" />
+ <figcaption>Example of tests running in Cypress. On the left you see the test and can also jump back and forth between states, on the right you see the rendered HTML page</figcaption>
+</figure></p>
+<p>All of the tests are integrated into the SkoHub Vocabs CI pipeline and run before a new Docker image gets built.</p>
+<hr>
+<p><a href="https://blog.skohub.io/2022-12-19-workshop-summary/">https://blog.skohub.io/2022-12-19-workshop-summary/</a> – Mon, 19 Dec 2022</p>
+<p>Due to new funding for SkoHub development we conducted – as <a href="https://blog.skohub.io/2022-11-skohub-workshop/">previously announced</a> – a workshop on the 17th of November. About <a href="https://pad.gwdg.de/s/2022-11-17-skohub-workshop#Attendees">15 participants from three different countries</a> joined for updates and discussion around the following topics:</p>
+<ol>
+<li>Current state of SkoHub</li>
+<li>Introduction of Metadaten.nrw project</li>
+<li>Collection of requirements from the community regarding PubSub and reconciliation</li>
+<li>Presentation of SkoHub Reconcile prototype by Andreas Wagner</li>
+</ol>
+<p>We split the workshop into two parts. First, we gave a general overview of the current state of SkoHub and the renewed funding through the Metadaten.nrw project. Then we had a general discussion about SkoHub PubSub – the module that connects a SKOS vocabulary with the Fediverse – as well as Andreas Wagner’s reconciliation prototype. See also the <a href="https://pad.gwdg.de/p/veKgVreqw">slides for the first part of the workshop</a>.</p>
+<p>In the second half Andreas gave us a technical deep dive into his reconciliation prototype, walked us through the code and we discussed the architecture as well as future development and integration into the SkoHub ecosystem.</p>
+<p>In the following, we will go deeper into what happened in the different parts.</p>
+<h2>Current state of SkoHub</h2>
+<p>Currently, SkoHub Vocabs is by far the most used SkoHub module. It is used by the <a href="https://www.hbz-nrw.de/">hbz</a>, the metadata standardization groups around <a href="https://wiki.dnb.de/display/DINIAGKIM">KIM</a>, <a href="https://wirlernenonline.de/">WirLernenOnline</a>, the <a href="https://www.iqb.hu-berlin.de/">Institute for Educational Quality Improvement (IQB)</a>, in research projects in the area of digital humanities, and by other people and institutions to publish their controlled vocabularies.</p>
+<p>The browser plugin SkoHub Editor as well as the PubSub module haven’t been used in production yet and were temporarily shut down in March 2022 due to a lack of resources.</p>
+<p>In 2022, work on SkoHub started again when we <a href="https://blog.skohub.io/2022-05-eww-project-kickoff/">partnered with effective WEBWORK (eWW)</a> to redesign the web pages, create a new logo, improve UI configuration and address other issues such as support for <code class="language-text">skos:Collection</code>. (See the <a href="https://github.com/orgs/skohub-io/projects/2">project kanban</a> for an overview.)</p>
+<h2>Decouple software and services</h2>
+<p>The general idea is to further decouple the software “SkoHub” from its running instances. To this end, we also asked eWW to work on UI configuration options, so that other institutions or projects can easily brand their own SkoHub instance.</p>
+<p>In the future, we will move the hosted instance currently running at skohub.io to metadaten.nrw, the project that funds the further development of SkoHub.</p>
+<h2>Metadaten.nrw</h2>
+<p>At the end of 2021, hbz secured funding from the Ministry of Culture and Science of North Rhine-Westphalia (MKW) for a project called Metadaten.nrw. It consists of two sub-projects, one of which, “Infrastructure Initiative Metadata Services”, is located in the Open Infrastructure team (OI) at hbz, where SkoHub development will take place.</p>
+<p>Four positions were funded, two of which are already filled, among them Steffen for SkoHub development. The goal of the project is to expand the community of users of the existing metadata infrastructure provided by hbz/OI, with a focus on libraries and scholars in North Rhine-Westphalia (NRW), and to establish hbz as a competence center for metadata in NRW.</p>
+<p>Accordingly, we plan to develop SkoHub further regarding the following topics:</p>
+<ul>
+<li>Fediverse integration: Further development of SkoHub PubSub in the context of a concrete use case</li>
+<li>Reconciliation: Bringing the SkoHub reconciliation module into production</li>
+<li>Possibly support <a href="https://annif.org/">Annif</a> integration in a later project phase</li>
+<li>Offer SkoHub tutorials and workshops</li>
+</ul>
+<h2>Community, PubSub & Reconciliation</h2>
+<p>To further encourage contributions like Andreas’ reconciliation prototype, we will set up contributing guidelines that define our development and deployment processes in a clear and transparent way.</p>
+<h3>SkoHub PubSub</h3>
+<p>Afterwards we gave a short (re)introduction to SkoHub PubSub and discussed possible use cases. We developed ideas about SkoHub PubSub serving as a communication hub between researchers for their research fields. Raphaëlle Lapotre brought up a concrete use case: in the context of the <a href="https://datu.ehess.fr/timel/en/">Timel Thesaurus</a>, an indexing thesaurus for huge amounts of digitized pictures of medieval iconography, storing the large number of images centrally in a repository is currently a problem. Researchers could instead keep the files locally in their Nextcloud and publish the image metadata to the inboxes of SKOS concepts. A central service could then listen to the data arriving in each concept’s inbox and display the metadata with a link pointing to the image in its storage location. There are actually two possible use cases: one with the digitized illumination pictures of the Ahloma lab (EHESS, <a href="https://didomena.ehess.fr/collections/3x816n47s?locale=fr">sample</a>), the second one with pictures of painted ceilings from all over the Mediterranean area, <a href="https://rcppm.org/blog/histoire-et-decouverte/carte-interactive-des-plafonds-peints-medievaux/">collected by an association of scholars and retired volunteers</a>. Possibly, the <a href="https://nextcloud.com/blog/nextcloud-introduces-social-features-joins-the-fediverse/">support for ActivityPub in Nextcloud</a> could help with such a project.</p>
+<p>Another topic was the idea of community building around concepts. The Open Educational Resource Search Index (<a href="https://oersi.org">OERSI</a>) as well as the <a href="https://wirlernenonline.de/">WirLernenOnline</a> project already use elaborate vocabularies to index their resources. Interested people could easily follow these concepts and engage in discussions around them.</p>
+<p>This is also applicable to researchers, who would be able to build up a topic-specific database and open discussions about their research in the Fediverse. The discussion also raised practical questions about what happens on the notification side with broader and narrower concepts: if I’m following a concept, do I also want to get notifications about its narrower or broader concepts? These are questions that can be discussed further in our community.</p>
+<h3>SkoHub Reconciliation</h3>
+<p>Following the PubSub discussion, Andreas presented his reconciliation prototype. It is based on the <a href="https://reconciliation-api.github.io/specs/latest/">Reconciliation API spec</a> developed by the <a href="https://www.w3.org/community/reconciliation/">W3C Entity Reconciliation Community Group</a>, so it is interoperable and can be used in any application that acts as a reconciliation client. Andreas’ implementation already worked in <a href="https://openrefine.org/">OpenRefine</a> as well as in <a href="https://teipublisher.com">TEI Publisher</a>’s annotation tool. After showing the implementation with some examples we went into a technical deep dive.</p>
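+<p>For readers unfamiliar with the Reconciliation API, the basic interaction is roughly the following: a client such as OpenRefine sends a batch of queries to the service and gets back ranked matching candidates for each query. The sketch below only illustrates this shape; the endpoint URL is made up, and the exact request and response fields may differ between spec versions and implementations.</p>
+<div class="gatsby-highlight" data-language="js"><pre class="language-js"><code class="language-js">// Rough sketch of a reconciliation request (Node.js 18+, hypothetical endpoint)
+const endpoint = "https://reconcile.example.org/reconcile"
+
+// A batch of queries, keyed by an arbitrary identifier per query
+const queries = {
+  q0: { query: "Library, information and archival studies", limit: 3 },
+}
+
+async function reconcile() {
+  const response = await fetch(endpoint, {
+    method: "POST",
+    headers: { "Content-Type": "application/x-www-form-urlencoded" },
+    body: new URLSearchParams({ queries: JSON.stringify(queries) }),
+  })
+  const result = await response.json()
+  // Expected shape, roughly:
+  // { q0: { result: [ { id: "https://w3id.org/class/esc/n0322",
+  //                     name: "Library, information and archival studies",
+  //                     score: 100, match: true } ] } }
+  console.log(result.q0.result)
+}
+
+reconcile()</code></pre></div>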
+<p>Andreas walked us through the code and we discussed the current implementation as well as the future architecture of the SkoHub modules. His current approach is based on the webhook part of SkoHub Vocabs and borrows code from SkoHub PubSub for the Elasticsearch indexing.</p>
+<p>The discussion resulted in the proposal to separate the webhook module from SkoHub Vocabs and thereby further separate the concerns of the respective modules. Andreas added a <code class="language-text">doreconc</code> query parameter to the webhook, which triggers a <a href="https://github.com/mpilhlt/skohub-vocabs/blob/feature-reconc/src/populateReconciliation.js">script</a> that populates the vocabulary into the <a href="https://github.com/skohub-io/skohub-reconcile">reconcile prototype</a>.</p>
+<p>After the workshop we transferred the <a href="https://github.com/skohub-io/skohub-reconcile">skohub-reconcile</a> repository from Andreas to the SkoHub organization and are happy to start further developing it in 2023.</p>
+<h2>Final thoughts</h2>
+<p>The workshop was a great event to discuss with SkoHub users and those who want to become users. We collected valuable feedback and ideas for development in the upcoming two years.
+Especially the sudden rise in awareness of the Fediverse opens up interesting use cases for SkoHub PubSub, which we are happy to engage with. The highlight of the workshop was the presentation of Andreas’ reconciliation prototype and its transfer to the SkoHub organization. This is a good example of the benefits of open source and use-case-driven development.</p>
+<p>We are looking forward to future community events, more use cases and even more modules to be developed in the SkoHub ecosystem.</p>
+<p><strong>More links:</strong></p>
+<ul>
+<li><a href="https://reconciliation-api.github.io/">Reconciliation API spec & ressources</a> (e.g. a <a href="https://reconciliation-api.github.io/testbench/#/">list of databases offering a Reconciliation Endpoint</a>)</li>
+<li><a href="https://docs.openrefine.org/manual/reconciling">Reconciliation in OpenRefine</a>, <a href="https://lobid.org/gnd/reconcile">GND reconciliation for OpenRefine</a>, <a href="https://histhub.ch/reconciling/">HistHub Blog entry (dt.)</a></li>
+<li><a href="https://teipublisher.com/exist/apps/tei-publisher/doc/blog/tei-publisher-710.xml">Annotation in TEI Publisher</a></li>
+<li>SkoHub Reconcile prototype: <a href="https://github.com/skohub-io/skohub-reconcile">GitHub</a>/<a href="https://c111-064.cloud.gwdg.de/reconc/">Test Instance</a></li>
+</ul>
+<hr>
+<p><a href="https://blog.skohub.io/2022-12-02-new-look/">https://blog.skohub.io/2022-12-02-new-look/</a> – Fri, 02 Dec 2022</p>
+<p>We are happy to announce the new SkoHub logo and design we have deployed just in time for our <a href="https://swib.org/swib22/programme.html#day3">SWIB22 workshop</a> on Wednesday! In the last months, Kai Mertens and effective WEBWORK helped to work out this new look in the context of the <a href="https://blog.skohub.io/2022-05-eww-project-kickoff/">project we announced earlier</a>. We have now updated the <a href="https://skohub.io">SkoHub website</a>, this blog and the default <a href="https://github.com/skohub-io/skohub-vocabs">SkoHub Vocabs</a> setup to incorporate the new logo and design.</p>
+<p>Here is an <a href="https://w3id.org/kim/hochschulfaechersystematik/scheme">example</a> of how a vocabulary built with SkoHub Vocabs will look now with this default design:</p>
+<p><a href="https://w3id.org/kim/hochschulfaechersystematik/scheme"><figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 630px; "
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 71.51898734177216%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAAOCAIAAACgpqunAAAACXBIWXMAAAsTAAALEwEAmpwYAAABqElEQVQoz5WS7WrUQBSGczWJU3MJFQv1TuyCP/y4JUtrs6OBbku3FfcCRPCPaLWo7W6p2aSTnXwn83WSkSRWakHdPrw/hgPPHHg5xtra/Qfr6/dWVy3LMk0TIWTb9t2brNi2jRAyTdOyLITQHdNcQcjYHDqbL7afb21t7+y4rosxHl7hDIeO4/RvjPHLDtd1R7u7rzB2MDbqphE1KK2hafRtaLQ2pGiRHWXVoqT0QvLu5POH029vvxwfz6YX5PLM92bB/HTunZPg+/zHBSGCC4MLQZMkTlNQqqyqvCxAKZ8uPp5PP83O3n89mQZ+ENF5tPBj6tHQo4uCM621AjAqxvyIlpxLAAWgtYYayrRIL6MsL3JWxUVR0IyHGQsztsgZySBrZehlksQ0z0gSJ2Xefym51Kpu/tlCK3MhwjQmSRwXeVaVQnVI+d/CWplx7kc0iGhvSuhttZTclsw5k+L3tFu83GboSrrOLeS+4eYqWmuplOjk5i/5tQPA0HV983QAagVLnFhjPJ2MB4ejjWsZHO0NjvY2/hz2eXg4evR6/9lk/OTNwePJ+Ce3Edi1hGz4mQAAAABJRU5ErkJggg=='); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+ alt="The overview page of the SkoHub Vocabs build of the SKOS Destatis classification for subject groups, study areas and study subjects"
+ title="The overview page of the SkoHub Vocabs build of the SKOS Destatis classification for subject groups, study areas and study subjects"
+ src="/static/384b141891ff0a44d576e0792c5562e3/f058b/skohub-vocabs-screenshot.png"
+ srcset="/static/384b141891ff0a44d576e0792c5562e3/c26ae/skohub-vocabs-screenshot.png 158w,
+/static/384b141891ff0a44d576e0792c5562e3/6bdcf/skohub-vocabs-screenshot.png 315w,
+/static/384b141891ff0a44d576e0792c5562e3/f058b/skohub-vocabs-screenshot.png 630w,
+/static/384b141891ff0a44d576e0792c5562e3/40601/skohub-vocabs-screenshot.png 945w,
+/static/384b141891ff0a44d576e0792c5562e3/78612/skohub-vocabs-screenshot.png 1260w,
+/static/384b141891ff0a44d576e0792c5562e3/b880f/skohub-vocabs-screenshot.png 1507w"
+ sizes="(max-width: 630px) 100vw, 630px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">The overview page of the SkoHub Vocabs build of the SKOS Destatis classification for subject groups, study areas and study subjects</figcaption>
+ </figure></a></p>
+<p>It looks much clearer and more modern than before. We also love the new logo. Kai did a great job creating the new logo and design while not totally breaking with the previous look. We created an extra repo at <a href="https://github.com/skohub-io/skohub-logo">https://github.com/skohub-io/skohub-logo</a> for the logo and related files. In the next days, we will add some more documentation, especially on <a href="https://github.com/skohub-io/skohub-vocabs/issues/188#issuecomment-1327321911">how to configure a customized version for your own SkoHub Vocabs instance</a>.</p>
+<hr>
+<p><a href="https://blog.skohub.io/2022-11-skohub-workshop/">https://blog.skohub.io/2022-11-skohub-workshop/</a> – Fri, 04 Nov 2022</p>
+<p>In this blog post we want to introduce to you a new member of the Open Infrastructure team at the Hochschulbibliothekszentrum NRW who will be working on SkoHub, and to invite you to a workshop where we will present and discuss our future plans for the SkoHub project.</p>
+<p>After some time with not much happening (or even with some SkoHub services being shut down), we are happy to report a lot of movement in this space. We already <a href="https://blog.skohub.io/2022-05-eww-project-kickoff/">announced</a> the current project with the effective WEBWORK team to develop a new logo, improve the design and fix some minor issues in SkoHub, and we are happy about the improvements made. Watch this space for more details in an upcoming post.</p>
+<h3>Welcome, Steffen!</h3>
+<p>We are very happy to welcome <a href="https://lobid.org/team/sr#">Steffen Rörtgen</a> to the Open Infrastructure team at the Hochschulbibliothekszentrum NRW, which he joined this November. In his former projects he already made heavy use of SkoHub Vocabs and contributed to the project. He will now focus on further SkoHub development in the context of the Metadaten.nrw project, which is funded by the Ministry of Culture and Science of North Rhine-Westphalia (MKW).</p>
+<p>With the grant for this project, we have resources for further development and would like to discuss with you our plans, especially regarding the <strong>use of SkoHub for reconciliation</strong> and the ActivityPub-based <strong>publish/subscribe</strong> approach (SkoHub PubSub). Having already defined some work packages within the Metadaten.nrw project we would like to align these with use cases and ideas the SkoHub community has.</p>
+<h3>Upcoming workshop</h3>
+<p>Therefore we are happy to invite you to a small workshop on Thursday, the <strong>17th of November</strong>. We will start at <a href="https://zonestamp.toolforge.org/1668675632">10:00h CET</a> and split the workshop into two parts with each one lasting about two hours:</p>
+<p>In the <strong>first part</strong> we will give an overview of the past and current developments of SkoHub as well as an introduction to the Metadaten.nrw project and its plans for SkoHub. At the end of the first part we want to discuss use cases for the publish/subscribe approach of SkoHub as well as for reconciliation. Our community member Andreas Wagner has already developed a prototype for reconciliation with SkoHub, which he will present.</p>
+<p>In the <strong>second part</strong> of the workshop we will then deep dive into the reconciliation topic. We will discuss Andreas’ approach and develop a roadmap for SkoHub to implement the desired reconciliation functionalities.</p>
+<p>For more details, see the <a href="https://pad.gwdg.de/s/2022-11-17-skohub-workshop">pad with the preliminary agenda</a>.</p>
+<p>If you are interested in the workshop, please send an email to <a href="mailto:skohub@hbz-nrw.de?subject=Registration for SkoHub-Workshop on Nov, 17th&body=Hello there,%0D%0AI'm interested in your workshop!%0D%0AI will join%0D%0A- [ ] the whole workshop%0D%0A- [ ] just the first part%0D%0A">skohub@hbz-nrw.de</a>
+and let us know whether you will join the whole workshop or just the first part.</p>
+<p><em>Please be aware that we won’t give a general introduction to SkoHub and expect you to be familiar with SKOS and the approach of SkoHub.</em></p>
+<hr>
+<p><a href="https://blog.skohub.io/2022-05-eww-project-kickoff/">https://blog.skohub.io/2022-05-eww-project-kickoff/</a> – Thu, 19 May 2022</p>
+<p>In a kickoff workshop, the SkoHub team at hbz has launched a cooperation with the Hamburg-based company <a href="https://www.effective-webwork.de/">effective WEBWORK</a> to work on some pending issues regarding both functionality and design of <a href="https://github.com/skohub-io/skohub-vocabs">SkoHub Vocabs</a>. The issues and progress of the project can be followed in a <a href="https://github.com/orgs/skohub-io/projects/2">Kanban board</a>.</p>
+<p><a href="https://github.com/orgs/skohub-io/projects/2"><figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 630px; "
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 55.69620253164557%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAALCAIAAADwazoUAAAACXBIWXMAAAsTAAALEwEAmpwYAAABj0lEQVQoz42Q2W7bMBBFqWijNmqhJJKSuEjUYtlKYrdu4AKFH/v/n1TQbgwD6UOB8zAcnovBDBDrNoyHtGxSzFBBEWYIs3ud3or08/kVQDvd8BFTQdqetb1Sc93IqlGYcEx4SUVJRd2ou53k5Bmg9L5iMsnqOKvNtFs3Sqs4e2C+kpzEWZXk9RME5HWbE+FFhRdmLkzcAIWoxJQHqPTjwo9NH0Z5UlBMJUxKYxryIC4AG1a1/+jm7934zsSkxkNJO328iv1PNrw1+r3mk1mBiX67iOUsdme+/BDzibY9qNWqXq9y+8XnU6vfGNdZyTq9cb0xObf9Srm5gguR40UujF3/LyHCgMhFHC78NpmKqWIyLdlwvMr9pZtOfPfRTkfaacePgQ2BHRicANjQDRBox9f5/Fvuzmr51qqFdDopaCNmxnXFJOkGTLkaV8ePLQdaTnAH2NALU4AKhmuB8hphijDNSxYkOXjxgQ0fnu1FJmx/CVs2BJZn7E9cmDwkgw1tP/53+MUNLSd8tt0A/Wf4Dxd0MJGMINoYAAAAAElFTkSuQmCC'); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+ alt="The Backlog, Ready and Working columns of the project's Kanban board as of 2022-05-19"
+ title="The Backlog, Ready and Working columns of the project's Kanban board as of 2022-05-19"
+ src="/static/a2a343d6e0b18eb6e55f81fc6f505d25/f058b/kanban.png"
+ srcset="/static/a2a343d6e0b18eb6e55f81fc6f505d25/c26ae/kanban.png 158w,
+/static/a2a343d6e0b18eb6e55f81fc6f505d25/6bdcf/kanban.png 315w,
+/static/a2a343d6e0b18eb6e55f81fc6f505d25/f058b/kanban.png 630w,
+/static/a2a343d6e0b18eb6e55f81fc6f505d25/40601/kanban.png 945w,
+/static/a2a343d6e0b18eb6e55f81fc6f505d25/d56b5/kanban.png 1215w"
+ sizes="(max-width: 630px) 100vw, 630px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">The Backlog, Ready and Working columns of the project's Kanban board as of 2022-05-19</figcaption>
+ </figure></a></p>
+<p>A big part of the project will concern a new feature to support the grouping of concepts by creating SKOS collection pages (<a href="https://github.com/skohub-io/skohub-vocabs/issues/159">Issue #159</a>). Another central goal of the cooperation is a redesign of the static sites that are generated by SkoHub Vocabs. Finally, a logo is to be developed for the SkoHub software suite that will hopefully capture the spirit of the software and the community behind it. We are looking forward to presenting the results within the next months.</p>
+<p>The team from effective WEBWORK consists of software developers, a librarian and a designer whose skills overlap with those of the SkoHub community in many ways, especially regarding their enthusiasm for open source solutions. We are looking forward to this collaboration!</p>
+<hr>
+<p><a href="https://blog.skohub.io/2021-12-10-skohub-vocabs-workshops/">https://blog.skohub.io/2021-12-10-skohub-vocabs-workshops/</a> – Fri, 10 Dec 2021</p>
+<p>We facilitated two workshops in November with the goal of introducing participants to the Simple Knowledge Organization System (SKOS) by learning hands-on how to publish a small vocabulary with SkoHub Vocabs. The first one, “Eine Einführung in SKOS mit SkoHub Vocabs” (“An introduction to SKOS with SkoHub Vocabs”), was held as a cooperation between the Hochschulbibliothekszentrum NRW and the Göttingen eResearch Alliance with 14 German-speaking participants on 2021-11-02; see the <a href="https://pad.gwdg.de/s/OCbQBibi2">workshop pad</a> and the <a href="https://pad.gwdg.de/p/einfuehrung-in-skos-mit-skohub">slides</a>.</p>
+<p>As the workshop worked quite well, we applied the same approach to our Workshop “An Introduction to SKOS with SkoHub-Vocabs” on 2021-11-30 at <a href="https://swib.org/swib21">SWIB21</a> (<a href="https://pad.gwdg.de/p/introduction-in-skos-with-skohub">slides</a>) with around 20 participants.</p>
+<p><a href="https://pad.gwdg.de/p/introduction-in-skos-with-skohub"><figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 630px; "
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 56.32911392405063%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAALCAYAAAB/Ca1DAAAACXBIWXMAAAsTAAALEwEAmpwYAAABlElEQVQoz4VSy07CUBBlLyqUPqBK2gRCeEOgLSQaU2kkDSE1vArh4dqNccXKnzJu3fgBLvyjY2bIbRANLk7OzNwzp3PnNpZOp5HJZKBpGmRZhqIoSKVSzCIXTJAkic+FhnLqJQ/yiokgm82iVquhVCqh0WgwV6tVlMtl5mKxiEKhAMuyGK1Wi2sUU68YjA3pS5VKBaPRCP1+H/P5HJvNBsvlkmPiIAjgui7W6zVWqxWm0ymGwyHHZKyq6s5QOBPT+IlEgnmxWGA8HnPDZDKB7/sIwxCz2YzNhTaZTGLfgycUO8zn82g2mzwtMV2rXq+j2+1y7jgOOp0Ox+12G7ZtI5fLRTuMriwMTdNkMzLp9XrcYFltOFYLd94tbMuC53kYDAa8b7qqYRi/DQ9fmnISqooCu3sNP1jgPnzA1Y0LTVVhmCZraff7L/xjQlGIYl3H+Vkcj09bvL5/4u3jC8/bF5zGT/hMaCO9MKTkEPtiTVWgKSkossQTHxpFOHyU/6EfP6dBLi4Ro//nGGhH+/hbt6traR3fuxM7qEiV7HEAAAAASUVORK5CYII='); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+ alt="First slide of the SKOS introduction workshop at SWIB21"
+ title="First slide of the SKOS introduction workshop at SWIB21"
+ src="/static/5617f981b723e476e73f28ec316dd720/f058b/slides.png"
+ srcset="/static/5617f981b723e476e73f28ec316dd720/c26ae/slides.png 158w,
+/static/5617f981b723e476e73f28ec316dd720/6bdcf/slides.png 315w,
+/static/5617f981b723e476e73f28ec316dd720/f058b/slides.png 630w,
+/static/5617f981b723e476e73f28ec316dd720/40601/slides.png 945w,
+/static/5617f981b723e476e73f28ec316dd720/78612/slides.png 1260w,
+/static/5617f981b723e476e73f28ec316dd720/29114/slides.png 1920w"
+ sizes="(max-width: 630px) 100vw, 630px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">First slide of the SKOS introduction workshop at SWIB21</figcaption>
+ </figure></a></p>
+<p>Generally, we had the impression that participants of both workshops were having a good time; at least nobody left the conference room before the end of the workshop. Here are some notes and lessons learned that we collected after the SWIB workshop:</p>
+<ul>
+<li>We invited participants to share their screen so we would not be talking into the void. Around 4-5 participants followed our invitation, which was fine for us.</li>
+<li>Many participants joined without their microphones connected. We forgot to explicitly ask them to turn their mics on.</li>
+<li>We introduced ourselves and used the BigBlueButton poll feature to get to know the participants and their previous experience a bit more.</li>
+<li>We divided the workshop into two parts: an introduction as a frontal presentation and a hands-on part with discussion and Q&A phases in between. Though this worked quite well, we think it might be nice to switch more often between explanatory parts and hands-on parts.</li>
+<li>Participants were a bit shy. We got some good feedback in the chat at the end, but only from some participants. Next time, we should prepare another poll for collecting feedback at the end of the workshop.</li>
+</ul>
+<p>With regard to the further development of SkoHub Vocabs, it became clear during the workshops that it would be great to have an automatic test with each commit that lets you know whether a SKOS/Turtle file in a repo is “SkoHub-ready”, i.e. conforms to the pattern that is supported by SkoHub Vocabs. <a href="https://github.com/skohub-io/skohub-vocabs/issues/91">Issue #91</a> is already addressing this need and should be worked on to accomplish this.</p>
+<hr>
+<p><a href="https://blog.skohub.io/2021-lis-workshop/">https://blog.skohub.io/2021-lis-workshop/</a> – Mon, 12 Jul 2021</p>
+<p><a href="https://pad.gwdg.de/p/lis-workshop21-skohub"><figure>
+ <img src="/static/23e9cd5c100418dc7d1c27eed6901f62/f058b/slides.png" alt="First slide of SkoHub presentation at LIS Workshop 2021" />
+ <figcaption>First slide of SkoHub presentation at LIS Workshop 2021</figcaption>
+</figure></a></p>
+<p>Last Friday afternoon, the 2021 Workshop on Classification and Subject Indexing in Library and Information Science (LIS Workshop) took place, organized by the Working Group within the GfKL – Data Science Society. Adrian and Steffen had the chance to present SkoHub in the workshop’s first presentation. The slides can be viewed at <a href="https://pad.gwdg.de/p/lis-workshop21-skohub">https://pad.gwdg.de/p/lis-workshop21-skohub</a>.</p>
+<p>Here is an overview of the full <a href="http://www.gfkl.org/blog/2019/10/14/lis-workshop-2021-rotterdam/">programme</a>, which comprised six talks:</p>
+<ul>
+<li>Adrian Pohl (hbz, Cologne, Germany) & Steffen Rörtgen (GWDG, Göttingen, Germany): <em>SkoHub: Publishing and using knowledge organization systems on the web</em></li>
+<li>Colin Higgins (University of Cambridge, Cambridge, United Kingdom): <em>Justice, governance, and the thesaurus – the Cambridge experience with ‘illegal aliens’</em></li>
+<li>Gislene Rodrigues da Silva & Célia da Consolação Dias (Universidade Federal de Minas Gerais, Belo Horizonte Brasil): <em>Subjective aspects of indexing photographs from visual communication using a reading model based on the complex method and the primary functions of the image</em></li>
+<li>Heidrun Wiesenmüller (Stuttgart Media University, Stuttgart, Germany): <em>Orientation and exploration – the presentation of subject headings in German catalog</em></li>
+<li>Karin Schmidgall (Deutsches Literaturarchiv Marbach, Marbach, Germany) & Matthias Finck (Effective Webwork, Hamburg, Germany): <em>Glückliche Funde - ein Katalog, der Forschende auf neue Ideen und Pfade bringt</em> (Lucky finds – a catalogue that leads researchers to new ideas and paths)</li>
+<li>Julijana Nadj-Guttandin (Deutsche Nationalbibliothek, Frankfurt, Germany) & Sarah Pielmeier (University and State Library, Münster, Germany): <em>Ein neues und modulares Regelwerk für die verbale Inhaltserschließung / A new and modular standard for subject indexing</em></li>
+</ul>
+<p>The workshop happened as part of the virtual conference “Data Science, Statistics & Visualisation and European Conference on Data Analysis 2021” (DSSV-ECDA 2021). The promotion for the workshop could probably have been better; for example, the presentations weren’t even listed in the regular DSSV-ECDA programme. In the end, the speakers and moderators were mostly among themselves, with little additional audience. However, the talks covered interesting topics and the discussion was lively.</p>
+<hr>
+<p><a href="https://blog.skohub.io/2020-11-25-swib20-workshop/">https://blog.skohub.io/2020-11-25-swib20-workshop/</a> – Wed, 25 Nov 2020</p>
+<p>From 23-27 November the <a href="https://swib.org/swib20/">12th Semantic Web in Libraries Conference</a> took place online. The programme with links to recordings and slides can be viewed at <a href="https://swib.org/swib20/programme.html">https://swib.org/swib20/programme.html</a>.</p>
+<p>Wednesday was workshop day, on which Adrian and Steffen offered a SkoHub workshop. They tried out a flipped classroom approach, creating several video tutorials and writing down walkthroughs for the participants to prepare themselves. This did not work out as intended. Lesson learned: always be prepared for a larger number of participants not being prepared.</p>
+<p>Here are some links to the materials:</p>
+<ul>
+<li><a href="https://pad.gwdg.de/p/HJeIvDvq5w">workshop agenda</a></li>
+<li><a href="https://github.com/skohub-io/swib20-workshop">workshop repo</a></li>
+<li><a href="https://github.com/skohub-io/swib20-workshop/blob/main/resources/README.md">overview over the workshop materials</a></li>
+</ul>
+<hr>
+<p><a href="https://blog.skohub.io/2020-10-09-skohub-apconf/">https://blog.skohub.io/2020-10-09-skohub-apconf/</a> – Fri, 09 Oct 2020</p>
+<p>From 2-5 October the <a href="https://conf.activitypub.rocks/">ActivityPub conference</a> happened online, where people using the ActivityPub protocol came together to discuss topics all around federated networks and the respective web standards. For the presentations, a flipped classroom approach was chosen: talks were uploaded before the conference and the live part consisted of Q&A sessions for each talk. On Sunday, there was an additional round of lightning talks and some birds of a feather (BoF) sessions where – quite similar to a barcamp session – people interested in a topic could propose a session and meet with like-minded people.</p>
+<p>The programme with links to the recordings and forum can be viewed at <a href="https://conf.activitypub.rocks/#live">https://conf.activitypub.rocks/#live</a>.</p>
+<p>On Friday, we had a session about SkoHub. Here is our previously recorded video:</p>
+<div class="gatsby-resp-iframe-wrapper" style="padding-bottom: 56.25%; position: relative; height: 0; overflow: hidden; margin-bottom: 1.0725rem" > <iframe sandbox="allow-same-origin allow-scripts allow-popups" src="https://conf.tube/videos/embed/85a7d230-7e75-48fd-b399-d182ddece030" frameborder="0" allowfullscreen="" style=" position: absolute; top: 0; left: 0; width: 100%; height: 100%; "></iframe> </div>
+<p>With regard to creating and maintaining controlled vocabularies and assigning topics, (at least some) people who develop applications for the Fediverse have quite some interest in building on experiences and approaches from the library world. What we learned from our Q&A session is to prepare better next time, as there will most certainly be some people who haven’t fully watched the recording or who watched it a while ago: next time we would have some slides ready to recap the basic concepts, plus some links pointing to exemplary implementations and further information.</p>
+<p>Here are some projects that are dealing with categories, taxonomies in the Fediverse:</p>
+<ul>
+<li><a href="https://socialhub.activitypub.rocks/t/commonspub-and-the-quest-for-a-modular-decentralised-app-ecosystem/938"><strong>CommonsPub</strong></a> – which builds on experiences from MoodleNet – is working on “[f]ederated taxonomies for topic-based search and discovery across instances.”</li>
+<li>The CommonsPub-based <a href="https://haha.academy/"><strong>HAHA</strong> Academy</a> is experimenting with collaboratively created and maintained <a href="https://haha.academy/#the_knowledge">“taxonomy of all human knowledge”</a>.</li>
+<li><a href="https://socialhub.activitypub.rocks/t/learnawesome-org-building-a-better-goodreads-with-activitypub/946"><strong>LearnAwesome.org</strong></a> helps people in managing and finding learning material online in very different formats. It supports following certain topics.</li>
+<li>The <a href="https://socialhub.activitypub.rocks/t/the-reboot-of-the-indymedia-project/942"><strong>“Rebooting Indymedia” project</strong></a> wants to use topic-based channels to create moderation workflows for building independent news sites from decentrally published content.</li>
+<li>A related effort by Trolli Schmittlauch addresses the problem of how to create a comprehensive <strong>hashtag search</strong> & subscription in a federated social network. See <a href="https://git.orlives.de/schmittlauch/paper_hashtag_federation/src/branch/master/paper_hashtag_federation.pdf">this paper</a> for details.</li>
+</ul>
+<p>In a birds of a feather session about <a href="https://socialhub.activitypub.rocks/t/topics-and-services-we-can-subscribe-to/995">“Topics” and Services we can subscribe to</a> some of the people working on tags, controlled vocabularies etc. came together and had a fruitful exchange, sorting out the different problems and which approaches exist to address them. We are looking forward to further working on SkoHub and discussing common approaches to assigning or following controlled topics in the fediverse.</p>
+<hr>
+<p><a href="https://blog.skohub.io/2020-06-25-skohub-pubsub/">https://blog.skohub.io/2020-06-25-skohub-pubsub/</a> – Thu, 25 Jun 2020</p>
+<hr>
+<p>ⓘ <strong>Update, 2022-03-01</strong>: <em>Due to lacking resources for maintenance, we decided to shut down the SkoHub PubSub demo server at skohub.io for an indefinite time. However, the <a href="https://github.com/skohub-io/skohub-pubsub">code</a> is still there for anybody to set up their own instance.</em></p>
+<hr>
+<p>In the previous blog posts we have presented <a href="http://blog.lobid.org/2019/09/27/presenting-skohub-vocabs.html">SkoHub Vocabs</a> and <a href="http://blog.lobid.org/2020/03/31/skohub-editor.html">SkoHub Editor</a>. In the final post of this SkoHub introduction series we will take a deeper look at SkoHub PubSub, the part of SkoHub that brings the novel approach of KOS-based content subscription into the game.</p>
+<p>Let’s refresh what SkoHub is about by quoting the gist from <a href="https://skohub.io/">the project homepage</a>:</p>
+<blockquote>
+<p>SkoHub supports a novel approach for finding content on the web. The general idea is to extend the scope of Knowledge Organization Systems (KOS) to also act as communication hubs for publishers and information seekers. In effect, SkoHub allows to follow specific subjects in order to be notified when new content about that subject is published.</p>
+</blockquote>
+<p>Before diving into the technical implementation and the protocols used, we provide an example of how this subscription, publication and notification process can be carried out in practice. Although SkoHub PubSub constitutes the core of the SkoHub infrastructure – being the module that brings all SkoHub components together – it is not visible to end users by itself but only through applications which send out notifications or subscribe to a specific topic. (This is the great thing about open standards: they invite everybody to develop new clients for specific use cases!)</p>
+<p>So, let’s take a look at an example workflow involving SkoHub Editor and the federated microblogging service <a href="https://en.wikipedia.org/wiki/Mastodon_(software)">Mastodon</a> to demonstrate the functionalities.</p>
+<h2>Subscribing to a subject</h2>
+<p>In one of the already mentioned blog posts we published the <a href="https://w3id.org/class/esc/scheme">Educational Subjects Classification</a> with SkoHub Vocabs as an example. Now, let’s take a look at a single subject from this classification, e.g. <a href="https://w3id.org/class/esc/n0322">Library, information and archival studies</a>:</p>
+<p><figure>
+ <img src="/static/85426ef69485c8c526778217db5f24f5/f058b/concept.png" alt="Screenshot of the HTML version of a SKOS concept published with SkoHub." />
+ <figcaption>Screenshot of the HTML version of a SKOS concept published with SkoHub.</figcaption>
+</figure></p>
+<p>On the left-hand side, you can see the location of the topic in the classification hierarchy. On the right-hand side, there is some basic information on the subject: it has a URI (<code class="language-text">https://w3id.org/class/esc/n0322</code>), a notation (<code class="language-text">0322</code>), a preferred label (<code class="language-text">Library, information and archival studies</code>) and an inbox. This is what the <a href="https://w3id.org/class/esc/n0322.json">underlying JSON data</a> looks like (retrieved e.g. by adding the format suffix <code class="language-text">.json</code> to the URI):</p>
+<div class="gatsby-highlight" data-language="json"><pre class="language-json"><code class="language-json"><span class="token punctuation">{</span>
+ <span class="token property">"id"</span><span class="token operator">:</span><span class="token string">"https://w3id.org/class/esc/n0322"</span><span class="token punctuation">,</span>
+ <span class="token property">"type"</span><span class="token operator">:</span><span class="token string">"Concept"</span><span class="token punctuation">,</span>
+ <span class="token property">"followers"</span><span class="token operator">:</span><span class="token string">"https://skohub.io/followers?subject=hbz%2Fvocabs-edu%2Fheads%2Fmaster%2Fw3id.org%2Fclass%2Fesc%2Fn0322"</span><span class="token punctuation">,</span>
+ <span class="token property">"inbox"</span><span class="token operator">:</span><span class="token string">"https://skohub.io/inbox?actor=hbz%2Fvocabs-edu%2Fheads%2Fmaster%2Fw3id.org%2Fclass%2Fesc%2Fn0322"</span><span class="token punctuation">,</span>
+ <span class="token property">"prefLabel"</span><span class="token operator">:</span><span class="token punctuation">{</span>
+ <span class="token property">"en"</span><span class="token operator">:</span><span class="token string">"Library, information and archival studies"</span>
+ <span class="token punctuation">}</span><span class="token punctuation">,</span>
+ <span class="token property">"notation"</span><span class="token operator">:</span><span class="token punctuation">[</span>
+ <span class="token string">"0322"</span>
+ <span class="token punctuation">]</span><span class="token punctuation">,</span>
+ <span class="token property">"broader"</span><span class="token operator">:</span><span class="token punctuation">{</span>
+ <span class="token property">"id"</span><span class="token operator">:</span><span class="token string">"https://w3id.org/class/esc/n032"</span><span class="token punctuation">,</span>
+ <span class="token property">"prefLabel"</span><span class="token operator">:</span><span class="token punctuation">{</span>
+ <span class="token property">"en"</span><span class="token operator">:</span><span class="token string">"Journalism and information"</span>
+ <span class="token punctuation">}</span>
+ <span class="token punctuation">}</span><span class="token punctuation">,</span>
+ <span class="token property">"inScheme"</span><span class="token operator">:</span><span class="token punctuation">{</span>
+ <span class="token property">"id"</span><span class="token operator">:</span><span class="token string">"https://w3id.org/class/esc/scheme"</span><span class="token punctuation">,</span>
+ <span class="token property">"title"</span><span class="token operator">:</span><span class="token punctuation">{</span>
+ <span class="token property">"en"</span><span class="token operator">:</span><span class="token string">"Educational Subjects Classification"</span>
+ <span class="token punctuation">}</span>
+ <span class="token punctuation">}</span>
+<span class="token punctuation">}</span></code></pre></div>
+<p>Besides the usual SKOS properties, the <code class="language-text">followers</code> key gives a hint that I can somehow follow this subject. Clicking on the associated URL, I will see a JSON file containing the list of followers of this subject. I am also interested in this topic and want to follow it to receive notifications about new online resources that are published and tagged with this subject. How do I achieve this?</p>
+<p>As already noted, what I need is an application that speaks ActivityPub. In this case we will use one of the most popular services in the <a href="https://en.wikipedia.org/wiki/Fediverse">Fediverse</a>: Mastodon. So, I open up my Mastodon client and put the topic URI into the search box:</p>
+<p><figure>
+ <img src="/static/7ac0ab406412da9d354db3ca10b81ac9/b1a44/subscribe.png" alt="Screenshot of a Mastodon search result for a topic URL with adjacent subscribe button" />
+ <figcaption>Screenshot of a Mastodon search result for a topic URL with adjacent subscribe button</figcaption>
+</figure></p>
+<p>I click on the follow button and am now following this subject with my Mastodon account and will receive any updates posted by it.</p>
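+<p>Under the hood this follow is, roughly speaking, an ActivityPub server-to-server interaction: my Mastodon instance sends a <code class="language-text">Follow</code> activity to the inbox of the subject, which then records me as a follower. The sketch below only illustrates the general shape of such an activity; the activity id and the follower account are made up.</p>
+<div class="gatsby-highlight" data-language="js"><pre class="language-js"><code class="language-js">// Rough shape of the Follow activity a Mastodon server would send to the subject's inbox.
+// The "id" and "actor" values are made up; "object" is the SKOS concept being followed.
+const follow = {
+  "@context": "https://www.w3.org/ns/activitystreams",
+  id: "https://mastodon.example/activities/12345",
+  type: "Follow",
+  actor: "https://mastodon.example/users/alice",
+  object: "https://w3id.org/class/esc/n0322",
+}</code></pre></div>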
+<h2>Describing and announcing a resource with SkoHub Editor</h2>
+<p>Let’s now switch into the role of a scholar, teacher, tutor or generally interested person who has created an instructive online resource and wants to publish it to all people interested in the topic of “Library, information and archival studies”. In this case, I published a <a href="http://blog.lobid.org/2020/01/29/skohub-talk-at-swib19.html">blog post about a talk at SWIB19 – Semantic Web in Libraries Conference</a> and want to share it with others. I somehow need to send a message to the topic’s inbox; in this case I am using the SkoHub Editor (but it could be any other ActivityPub client, or even the command line, from which I publish). For the best user experience I download the SkoHub browser extension (<a href="https://addons.mozilla.org/firefox/addon/skohub-extension/">Firefox</a>, <a href="https://chrome.google.com/webstore/detail/skohub/ghalhmcgaicdcpmdicinaegnoanfmggd">Chrome</a>).</p>
+<p>As the default JSON schema uses another classification, we first have to configure the editor based on a schema that actually makes use of the Educational Subjects Classification. For this, we created a <a href="https://raw.githubusercontent.com/dini-ag-kim/lrmi-profile/useEsc4Subjects/draft/schemas/schema.json">version of the default schema</a> that does so. Now I put it into the extension’s settings:</p>
+<p><figure>
+ <img src="/static/0c9af9bcdba57992a010aab21454ead2/f058b/configure-extension.png" alt="Screenshot of how to configure a custom schema in the SkoHub Editor extension for Firefox" />
+ <figcaption>Screenshot of how to configure a custom schema in the SkoHub Editor extension for Firefox</figcaption>
+</figure></p>
+<p>Then, I fire up the extension when visiting the web page I’d like to share and add data to the input form:</p>
+<p><figure>
+ <img src="/static/492b5e6e95c98c3719cf62258f6ff200/f058b/describing.png" alt="Describing a resource with the SkoHub Editor browser extension" />
+ <figcaption>Describing a resource with the SkoHub Editor browser extension</figcaption>
+</figure></p>
+<p>I select the topic “Library, information and archival studies” from the suggestions in the “subject” field, add information on licensing etc. and click “Publish”. A pop-up lets me know that the resource has been published to “Library, information and archival studies”. In the background, the description of the resource is sent to the respective topic (there could be more than one), which distributes the information to all of its subscribers. Thus, as a subscriber of the topic, I will eventually receive a notification about the resource in my Mastodon timeline:</p>
+<p><figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 472px; "
+ >
+ <a
+ class="gatsby-resp-image-link"
+ href="/static/158048ebfeeb80ac9314a6beda41e5d6/3c5de/toot.png"
+ style="display: block"
+ target="_blank"
+ rel="noopener"
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 36.075949367088604%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAAHCAYAAAAIy204AAAACXBIWXMAAAsTAAALEwEAmpwYAAABdklEQVQoz3WMvZKTYBiFuRVndsxHAoHwFwKbP34CYQOEYDbJ7kYbG5tttPQCLLfayrH3KrwBWys7b8BLeBwYHNNYPHPOO+e8RwrTPXl1pqheM13kaOaSkR2iW0HHpf9H07nEGMdcBxWS5UZ40zXmOMJwQjRzgW41o0Grf+9LRvYSdTRvUfQZij5FNebtsPTp6TPfvv/k7bv33JQHsuLAZnuiqO54dXhDXh7YlEfq/ZmiPFLVD9T7B/zZGsuNcbwYaxyg6D4DzUd6/vKVH79+8/jhI3FatU/l7p5sU7Op7lnlJ+bJvtUoP7FIaoLswKo8s1zfkm7PJMUdpp8iVA/JvU6I0x22v2KW3uJHO5bZkeDmhLHY8tJOEHaMcBKEkyKsztsrhBnSM8LW992MwWiGdCVMXlxpiKGPOskQxhLZilpVxyv6wwnywEb0rY5LbyG6rCebyKqLJJQxDc2jbs6RFYfBcIKieejmos3koUev6/2PJpfVCX8A+bf0AFYI6eEAAAAASUVORK5CYII='); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+ alt="The toot announcing a resource newly published to a SkoHub topic"
+ title="The toot announcing a resource newly published to a SkoHub topic"
+ src="/static/158048ebfeeb80ac9314a6beda41e5d6/3c5de/toot.png"
+ srcset="/static/158048ebfeeb80ac9314a6beda41e5d6/c26ae/toot.png 158w,
+/static/158048ebfeeb80ac9314a6beda41e5d6/6bdcf/toot.png 315w,
+/static/158048ebfeeb80ac9314a6beda41e5d6/3c5de/toot.png 472w"
+ sizes="(max-width: 472px) 100vw, 472px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </a>
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">The toot announcing a resource newly published to a SkoHub topic</figcaption>
+ </figure></p>
+<h2>Protocols and implementation</h2>
+<p>The SkoHub-PubSub server is built in <a href="https://nodejs.org/en/">Node.js</a> and implements a subset of <a href="http://activitypub.rocks/">ActivityPub</a>, <a href="https://www.w3.org/TR/ldn/">Linked Data Notifications</a> and <a href="https://docs.joinmastodon.org/spec/webfinger/">Webfinger</a> to achieve the behavior described above. On the ActivityPub side, Server to Server <a href="https://www.w3.org/TR/activitypub/#follow-activity-inbox">Follow</a> and corresponding <a href="https://www.w3.org/TR/activitypub/#undo-activity-inbox">Undo</a> interactions can be received to handle the subscription mechanism. Non-activity messages are considered Linked Data Notifications and can simply be sent to the inbox of a subject using a <a href="https://www.w3.org/TR/ldn/#sender">POST request</a> with any JSON body. These notifications are treated as metadata payloads, wrapped in a <code class="language-text">Create</code> activity and distributed to every follower of the corresponding subject, again using ActivityPub.</p>
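+<p>To make this more concrete, here is a minimal sketch (not taken from the SkoHub-PubSub code base) of how a publishing client could send such a notification. It assumes that the subject URI resolves to a JSON-LD document exposing an <code class="language-text">inbox</code> property, as ActivityPub actors and Linked Data Notification receivers do; the subject and resource URLs are made up:</p>
+<div class="gatsby-highlight" data-language="text"><pre class="language-text"><code class="language-text">// Hypothetical subject URI; a real one would come from a vocabulary published with SkoHub Vocabs.
+const subject = 'https://example.org/vocab/some-concept'
+
+// Discover the inbox advertised by the subject (ActivityPub actor / LDN receiver).
+fetch(subject, { headers: { accept: 'application/ld+json' } })
+  .then(response => response.json())
+  .then(actor =>
+    // Send the metadata payload as a Linked Data Notification via POST.
+    fetch(actor.inbox, {
+      method: 'POST',
+      headers: { 'Content-Type': 'application/ld+json' },
+      body: JSON.stringify({
+        '@context': 'http://schema.org/',
+        id: 'https://example.org/some-resource', // made-up resource URL
+        name: 'An example resource'
+      })
+    })
+  )</code></pre></div>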
+<p>As for the internals, <a href="https://www.mongodb.com/">MongoDB</a> is used to manage followers lists and an <a href="https://www.elastic.co/elasticsearch/">Elasticsearch</a> index is used to keep an archive of all payloads that have been distributed. This archive can be used to search and further explore metadata that has been distributed, e.g. by visualizing the distribution of subjects across all payloads.</p>
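+<p>Exploring the archive then boils down to ordinary Elasticsearch queries. The following sketch counts how often each subject occurs across all archived payloads; the index name and field path are assumptions made for illustration, not the actual mapping used by SkoHub-PubSub:</p>
+<div class="gatsby-highlight" data-language="text"><pre class="language-text"><code class="language-text">// Count how often each subject URI occurs in the archived payloads.
+// 'notifications' and 'object.about.id' are assumed names, not SkoHub's actual mapping.
+fetch('http://localhost:9200/notifications/_search', {
+  method: 'POST',
+  headers: { 'Content-Type': 'application/json' },
+  body: JSON.stringify({
+    size: 0,
+    aggs: {
+      subjects: { terms: { field: 'object.about.id.keyword', size: 20 } }
+    }
+  })
+}).then(response => response.json())
+  .then(result => console.log(result.aggregations.subjects.buckets))</code></pre></div>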
+<p>The most challenging aspects of the implementation were gaining an understanding of <a href="https://github.com/hbz/skohub-pubsub/issues/27">Webfinger</a> for user discovery and working out the details of message signatures and how to validate them. <a href="https://blog.joinmastodon.org/2018/06/how-to-implement-a-basic-activitypub-server/">“How to implement a basic ActivityPub server”</a> provided good guidance here!</p>
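+<p>For reference, the Webfinger part of user discovery is just a single GET request against the well-known endpoint, roughly as in the following sketch; the host and account name are made up:</p>
+<div class="gatsby-highlight" data-language="text"><pre class="language-text"><code class="language-text">// Resolve an account to its ActivityPub actor URI via Webfinger (host and account are made up).
+const account = 'acct:some-topic@example.org'
+
+fetch('https://example.org/.well-known/webfinger?resource=' + encodeURIComponent(account))
+  .then(response => response.json())
+  .then(jrd => {
+    const self = jrd.links.find(link => link.rel === 'self')
+    console.log(self.href) // URI of the actor document to follow
+  })</code></pre></div>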
+<h2>Outlook</h2>
+<p>We currently consider PubSub the least mature component of SkoHub. In the future, we would like to validate incoming Linked Data Notifications against a JSON schema that should be specific enough to ensure a consistent experience when viewing them e.g. in Mastodon but flexible enough to support additional use cases. We would also like to support <a href="https://github.com/hbz/skohub-pubsub/issues/38">ActivityPub on the publication side</a> and <a href="https://www.w3.org/TR/activitypub/#announce-activity-inbox">Announce</a> activities in order to enable use cases such as <a href="https://github.com/hbz/skohub-pubsub/issues/37">mentioning a SkoHub concept on Mastodon</a>. We would really value your input on this!</p>
+<p><a href="https://blog.skohub.io/2020-03-31-skohub-editor/">https://blog.skohub.io/2020-03-31-skohub-editor/</a> (Tue, 31 Mar 2020)</p>
+<hr>
+<p>ⓘ <strong>Update, 2022-03-01</strong>: <em>Due to a lack of resources for maintenance, we decided to shut down the SkoHub Editor demo for an indefinite time. However, the <a href="https://github.com/skohub-io/skohub-editor">code</a> is still there for anybody to set up their own instance.</em></p>
+<hr>
+<p>In a <a href="http://blog.lobid.org/2019/09/27/presenting-skohub-vocabs.html">previous blog post</a> we presented a first SkoHub module: <em>SkoHub Vocabs</em>. Before talking about another module, here is a short summary of the features SkoHub Vocabs offers. Basically, it provides an editorial workflow to publish a SKOS vocabulary on the web, which can then be consumed by humans and applications. It builds on git-based online software development platforms (currently GitHub and GitLab are supported) where you maintain a SKOS vocabulary as a Turtle file. This allows you to use all the associated features such as branches and pull requests for a full-fledged review process. With every new commit to a branch, a webhook triggers SkoHub Vocabs to build a static site for the vocab – with HTML for human consumption and JSON-LD for consumption by applications.</p>
+<p>In this post, we present <em>SkoHub Editor</em> (<a href="https://skohub.io/editor/"><del>demo</del></a>, <a href="https://github.com/skohub-io/skohub-editor">code</a>) that is accompanied by a browser extension. In a nutshell, SkoHub Editor enables the automatic generation of a web form based on a JSON schema, along with the possibility to look up terms in a controlled vocabulary that is published with SkoHub Vocabs. Additionally, metadata generated by the editor can be published using SkoHub PubSub, which we will describe in an upcoming post. Let’s take a look at the specifics by configuring an editor that lets you create JSON-LD describing an open educational resource (OER) on the web.</p>
+<h2>Describing a resource with the browser extension</h2>
+<p>Let’s start with actually using SkoHub Editor. You will have the most comfortable experience when using the SkoHub browser extension that wraps the SkoHub Editor and pre-populates some fields in the web form. The browser extension is available both for <a href="https://addons.mozilla.org/en-US/firefox/addon/skohub-extension/">Firefox</a> and <a href="https://chrome.google.com/webstore/detail/skohub/ghalhmcgaicdcpmdicinaegnoanfmggd">Chrome</a>. Just add the extension to your browser and a little icon will be shown on the right-hand side of your navigation bar:</p>
+<p><figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 630px; "
+ >
+ <a
+ class="gatsby-resp-image-link"
+ href="/static/2b8d11bca715d365580506508d352f7a/63ec5/extension-icon.png"
+ style="display: block"
+ target="_blank"
+ rel="noopener"
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 15.18987341772152%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAADCAIAAAAcOLh5AAAACXBIWXMAAAsTAAALEwEAmpwYAAAAjElEQVQI1zXNQRKDIAxAUe9/mfYGHa1oSQIoISqChaXH6NgZ3+rvfiOBObBn75zVoBHRGIMAiEhEiMAw+Ns8TRJkTynlTETNi0CprlfdMPZk0Bia/ph59v6KIMtNllWWa7cya62b/fiKCCLGGHPOpZRaLtu2lVLO86y1ppScc2QtqMFbM7/H+Hx82vYHYBSTRpcqu6YAAAAASUVORK5CYII='); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+ alt="The SkoHub extension icon in between other extensions in the Firefox nav bar"
+ title="The SkoHub extension icon in between other extensions in the Firefox nav bar"
+ src="/static/2b8d11bca715d365580506508d352f7a/f058b/extension-icon.png"
+ srcset="/static/2b8d11bca715d365580506508d352f7a/c26ae/extension-icon.png 158w,
+/static/2b8d11bca715d365580506508d352f7a/6bdcf/extension-icon.png 315w,
+/static/2b8d11bca715d365580506508d352f7a/f058b/extension-icon.png 630w,
+/static/2b8d11bca715d365580506508d352f7a/63ec5/extension-icon.png 812w"
+ sizes="(max-width: 630px) 100vw, 630px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </a>
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">The SkoHub extension icon in between other extensions in the Firefox nav bar</figcaption>
+ </figure></p>
+<p>While having any web page open, you can now open the SkoHub editor in your browser to describe that web resource. Let’s use as an example the YouTube video <a href="https://www.youtube.com/watch?v=ZaiDDOZcaqc">“COVID-19 – 6 Dangerous Coronavirus Myths, Busted by World Health Organization”</a> published recently by the World Economic Forum under a CC-BY license. Open the video in your browser, click on the extension and you will see that several fields are automatically filled out.</p>
+<p><figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 630px; "
+ >
+ <a
+ class="gatsby-resp-image-link"
+ href="/static/ec0c8783a270bd477ef034281c1c4796/29114/prepopulated-webform.png"
+ style="display: block"
+ target="_blank"
+ rel="noopener"
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 56.32911392405063%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAALCAIAAADwazoUAAAACXBIWXMAAAsTAAALEwEAmpwYAAACnklEQVQozx2O20tTcQDH94dED0UkEgWCDz1pRih4qSlakTM3NufOfTvt7JwtZ7XpdDNTRNLhjRQ1MwozgyhXim2agUG99JCXqbv9ds7Z2dntnJWxvnwePi8f+CpAMHh8EARxFgDAcxzHchzL8nwiEksAwLEgHo8DlmUzydT473X069zDwJL146J2ec7+7o0ik82OPBnQaGCoVU+p1HdUBo/D9XxqZG6UnO9D7RTdDtPTE5Mz49OvAh9f//o2/90/s+N/trPl3VhTJASBMZLNevOthtaWq9XllTcxtfbtcOMMeX6BvrDQV61WaVyPeqyMfcu/mUlnU6lMVpJFWT4MRwqxlSBhi7NNg7Zcq6lW3tVhTrOmYUJ3bh4vnsWKKG2jy+FBDIZAYCMhCFEAYgDwPH8cCisSCYHu7CE63Eab29HtacOdZdftVUpCXVlqqynxQGrMyDBd/QaTfTuwHY2B/WAweHTEcnw4Ei3ErsFR99CYe2hscXkFMXbV3tD39fQyhFFTq3RCkNc7QXcPKpuxFy/f5/N5PiEISTGdziQEoXB7xbfu+7n7eY/9sscu+X90Ogddzseq+iZtfR2hQ6z4vfomXZ1S9WF1TZIkUUyl0ulsLieIogLEYhTF9Hunn07NDgyPeCdnb2sdJWXa0tKqi6dPFV+qvXxFdfZMUUV5hc/3SRTFcCTK8byQTPKCoOB4HqWsFoq20VbGQnfYu0xmB0E+YOwek4kmzZ1Wey9JUpABXl31JUUxCgDL8XGWY3leIcv5zRB7zArZnCTLspTLyVJOlqW8LJ0U9vfk5M9/OZFkeXf/YD94eBQKRWIgKaYUMNYOwQYUNyG4EcOJAoQRw3EMR1EcQVAEQRAMRwwQBGOEAUZJs4Wx3dfqEbVa8w+jHqtDtRlU5QAAAABJRU5ErkJggg=='); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+ alt="SkoHub extension in the sidebar of the browser with fields 'URL', 'Title' and 'Description' being pre-populated"
+ title="SkoHub extension in the sidebar of the browser with fields 'URL', 'Title' and 'Description' being pre-populated"
+ src="/static/ec0c8783a270bd477ef034281c1c4796/f058b/prepopulated-webform.png"
+ srcset="/static/ec0c8783a270bd477ef034281c1c4796/c26ae/prepopulated-webform.png 158w,
+/static/ec0c8783a270bd477ef034281c1c4796/6bdcf/prepopulated-webform.png 315w,
+/static/ec0c8783a270bd477ef034281c1c4796/f058b/prepopulated-webform.png 630w,
+/static/ec0c8783a270bd477ef034281c1c4796/40601/prepopulated-webform.png 945w,
+/static/ec0c8783a270bd477ef034281c1c4796/78612/prepopulated-webform.png 1260w,
+/static/ec0c8783a270bd477ef034281c1c4796/29114/prepopulated-webform.png 1920w"
+ sizes="(max-width: 630px) 100vw, 630px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </a>
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">SkoHub extension in the sidebar of the browser with fields 'URL', 'Title' and 'Description' being pre-populated</figcaption>
+ </figure></p>
+<p>We can now add additional metadata by selecting a type (<code class="language-text">VideoObject</code> in this case) and adding a creator, creation date, language etc. As we mentioned, you can look up values from a controlled vocabulary for some fields in the web form. You will experience this when inputting content into the fields “Subject”, “License”, “Learning Resource Type”, and “Intended Audience”. For those fields you will get a drop-down with suggestions from a controlled vocabulary, e.g. for “Subject” from a German <a href="https://w3id.org/kim/hochschulfaechersystematik/scheme">classification</a> of subjects in higher education that is published with SkoHub Vocabs.</p>
+<p><figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 458px; "
+ >
+ <a
+ class="gatsby-resp-image-link"
+ href="/static/6417112bfbd5cb40999e9ce1d97b7b82/f7a31/auto-suggestion-from-skos-vocab.png"
+ style="display: block"
+ target="_blank"
+ rel="noopener"
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 79.11392405063292%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAAQCAYAAAAWGF8bAAAACXBIWXMAAAsTAAALEwEAmpwYAAACaUlEQVQ4y6WUW2+bQBCF/f//VNWoVaXIMZj7slwXbO5rcOLEjh9ONQMkVaM+1dLR7gIevj1nlk3btpBSoqkb/M/v9n7H+HzBZhxHqEIhzmJUbY2mbzGMGnrUPK7S/xDdO00ab7cbppdXbK7XK6bzhJ+Pv/Dt4QHff/6AE/hwQwHvD9HaFasCHum65XvYWQam8xnnyxs29/sdWmvYngvTtmG5DgzLgmHt8fi0xdYwsDV2eDINmLYFw7awdxwed3sTYRwjKwr0esSZCOu6RhAEUGWFrDiwUlWyoiSDkDFElCDOFJK84Ov58gzdG8aZrO31XLDXJ6R5gSjN4XgBHJ8kuIhMUsRZzoX9UHJRT0iIKEaqCvasP03o9AlNP8wFm7aDCCWyooSMU8gkQ5zm/IckU0ySqRJJrpiMitNclUe0wwndcOKx6ZaCbdNACAF1qGDaDmtnWnCDEG4gsHc8WI7HJC9vNzy/Xlm0TSJcRZScctcPcFwP5bGe8emtepzfzlvRqLthuT7f+9BK12s0XT8THqsa252JsmrYP9cXTBTGCcI4RZRmi4cRe0je+ouPBLGSfhAOwwClFPLyyJ5ROKt3KXtXcGhkSVnV/Bx1Aq2PTfeVkI9eFHEggYznYJY50dFIRHSsLtd31urhp48X3j4T1m0HY29zcpQsJbr2Y7ZQzuuSSYmYiFYPV78/Uu76HjKKeZtioSJCmocLKfWeJ0ImpX6kkOb+G78WHPqeT8qhahDICEEYwQtC7kWioSD09MxbpbYh/d0ytP4MRWuocm7Y9ZTYnv9JJCRbQSFQM7NovigvD3yaqqblj8NvkLCbgQ33WlAAAAAASUVORK5CYII='); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+ alt="The string 'gesund' is input in the subject field and several entries with this string from a controlled vocabulary are suggested."
+ title="The string 'gesund' is input in the subject field and several entries with this string from a controlled vocabulary are suggested."
+ src="/static/6417112bfbd5cb40999e9ce1d97b7b82/f7a31/auto-suggestion-from-skos-vocab.png"
+ srcset="/static/6417112bfbd5cb40999e9ce1d97b7b82/c26ae/auto-suggestion-from-skos-vocab.png 158w,
+/static/6417112bfbd5cb40999e9ce1d97b7b82/6bdcf/auto-suggestion-from-skos-vocab.png 315w,
+/static/6417112bfbd5cb40999e9ce1d97b7b82/f7a31/auto-suggestion-from-skos-vocab.png 458w"
+ sizes="(max-width: 458px) 100vw, 458px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </a>
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">The string 'gesund' is input in the subject field and several entries with this string from a controlled vocabulary are suggested.</figcaption>
+ </figure></p>
+<p>Currently, only the fields “URL”, “Type” and “Title” are obligatory; all other fields are optional. When you think you have described the resource sufficiently, you can click on “Show Preview” in the extension, copy the JSON-LD to the clipboard and include it in the HTML of any web page within a <code class="language-text"><script type="application/ld+json"></code> tag – an illustrative example follows after the screenshot below.</p>
+<p><figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 486px; "
+ >
+ <a
+ class="gatsby-resp-image-link"
+ href="/static/70707507ae9efdd0f829cba5b111f2fd/4ee7f/json-preview.png"
+ style="display: block"
+ target="_blank"
+ rel="noopener"
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 125.9493670886076%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAAZCAYAAAAxFw7TAAAACXBIWXMAAAsTAAALEwEAmpwYAAADjElEQVQ4y51V2XbjNgzN//9XX9qm2Sb2xI4d2bJ2i7I27tRtAWW3Zx7Kc3BESSRwiXsBXg1SwzgHYy+Yc9DWQhvL7/R8N2tf11le13Y9Tl2Pq1EZ/G70/YA0L7A/JEjSDHlZsWVFiSTLcUgzZHmOLMtQ1YIcat44TdOZ0aiFwOJpheubW6yfN1g+rfHwuMCPxU/cPTzi+vYeq/Uz0jRFeRQzwtlBuIjQBWAYJaTS0NZhlIrn4zhCSslPYwziwwFV3eBKagPnHKQyUMpDSgetPJybMHmLMDSAkYAZMY0toIfZvo19HM8OR22A4KCrFE2coHnZoEsSqEbA1wnMyx18V8MPJ/i2hO8F3KkEJYRtOnNoMVkFvb2FTVYwux9whyXsyw3s4Sf05h/Y7Bm+yeGqPWzxgjCe4ModXBnBEepzhB62FTjuUog4hZM9bLyE3t7AZhu4fAubrvhdb645CAUORr6T9+5QGoegBpjnv2EJWbxgow1mv2B0tojgmgKhF4zOd0f4tuKT/SKHHrouIbIaRmv4oeEgZLTRa4lASggek3dsb+MMIefQWxhCFt3B7h5g9w/8dCKFq2J4kTEhc952Mzpy/kmvHw6pUqYALSqIaIfTdo02jiGiCEMnEayaHXY1S4iMCAqqnxHiIssaenOLfruETCOoKke/voepErhjDL3+C3r1J2y6ZkI05TvfwBURgtXfHBqHySio5R+wuzu49Al2dw8b3cLlG9YhpYMYdnUys15EMCSzdM3HJ5zfHMpZCvESJnqAofyRqOMlbL7lOf2nfBL7dGT6TicL56TMlWKaI6p9gWJXQnUtow6y42Bz7k7MOjkngoLsEfQIr8ZfsEyao6PGC7jihXNkonu4Jodvilk+Inv9t4U7Hrhy6BvV3z4+fGXZnGqIfQKxi2FORw7ABBye5rIjlt8K99M4Z/m1H/oAlHmPLOmgdIAaOpixB7zF5F5b3Ieaz/rmVx3S0mmCMQpWWwythNIz+yRuPu6phD8V3HWmz0jPHeov0Jt6xH5zRNcbhLZkDXI9UyfaP8IkK/ixhadAFCRcrJTpldERXdVAJAU8sSs7bgTEJlUG9UJCy98JLeV2aC8IW49zz6PNg0AQh3kxsXqMec5oqARp3tVznfN79k3Y2swdxFm+M/7viD9kM+dwGCT6doBRDtY4aGlhtONa9dwH27mljS3r9vNN+XbdNv/dy/8CJkl96BetmE0AAAAASUVORK5CYII='); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+ alt="Preview of the structured JSON data in the SKoHub extension"
+ title="Preview of the structured JSON data in the SKoHub extension"
+ src="/static/70707507ae9efdd0f829cba5b111f2fd/4ee7f/json-preview.png"
+ srcset="/static/70707507ae9efdd0f829cba5b111f2fd/c26ae/json-preview.png 158w,
+/static/70707507ae9efdd0f829cba5b111f2fd/6bdcf/json-preview.png 315w,
+/static/70707507ae9efdd0f829cba5b111f2fd/4ee7f/json-preview.png 486w"
+ sizes="(max-width: 486px) 100vw, 486px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </a>
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">Preview of the structured JSON data in the SKoHub extension</figcaption>
+ </figure></p>
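+<p>For illustration, the JSON-LD pasted into such a <code class="language-text">script</code> tag could look roughly like the following. The <code class="language-text">@context</code> matches the default schema shown further below; the remaining values are shortened and purely exemplary:</p>
+<div class="gatsby-highlight" data-language="text"><pre class="language-text"><code class="language-text">{
+  "@context": {
+    "id": "@id",
+    "type": "@type",
+    "@vocab": "http://schema.org/",
+    "skos": "http://www.w3.org/2004/02/skos/core#",
+    "prefLabel": "skos:prefLabel",
+    "inScheme": "skos:inScheme",
+    "Concept": "skos:Concept"
+  },
+  "id": "https://www.youtube.com/watch?v=ZaiDDOZcaqc",
+  "type": "VideoObject",
+  "name": "COVID-19 – 6 Dangerous Coronavirus Myths, Busted by World Health Organization"
+}</code></pre></div>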
+<p>Using the <a href="http://blog.lobid.org/2019/05/17/skohub.html#subject-specific-subscription-to-web-resources">content subscription & publication features of SkoHub</a>, you can furthermore publish the resource via SkoHub PubSub (to be covered in detail in an upcoming post).</p>
+<h2>Configuring the web form with JSON Schema</h2>
+<p>As said above, the SkoHub Extension wraps the SkoHub Editor running at <a href="https://skohub.io/editor/">https://skohub.io/editor/</a>. SkoHub Editor is configured with a <a href="https://json-schema.org/understanding-json-schema/">JSON schema</a> document that is used both to generate appropriate form inputs and to validate the entered content. Thus, the JSON Schema is the central, most crucial part when working with SkoHub Editor. Currently, we are using as default schema a <a href="https://dini-ag-kim.github.io/lrmi-profile/draft/schemas/schema.json">draft schema for OER</a> we created using relevant properties and types from <a href="https://schema.org">schema.org</a>. By providing the link to the schema as a URL parameter, we can load the <a href="https://skohub.io/editor/?schema=https://dini-ag-kim.github.io/lrmi-profile/draft/schemas/schema.json">web form</a> you already know from the browser extension. Of course, you can just write your own schema to build a web form for your use case.</p>
+<p>Let’s take a short look at the underlying schema, which we tried to keep as straightforward as possible. Generally, with JSON schema you can specify a number of optional or mandatory properties and what type of input each expects. The <code class="language-text">"title"</code> of each property will be used as the label for the field in the web form.</p>
+<div class="gatsby-highlight" data-language="text"><pre class="language-text"><code class="language-text">{
+ "properties": {
+ "name": {
+ "title": "Title",
+ "type": "string"
+ }
+ }
+}</code></pre></div>
+<p>It is also possible to allow only values from a predefined list (an <code class="language-text">enum</code>), which the editor will render as a drop-down:</p>
+<div class="gatsby-highlight" data-language="text"><pre class="language-text"><code class="language-text">{
+ "type": {
+ "title": "Type",
+ "type": "string",
+ "enum": [
+ "AudioObject",
+ "Book",
+ "Course",
+ "CreativeWork",
+ "DataDownload",
+ "ImageObject",
+ "PresentationDigitalDocument",
+ "SoftwareApplication",
+ "VideoObject"
+ ]
+ }
+}</code></pre></div>
+<p>Such lists of allowed values can be considered controlled vocabularies, and ideally they should be shared across many data sources. This is where SkoHub Vocabs comes into play. Instead of embedding the list of allowed values into our schema, we can reference a SKOS vocabulary on the web:</p>
+<div class="gatsby-highlight" data-language="text"><pre class="language-text"><code class="language-text">{
+ "about": {
+ "title": "Subject",
+ "type": "array",
+ "items": {
+ "type": "object",
+ "properties": {
+ "inScheme": {
+ "type": "object",
+ "properties": {
+ "id": {
+ "type": "string",
+ "enum": [
+ "https://w3id.org/kim/hochschulfaechersystematik/scheme"
+ ]
+ }
+ }
+ }
+ },
+ "_widget": "SkohubLookup"
+ }
+ }
+}</code></pre></div>
+<p>Notice the custom key <code class="language-text">_widget</code> in the JSON schema. This will configure the editor to use the specified UI element for the given field. In our example, the <code class="language-text">SkohubLookup</code> widget is used, which works with all controlled vocabularies that are published with SkoHub Vocabs. All custom JSON schema extensions start with an underscore <code class="language-text">_</code> and are used to control the look and feel of the editor; see below for an example of how to hide a field in the form.</p>
+<p>Finally, to make our data JSON-LD, we also set a mandatory <code class="language-text">@context</code> property and a default object value for the <code class="language-text">@context</code>. This makes the editor add it to the document without any user interaction needed.</p>
+<div class="gatsby-highlight" data-language="text"><pre class="language-text"><code class="language-text">{
+ "$schema": "http://json-schema.org/draft-07/schema#",
+ "title": "OER",
+ "description": "This is a generic JSON schema for describing an Open Educational Resource with schema.org",
+ "type": "object",
+ "default": {
+ "@context": {
+ "id": "@id",
+ "type": "@type",
+ "@vocab": "http://schema.org/",
+ "skos": "http://www.w3.org/2004/02/skos/core#",
+ "prefLabel": "skos:prefLabel",
+ "inScheme": "skos:inScheme",
+ "Concept": "skos:Concept"
+ }
+ },
+ "properties": {
+ "@context": {
+ "type": "object",
+ "additionalProperties": true,
+ "_display": {
+ "className": "hidden"
+ }
+ }
+  }
+}</code></pre></div>
+<h2>Implementation</h2>
+<p>Of course you can also poke around the editor while running it locally:</p>
+<div class="gatsby-highlight" data-language="text"><pre class="language-text"><code class="language-text">$ git clone https://github.com/hbz/skohub-editor.git
+$ cd skohub-editor
+$ npm install</code></pre></div>
+<p>As is the case with SkoHub Vocabs, the editor is implemented in <a href="https://reactjs.org/">React</a>. The form components are located in <code class="language-text">src/components/JSONSchemaForm</code>. In a nutshell, a <code class="language-text">Form</code> provides data to the various input components:</p>
+<div class="gatsby-highlight" data-language="text"><pre class="language-text"><code class="language-text"><Form
+ data={{title: 'A title'}}
+ onSubmit={console.log}
+>
+ <Input property="title" />
+ <Textarea property="description" />
+ <button type="submit">Publish</button>
+</Form></code></pre></div>
+<p>Obviously it would be tedious to manually code all the inputs for a given schema. This is where the <code class="language-text">Builder</code> comes into play. It reads a schema and creates all necessary input components:</p>
+<div class="gatsby-highlight" data-language="text"><pre class="language-text"><code class="language-text"><Form
+ data={{title: ''}}
+ onSubmit={console.log}
+>
+ <Builder schema={{
+ "$schema": "http://json-schema.org/draft-07/schema#",
+ "title": "My JSON schema",
+ "type": "object",
+ "properties": {
+ "title": {
+ "type": "string",
+ "title": "Title"
+ },
+ "description": {
+ "type": "string",
+ "title": "Description",
+ "_widget": "Textarea"
+ }
+ }
+ }} />
+ <button type="submit">Publish</button>
+</Form></code></pre></div>
+<p>The browser extension is essentially a simple wrapper for the editor running at <a href="https://skohub.io/editor/">https://skohub.io/editor/</a>, which the extension injects as an <code class="language-text">iframe</code> into the current page. Additionally, before the iframe is injected, some metadata is scraped from that page. This data is used to pre-populate the editor. This process obviously depends both on the data found in the web page and on the schema the editor is configured to use. YouTube for example uses <code class="language-text">meta name="description"</code> for data about YouTube itself rather than the actual video, which is described in <code class="language-text">meta property="og:description"</code>. Even if the correct metadata is extracted, there is no guarantee that the schema used to configure the editor even has a <code class="language-text">description</code> field. In the future, it would be nice to find a way to map page metadata to properties in the schema itself.</p>
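+<p>A simplified sketch of that scraping step – not the extension’s actual code – might prefer Open Graph tags and fall back to generic meta tags:</p>
+<div class="gatsby-highlight" data-language="text"><pre class="language-text"><code class="language-text">// Simplified sketch of collecting page metadata to pre-populate the editor.
+// This is only an illustration of the idea, not the extension's actual code.
+function scrapeMetadata(doc) {
+  const og = name =>
+    doc.querySelector(`meta[property="og:${name}"]`)?.getAttribute('content')
+  const meta = name =>
+    doc.querySelector(`meta[name="${name}"]`)?.getAttribute('content')
+
+  return {
+    id: og('url') || doc.location.href,
+    name: og('title') || doc.title,
+    description: og('description') || meta('description')
+  }
+}
+
+console.log(scrapeMetadata(document))</code></pre></div>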
+<h2>Outlook</h2>
+<p>SkoHub Editor already works very well and can be extremely useful. However, some things are still work in progress and will need some future effort to be improved:</p>
+<ul>
+<li><strong>Using schema.org markup for pre-population</strong>: This might sound obvious but we have not implemented it yet, see <a href="https://github.com/hbz/skohub-extension/issues/17">#17</a>.</li>
+<li><strong>Further issues</strong>: See also the issues at <a href="https://github.com/hbz/skohub-editor/issues">https://github.com/hbz/skohub-editor/issues</a> for further ideas for improvement.</li>
+</ul>
+<p>Furthermore, some work will have to be put into the current default schema and the controlled vocabularies it uses:</p>
+<ul>
+<li><strong>Develop JSON Schema</strong>: The JSON Schema definitely is not finished yet. For example, it makes sense to include <code class="language-text">http://schema.org/keywords</code> in the future for adding arbitrary tags to describe a resource. We plan to develop the schema within the common <a href="https://oerworldmap.org/resource/urn:uuid:fd06253e-fe67-4910-b923-51db9d27e59f">OER metadata group</a> of DINI AG KIM & Jointly with a focus on describing OER in the German-speaking world.</li>
+<li><strong>Improve Vocabularies</strong>: For “Learning Resource Type” and “Intended Audience” we are using controlled vocabularies that are not yet finished but still in development at the <a href="https://www.dublincore.org/groups/lrmi-task-group/">LRMI Task Group</a> of the Dublin Core Metadata Initiative (DCMI). Trying out the browser extension, you will for instance see that the educational resource types are missing some options. However, we assume that the combination of SkoHub Editor & SkoHub Vocabs makes a pretty nice environment for the development of these vocabularies in an open and transparent process on GitHub or GitLab.</li>
+</ul>
+<h2>Get involved</h2>
+<p>Please try it out and let us know what doesn’t work, which features you are missing, and also what you like about SkoHub. We are happy about every bug report, suggestion and feature request for the production version. Get in <a href="https://lobid.org/team-en/">contact</a> with us via a hypothes.is annotation, GitHub, Email, Mastodon or IRC.</p>
+<p><a href="https://blog.skohub.io/2020-01-29-skohub-talk-at-swib19/">https://blog.skohub.io/2020-01-29-skohub-talk-at-swib19/</a> (Wed, 29 Jan 2020)</p>
+<p>On November 27th 2019, Adrian Pohl and Felix Ostrowski (graphthinking) presented SkoHub at the “Semantic Web in Libraries” conference in Hamburg (<a href="http://swib.org/swib19/">SWIB19</a>).</p>
+<div class="gatsby-resp-iframe-wrapper" style="padding-bottom: 56.25%; position: relative; height: 0; overflow: hidden; margin-bottom: 1.0725rem" > <iframe src="https://www.youtube-nocookie.com/embed/9cmkKPC3jlo" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="" style=" position: absolute; top: 0; left: 0; width: 100%; height: 100%; "></iframe> </div>https://blog.skohub.io/2019-09-27-skohub-vocabs/https://blog.skohub.io/2019-09-27-skohub-vocabs/Fri, 27 Sep 2019 00:00:00 GMT<p>We are happy to announce that the SkoHub prototype outlined in our post <a href="http://blog.lobid.org/2019/05/17/skohub.html">“SkoHub: Enabling KOS-based content subscription”</a> is now finished. In a series of three post we will report on the outcome by walking through the different components and presenting their features.</p>
+<p>SkoHub is all about utilizing the power of Knowledge Organization Systems (KOS) to create a publication/subscription infrastructure for Open Educational Resources (OER). Consequently, publishing these KOS on the web according to the standards was the first area of focus for us. We are well aware that there are already plenty of Open Source tools to <a href="http://skosmos.org/">publish</a> and <a href="http://vocbench.uniroma2.it/">edit</a> vocabularies based on <a href="https://en.wikipedia.org/wiki/Simple_Knowledge_Organization_System">SKOS</a>, but these are usually monolithic database applications. Our own workflows often involve managing smaller vocabularies as <a href="https://github.com/hbz/lobid-vocabs">flat files on GitHub</a>, and <a href="https://github.com/dcmi/lrmi/tree/master/lrmi_vocabs">others</a> seem to also do so.</p>
+<p>We will thus start this series with <a href="https://github.com/hbz/skohub-vocabs">SkoHub Vocabs</a> (formerly called “skohub-ssg”), a static site generator that provides integration for a GitHub-based workflow to publish an HTML version of SKOS vocabularies. Check out the <a href="https://jamstack.org/best-practices/">JAMStack Best Practices</a> for some thoughts about the advantages of this approach. SkoHub Vocabs – like SkoHub Editor that will be presented in a separate post – is a stand-alone module that can already be helpful on its own, when used without any of the other SkoHub modules.</p>
+<h2>How to publish a SKOS scheme from GitHub with SkoHub Vocabs</h2>
+<p>Let’s take a look at the editing and publishing workflow step by step. We will use SkoHub Vocabs to publish a subject classification for Open Educational Resources: the “Educational Subject Classification” (ESC), which was created for the <a href="https://oerworldmap.org">OER World Map</a> based on <a href="http://uis.unesco.org/sites/default/files/documents/isced-fields-of-education-and-training-2013-en.pdf">ISCED Fields of Education and Training 2013</a>.</p>
+<h3>Step 1: Publish vocab as turtle file(s) on GitHub</h3>
+<p>Currently, a SKOS vocab has to be published in a GitHub repository as one or more <a href="https://www.w3.org/TR/turtle/">Turtle</a> file(s) in order to be processed by SkoHub Vocabs. ESC is already <a href="https://github.com/hbz/vocabs-edu/blob/master/esc.ttl">available on GitHub</a> in one Turtle file, so there is nothing to do in this regard. Note that you can also use the static site generator locally, i.e. without GitHub integration; see <a href="#implementation">below</a> for more about this.</p>
+<h3>Step 2: Configure webhook</h3>
+<p>In order to publish a vocabulary from GitHub with SkoHub Vocabs, you have to set up a webhook in GitHub. It goes like this:</p>
+<ol>
+<li>In the GitHub repo where the vocab resides, go to “Settings” → “Webhooks” and click “Add webhook”</li>
+</ol>
+<p><figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 630px; "
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 52.53164556962025%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAALCAYAAAB/Ca1DAAAACXBIWXMAAAsTAAALEwEAmpwYAAABhElEQVQoz6WS626cMBBGef9nS6SoiqpEVdVtdlkWDCz4gm/YpzK021WTH6060vGM0PDZ89mVmBbqwXIQFrkEQoy4EHE+4EN8h/WB07GmPzeocaRpO6TStF2PNgtVWaReGCeFNpZZKiapMIvlOkmUNhtSGaRUKOuYHh7pHx+YX15wZsEBPuWNSi1+23ld140QAsMw0gmB954SOedb3qryXes9K0l4ekJ8ekZ/O1AJvdLrldkU0bSJDsNA23XMs0RrgzblhAqtNVqXSTTKGGbjmMcJ9XakP3zHjVeq9vMXprcTOa2klIgxcmlbmubC+dxwqmvOTUNd19RNy/HSc2oEFzEirpr2qjE+YEMgAVX7/Mrw9Y207oLlhGXUe5xzew4rNoANmZj4HcWSYkdKVGVcofaRQ9xFf3n2LrYf00bOaev9k6qXASHDT6G0+fhR40eUjfc67xeWM1V5LuNs8H5/ezGud43p1vi3VGJUdFeDth7rPCHE/xP0MWFDIq73vuRb/a+CPwDkpVjCrQYeOQAAAABJRU5ErkJggg=='); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+ alt="Screenshot of the Webhook page in a GitHub repo with highlighted fields for the navigation path."
+ title="Screenshot of the Webhook page in a GitHub repo with highlighted fields for the navigation path."
+ src="/static/a30562938e313d345071b4f50e905510/f058b/add-webhook.png"
+ srcset="/static/a30562938e313d345071b4f50e905510/c26ae/add-webhook.png 158w,
+/static/a30562938e313d345071b4f50e905510/6bdcf/add-webhook.png 315w,
+/static/a30562938e313d345071b4f50e905510/f058b/add-webhook.png 630w,
+/static/a30562938e313d345071b4f50e905510/40601/add-webhook.png 945w,
+/static/a30562938e313d345071b4f50e905510/78597/add-webhook.png 947w"
+ sizes="(max-width: 630px) 100vw, 630px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">Screenshot of the Webhook page in a GitHub repo with highlighted fields for the navigation path.</figcaption>
+ </figure>
+2. Enter <code class="language-text">https://test.skohub.io/build</code> as payload URL, choose <code class="language-text">application/json</code> as content type and enter the secret. (Please <a href="http://lobid.org/team/">contact</a> us for the secret if you want to try it out.)
+<figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 630px; "
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 77.84810126582278%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAAQCAYAAAAWGF8bAAAACXBIWXMAAAsTAAALEwEAmpwYAAAB+klEQVQ4y41TW24bMQzcq/cO/egNeqCiH2mA/hSJ1/vSavV+UBOQazuO7bQlMBiBNmeHpNSlnOG9h/Mexjo4H6A3I2yshw9Rzta9/86s+j/YhheUUpGv0MVcoV1GqYRSK1prUrTqDdO8YBhnhJgQUoZnjlnAdYJUENOey6WgC7lhMoTN8x8IHOzspR93HAZMs8K8KCiloVYtZ/4Y87ysGKcZi7YgInTWGLweZxBVAE0chlQw6oDVFbhIoFpRa5WCejozyjUTSW3nfUC/OPh0SgIy00PfQykFYwxSLvtISr3MjIs/RgMRCyZCvxZo995yCBHDMGITsXxxwg7POAsyX6NjkaMuCCLGyV3wOE6yTWpArSQu+UytCT+KvWUpVjAuwoeEnAtWG3FcLHwBbCTY2GBT2zkSdCDE0i4zP0Na5gUclcfmssyG3biQcBgU5tWKcxcrXNrhM8HnhlLbXbuyZflO23GOGCO01jDGyoKC90jMzqHmjM9CWgZ26+cER85ZFsKC/GpEUG+IekO27s7ZR4c3SQ4fAqZplsvLgny1jHPYrEVI6eF2LzO8tcxRSoF1DmpdYazF/8ap5XvBypc3ZJRYZIY8089c3bX8SHALBk/Db7zaAT4GpJRkDDGmvwrTrUNq+0v5qZ7x5cdXfPv1HSZZ2Vml/aX8q+U3aH7kicYMJaUAAAAASUVORK5CYII='); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+ alt="Screenshot of the Webhook page with input (payload URL and secret)."
+ title="Screenshot of the Webhook page with input (payload URL and secret)."
+ src="/static/8650d807e6599370e6b3819a46654b15/f058b/add-webhook2.png"
+ srcset="/static/8650d807e6599370e6b3819a46654b15/c26ae/add-webhook2.png 158w,
+/static/8650d807e6599370e6b3819a46654b15/6bdcf/add-webhook2.png 315w,
+/static/8650d807e6599370e6b3819a46654b15/f058b/add-webhook2.png 630w,
+/static/8650d807e6599370e6b3819a46654b15/40601/add-webhook2.png 945w,
+/static/8650d807e6599370e6b3819a46654b15/38124/add-webhook2.png 953w"
+ sizes="(max-width: 630px) 100vw, 630px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">Screenshot of the Webhook page with input (payload URL and secret).</figcaption>
+ </figure></p>
+<h3>Step 3: Execute build & error handling</h3>
+<p>For the vocabulary to be built and published on SkoHub, there has to be a new commit in the master branch. So we have to adjust something in the vocab and push it to the master branch. Looking again at the webhook page in the repo settings, you can see a notice that the build was triggered:</p>
+<p><figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 630px; "
+ >
+ <a
+ class="gatsby-resp-image-link"
+ href="/static/5d638e5a214e95beb550c5ddf028b8fa/01dae/check-webhook-response.png"
+ style="display: block"
+ target="_blank"
+ rel="noopener"
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 85.44303797468356%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAARCAIAAABSJhvpAAAACXBIWXMAAAsTAAALEwEAmpwYAAAB8UlEQVQ4y42STW7bMBCFfe2ueoJuuugRuuoFeocCAWpFhWMkiC3EEiWb/z9DcjgsZKWGW8RIPzwQBMlHviG5IiLnvLVukbHWmEVOaZNSKm+BOGtFRNqY04lLqbgQ3nvn5718CNb5jEhvUYiwnM0AYIzVWnMuEEu9gm5wZY7ROieVOgmhjYEARHTb+I/ZgeaGElEuMaWUM5aScy6lvGOuVHs7rtnDhj89iU4LaYTUQmpj8UbBV+Zav3XfP/z89LH5/GX7VY+T7np1YIj4XzVra4SWXEmplQNAolmIWN5hVUrph4ExNjL2cuin41QrXa66LLri+vwVUUlUDGTlwIboQxbacW1PykLKl+DLXpUopgQxQYwZcVUruRC4thOXXT/8eGmlcx6Shbj8kForABx6Zp0HiM5574Pz8/+Za2aMGWNrrSnFKXAkXJJfoqaUtLEhzoQAMUYAWJ6KxmkahoGNo1Q6xzyvAAhhWfxKzjn9TTnHrvt997DdPj4+Pj/vdrtd13X73X6ajv42zrmc8xx7Op2E0lJrqY0yhkvpQkiIMeeYczq315qnUkLEVUEcH7ZxHPM0vep4TOMYFzEGjEX2p7OM9H3WusxPheg3G9m2fL2WbavaX6ptZXMv789qGr5eq3m8FU0j71veNPzuDoYBa/0N6FnSaftrdg8AAAAASUVORK5CYII='); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+ alt="Screenshot from GitHub Webhook page with information that build was triggered with link to build log."
+ title="Screenshot from GitHub Webhook page with information that build was triggered with link to build log."
+ src="/static/5d638e5a214e95beb550c5ddf028b8fa/f058b/check-webhook-response.png"
+ srcset="/static/5d638e5a214e95beb550c5ddf028b8fa/c26ae/check-webhook-response.png 158w,
+/static/5d638e5a214e95beb550c5ddf028b8fa/6bdcf/check-webhook-response.png 315w,
+/static/5d638e5a214e95beb550c5ddf028b8fa/f058b/check-webhook-response.png 630w,
+/static/5d638e5a214e95beb550c5ddf028b8fa/01dae/check-webhook-response.png 721w"
+ sizes="(max-width: 630px) 100vw, 630px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </a>
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">Screenshot from GitHub Webhook page with information that build was triggered with link to build log.</figcaption>
+ </figure></p>
+<p>However, looking at the build log, an error is shown and the site did not build:</p>
+<p><figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 630px; "
+ >
+ <a
+ class="gatsby-resp-image-link"
+ href="/static/c0afbbe39292d1e772cf87dc2ac2b4f5/29114/error-in-build-log.png"
+ style="display: block"
+ target="_blank"
+ rel="noopener"
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 55.06329113924051%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAALCAYAAAB/Ca1DAAAACXBIWXMAAAsTAAALEwEAmpwYAAABvklEQVQoz4WSXW/aMBRA+fltWUmkjQDdntbtt3S8VOWhUqtJFBKHMEyBJLbz3TPFIFg7bbV1dK+udI+u5duRUvK82SDCiJkvmIuQqS+YBQIRLRGLiCc/sASLiCBcEISn2PYFiyVi5iOCkI4yOYky7FKN0hmFaVitC3RsqLKaJm+o8wZq4OUNNaQ6I2lJFUmq6cTKEGtDonO2JsWvfH6mjzwmD0yzKUEd2JpoBOFL+IpFHbHTmkRlxDojVppOovdCrUrmOqRf9nGky2V4iStdBsWAYTVkUA6ODMsh/arP1+IaqXekKreOgzA7CU2IV3k4kUN31rVSL/dOwuqNsHxHONOCT+YjztKhO+/i/HKOE7YSGw95K7x+T9g+eVB4uEuXnt/DjVw85TEqR4yqE1fVlZ32W/md9f+ETyagV/c4l+eWM3nGxfYCp3Re4ZYul+UHPhdf/j1hW1zpDWP9g7Eac5PccJvfMqknTJq/uWvuuK/v2Wp1+GVj188KbUEZlM4p4oJUpmTbDMr9rtlYHPI/T7Pfw7a39Wzi5CC04xpS08oTAhGwXC2RzxK5lmy2G1KVWqqmojncqqmt6CjcJfwGy5Ucr+Jh6OIAAAAASUVORK5CYII='); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+ alt="Screenshot from build log with error message"
+ title="Screenshot from build log with error message"
+ src="/static/c0afbbe39292d1e772cf87dc2ac2b4f5/f058b/error-in-build-log.png"
+ srcset="/static/c0afbbe39292d1e772cf87dc2ac2b4f5/c26ae/error-in-build-log.png 158w,
+/static/c0afbbe39292d1e772cf87dc2ac2b4f5/6bdcf/error-in-build-log.png 315w,
+/static/c0afbbe39292d1e772cf87dc2ac2b4f5/f058b/error-in-build-log.png 630w,
+/static/c0afbbe39292d1e772cf87dc2ac2b4f5/40601/error-in-build-log.png 945w,
+/static/c0afbbe39292d1e772cf87dc2ac2b4f5/78612/error-in-build-log.png 1260w,
+/static/c0afbbe39292d1e772cf87dc2ac2b4f5/29114/error-in-build-log.png 1920w"
+ sizes="(max-width: 630px) 100vw, 630px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </a>
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">Screenshot from build log with error message</figcaption>
+ </figure></p>
+<p>Oops, we forgot to check the vocab for syntax errors before triggering the build, and there actually <em>is</em> a syntax error in the Turtle file. Fixing the syntax in a new <a href="https://github.com/hbz/vocabs-edu/commit/6ab97649874607df7784eaa0787adadbcefde166">commit</a> will automatically trigger a new build:</p>
+<p><figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 630px; "
+ >
+ <a
+ class="gatsby-resp-image-link"
+ href="/static/a295ec74a05afd3b5175ce477988c1fb/29114/fix-error.png"
+ style="display: block"
+ target="_blank"
+ rel="noopener"
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 55.06329113924051%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAALCAYAAAB/Ca1DAAAACXBIWXMAAAsTAAALEwEAmpwYAAABmklEQVQoz42Sb2vbMBCH89GXpc2fbmQM9m6DfaRtaRInaVwY7EW9po4sS5ZkyX6KnDRraQYVPAjufvfjuLveIlkzmy9JVhu2aUp6e8t8uepiP2cLfvyaM5snXC8SrucJy9WG9U3KarM9kay3rBcJ25uU3vTzF8YfPzH+MD0xuprSv7zi/eX4wHDCYDihfzHi3SAypH8Rc5Mu34+awYiv377Ts9bhXN1h41/XCCHZCU2pDaoy6MqijUNIRVEe4pV11LXHHal9QJYlPc68LJf8zgR/7iV77ZG2RVThiCcrHKXxr+qstfTatuU58anKkhcapQ3GOkLb4kNz4hC3nfZ5nTHmvGGlDbKQ7HY5QpTdCKRUKKVRqiKOyftw3vCp3ReGpkQUOVl2Ry72CFlQyAJZyo6ilBhr3m74Ny+4u9+zLxS1b3C+ofZtN4qIDy2had9uKKWkVBrTbd2jjaUyFlM5jHFRfKqB/xj+Exw35upuIZF4EpFMPZCVO0LTvGjiqc45d75D7z1K6+NtHu40xnIjeDCCJura14YhBB4BmLo8H+QEGOsAAAAASUVORK5CYII='); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+ alt="Screenshot from build log with error message"
+ title="Screenshot from build log with error message"
+ src="/static/a295ec74a05afd3b5175ce477988c1fb/f058b/fix-error.png"
+ srcset="/static/a295ec74a05afd3b5175ce477988c1fb/c26ae/fix-error.png 158w,
+/static/a295ec74a05afd3b5175ce477988c1fb/6bdcf/fix-error.png 315w,
+/static/a295ec74a05afd3b5175ce477988c1fb/f058b/fix-error.png 630w,
+/static/a295ec74a05afd3b5175ce477988c1fb/40601/fix-error.png 945w,
+/static/a295ec74a05afd3b5175ce477988c1fb/78612/fix-error.png 1260w,
+/static/a295ec74a05afd3b5175ce477988c1fb/29114/fix-error.png 1920w"
+ sizes="(max-width: 630px) 100vw, 630px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </a>
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">Screenshot from build log with error message</figcaption>
+ </figure></p>
+<p>This time the build goes through without errors and, voilà, SkoHub has published a human-readable version of the vocabulary at <a href="https://test.skohub.io/hbz/vocabs-edu/heads/master/w3id.org/class/esc/scheme.en.html">https://test.skohub.io/hbz/vocabs-edu/heads/master/w3id.org/class/esc/scheme.en.html</a>. SkoHub Static Site Generator also publishes an <a href="https://test.skohub.io/hbz/vocabs-edu/heads/master/index.en.html">overview</a> of all the SKOS vocabularies in the GitHub repo.</p>
+<h3>Step 4: Redirect vocab URI to SkoHub</h3>
+<p>As we want the canonical version of ESC to be the one published with SkoHub Vocabs, we need to redirect the namespace URI we defined in the Turtle file to SkoHub. Since we used w3id.org for this, we have to open a pull request in the respective repo.</p>
+<p><a href="https://github.com/perma-id/w3id.org/pull/1483"><figure class="gatsby-resp-image-figure" style="">
+ <span class="gatsby-resp-image-wrapper" style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 630px; ">
+ <span class="gatsby-resp-image-background-image" style="padding-bottom: 89.24050632911393%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAASCAYAAABb0P4QAAAACXBIWXMAAAsTAAALEwEAmpwYAAACvElEQVQ4y3VUSY4cRwys/3/DH/DVJwF+gS8+SJBHI0x3116570sIZNW0JUFKgCCyKhlkBoM5+JiwqIhFRtyPiBAicopIMSKX+oPV2mCcx7zuMNZh2wW0sRBS4xASMWUMdNCFDOMShA5wcodeH4hawLoA5yN8iLDOMxgBHVJxMH0nH2JiT1iD0gr/vb5gnCbM84x123F7THi7P7DtO9btQEwJvXfUWtFaY6M9rXfvvEeiCj+qF/zx+U+s+wYhJKRSUEpiFxq7yZDGw4QCFwms4vtFYO+AnKh3DKUWCCOZi1IqX4OukHKBcQEpV1gfoLRFjAmtgwPbBYQO1FZh9IGSEoZpHPHy5Svmw7Ct0kOHChMbzOXtZcoXrMpjUwE+VZiriLPCytUOxAsRSmQrZdioWqXN0+h/40o6amuX7xdgua7PxWI4HWAtXSki5/I89DNXbABypnOZAb0PJ9j1f6CA2gHriLuMmAp8SMwdcem954Pfd7c30mThzv6QjACj+Ad2+oDoD3izwJkFVpPNMHKC9xboBFSBXkFNDKkwoLXu/yTvgCVMyPoTin1AyRXeCpRskZNGzfYknDV4VphLYwnlnH4NSAG5d9zFA+N9xmPe8JhW3McF6yawC8mmreNJIX59LEyH1gb5J74ZMDSHf4+PONYN07ph3iSWecK2rViWBeu6cEXnaiy+3gs3hbRJlaWceZIGCrjdbmil8bxSZspKwnY+wIVweh9YWsY5CJ7lhEMo3O4j3u4jdqFYkwN1kcaNAqS2PBXUbeo0PQ60twwcT+8jlHGc0IfAOqUXiqp7ckjaogP0WpTafmv1st7AD8GyrM9ZftcrA5Za+TraOChjeVI07y03giaF6OCKQ8BuJKTR2A9x0uE8C/3ZlNY6d+9XRqMWnMc4zpiXHS/LG/56/Ruv8gbCMD6iXA0hwG+z5XasAJgiPgAAAABJRU5ErkJggg=='); background-size: cover; display: block;"></span>
+ <img class="gatsby-resp-image-image" alt="Screenshot of a pull request to redirect ESC to SkoHub" title="Screenshot of a pull request to redirect ESC to SkoHub" src="/static/dcbc73109ed2edf14419bc87466a5342/f058b/open-pr-at-w3id.png" srcset="/static/dcbc73109ed2edf14419bc87466a5342/c26ae/open-pr-at-w3id.png 158w,
+/static/dcbc73109ed2edf14419bc87466a5342/6bdcf/open-pr-at-w3id.png 315w,
+/static/dcbc73109ed2edf14419bc87466a5342/f058b/open-pr-at-w3id.png 630w,
+/static/dcbc73109ed2edf14419bc87466a5342/40601/open-pr-at-w3id.png 945w,
+/static/dcbc73109ed2edf14419bc87466a5342/eb2af/open-pr-at-w3id.png 954w" sizes="(max-width: 630px) 100vw, 630px" style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;" loading="lazy" decoding="async">
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">Screenshot of a pull request to redirect ESC to SkoHub</figcaption>
+ </figure></a></p>
+<p>If everything looks good, w3id.org PRs are merged very quickly; in this case it happened an hour later.</p>
+<h3>Result: HTML & JSON-LD representation published with SkoHub & basic GitHub editing workflow</h3>
+<p>As a result, we have published a controlled vocabulary in SKOS under a permanent URI and with a human-readable <a href="https://w3id.org/class/esc/scheme.html">HTML</a> representation from GitHub with a minimum amount of work. Additionally, the initial Turtle representation is transformed to more developer-friendly <a href="https://test.skohub.io/hbz/vocabs-edu/heads/master/w3id.org/class/esc/scheme.json">JSON-LD</a>. The HTML has a hierarchy view that can be expanded and collapsed at will:</p>
+<p><a href="https://test.skohub.io/hbz/vocabs-edu/heads/master/w3id.org/class/esc/scheme.en.html"><figure class="gatsby-resp-image-figure" style="">
+ <span class="gatsby-resp-image-wrapper" style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 630px; ">
+ <span class="gatsby-resp-image-background-image" style="padding-bottom: 56.32911392405063%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAALCAYAAAB/Ca1DAAAACXBIWXMAAAsTAAALEwEAmpwYAAABfUlEQVQoz62P20rjUBiF82bOlVgG8UIQxLuBufYBvPQxFLStWBhm8IRVEBRBRWRQk1btEA+1traxSbrbNImJSb8hoR4Rqsws+Nh7/4fF2tJEfoHJQpZx+SffDzN8+z3P2H6a0f05RvbSDO+kGdpOMriV5OvGLIn1GRJrsyRWkwysJBlYTtG/kOJLZpq+uSmkU6OCrF1zqBU5qF6wV1HZrV6wWTpjo3jK+lWe7GWeJVUmWyqwVlZZLv1hsVjg1+UZP85PyKh55tUTUoUckjAaVG7K1Ksad7c1hG7iCAu9fItRqca0dZPAcuAhpJcks9FA0zTato3n+3SAe9+P322ni21jOw4ty6LZaj2dTev5btk2rucj6YZBrVajXq9jmiaue08Q9k7ynqK9OKGu6wjRRAiB47pdww6dzseJDYMQqSGacTLP8wiCgDAM40akx8FeemX49ntR858MH4IgLoTd6FHCx9pnifYk/rMkWVFQlBzyC47fqfVCyeU4Opb5CzP1NdXpUl2bAAAAAElFTkSuQmCC'); background-size: cover; display: block;"></span>
+ <img class="gatsby-resp-image-image" alt="Screenshot of the HTML version of ESC published with SkoHub." title="Screenshot of the HTML version of ESC published with SkoHub." src="/static/e5a580cc6f532f9f98c427ff90a06429/f058b/published-vocab.png" srcset="/static/e5a580cc6f532f9f98c427ff90a06429/c26ae/published-vocab.png 158w,
+/static/e5a580cc6f532f9f98c427ff90a06429/6bdcf/published-vocab.png 315w,
+/static/e5a580cc6f532f9f98c427ff90a06429/f058b/published-vocab.png 630w,
+/static/e5a580cc6f532f9f98c427ff90a06429/40601/published-vocab.png 945w,
+/static/e5a580cc6f532f9f98c427ff90a06429/78612/published-vocab.png 1260w,
+/static/e5a580cc6f532f9f98c427ff90a06429/29114/published-vocab.png 1920w" sizes="(max-width: 630px) 100vw, 630px" style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;" loading="lazy" decoding="async">
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">Screenshot of the HTML version of ESC published with SkoHub.</figcaption>
+ </figure></a></p>
+<p>There also is a search field to easily filter the vocabulary:</p>
+<p><figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 500px; "
+ >
+ <a
+ class="gatsby-resp-image-link"
+ href="/static/15d14a94c2569955b4f41228f46ab1ec/0b533/skohub-ssg-filter.png"
+ style="display: block"
+ target="_blank"
+ rel="noopener"
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 82.27848101265823%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAAQCAYAAAAWGF8bAAAACXBIWXMAAAsTAAALEwEAmpwYAAAC7UlEQVQ4y6WRzYscVRDA54+QrJ7j5ubBg+JJFEQPIngSFAQ95ZJcchMha/wa4iHBEBMQRQjBHCKCEJC4SXbjRtyEWXXdzHZPT+/szPZ3T3/MdM9Md8/XT96bzLrrTS34UfXqVdWrqld6uvIZL/9xiecq53nxt4s8v3GBZx+c44XKRV7auMQz6+dYvPcJx9bKHFv7lMWfyzPulllcLfOkYKXM0Ttljt4uU3qqcpaT2nXerF7hXeVbjivXeG3zK96uXuWE+h2vbFzmsZX3eWJ1iYWV0yzcWeKI4PYSR249YvkDFpbP8PjyGUpf7v3CF801LrfWuLB7l893VznfWOGsfouP6z/xkXaTD7WbnK79yHvKDU5Vf+Dk1vcc37zOO79f462Nq7xRucLrD77h1fWvKU2yCdNswmQwQdhzpnO/ZHzo7mDMOBuTD4b0+jmdXp9S3OuTDDLSrCDJ8n9NKinoZwW9rKA0yAsG2YC426Gbdv8zcdKhnw0oFcMxpmNyb/1XNqtV1B19H0Wvo9TrUu+f9b/Pc5+q62ypCi2jSSkfjvADH0WrUW/soO3oVFWF7Zoqz/puY99X323MYhq6vD8Yo+oalmNTyooRYRRjWA6W49EyTDb/3OLhtoph2Vi2K336TkPem7aDaTk8rCpsKzVahsWeadNo7mFZttjhkNFohOd5BEGbOI5IkkTS6XQkcRxLnSRdaYtYoaMoYpBlDIdDiqJgMpnMCgqHptVlkus6MmFOEIQ4jotlWQRBgO+3MQ0D0zSxbVvmHhT5y6JymqaotRphGBKGAZ7r4vk+URRLu9XawzAMWfSfIvKn06lEFhyPx/Klbrcrk3zfx3VdqTVNo9lsyg79dhvXcbAdB8dxsExTdjyXQwVdzyMKQ3zPk7ZIEMFip6LrKAqJ4pgwCGgHgdyfmETY7Xab0Wg8G1n8cp7nrN+/T6vVIk0SGTQbN5JdC8QnxXHn0WfEpGlP7lzEHCrYzwvZ6nwP/0fEyH8B2i950qA9U44AAAAASUVORK5CYII='); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+        alt="Screenshot: Filter the scheme by typing in the search box"
+        title="Screenshot: Filter the scheme by typing in the search box"
+ src="/static/15d14a94c2569955b4f41228f46ab1ec/0b533/skohub-ssg-filter.png"
+ srcset="/static/15d14a94c2569955b4f41228f46ab1ec/c26ae/skohub-ssg-filter.png 158w,
+/static/15d14a94c2569955b4f41228f46ab1ec/6bdcf/skohub-ssg-filter.png 315w,
+/static/15d14a94c2569955b4f41228f46ab1ec/0b533/skohub-ssg-filter.png 500w"
+ sizes="(max-width: 500px) 100vw, 500px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </a>
+ </span>
+  <figcaption class="gatsby-resp-image-figcaption">Screenshot: Filter the scheme by typing in the search box</figcaption>
+ </figure></p>
+<p>This filter is based on a <a href="https://github.com/nextapps-de/flexsearch">FlexSearch</a> index that is also built along with the rest of the content. This allows us to implement lookup functionalities without the need for a server-side API. More about this below and in the upcoming post on the SkoHub Editor.</p>
+<h2>Implementation</h2>
+<p>To follow along with the more technical aspects, you might want to have SkoHub Vocabs checked out locally:</p>
+<div class="gatsby-highlight" data-language="text"><pre class="language-text"><code class="language-text">$ git clone https://github.com/hbz/skohub-vocabs
+$ cd skohub-vocabs
+$ npm i
+$ cp .env.example .env</code></pre></div>
+<p>The static site generator itself is implemented with <a href="https://www.gatsbyjs.org/">Gatsby</a>. One reason for this choice was our good previous experience with <a href="https://reactjs.org/">React</a>. Another nice feature of Gatsby is that all content is sourced into an in-memory database that is available using <a href="https://graphql.org/">GraphQL</a>. While there is certainly a learning curve, this makes the experience of creating a static site not that much different from traditional database-based approaches. You can locally build a vocab as follows:</p>
+<div class="gatsby-highlight" data-language="text"><pre class="language-text"><code class="language-text">$ cp test/data/systematik.ttl data/
+$ npm run build</code></pre></div>
+<p>Running the build will result in output in the <code class="language-text">public/</code> directory. Currently, the build is optimized to be served by Apache with <a href="https://httpd.apache.org/docs/2.4/mod/mod_negotiation.html">Multiviews</a> in order to provide content negotiation. Please note that currently only vocabularies that implement the <a href="https://www.w3.org/2001/sw/BestPractices/VM/http-examples/2006-01-18/#slash">slash namespace</a> pattern are supported. We will add support for hash URIs in the future.</p>
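+<p>With content negotiation in place, the same concept URI can be requested in different representations depending on the <code class="language-text">Accept</code> header. A small sketch, assuming the JSON representation generated by the build is served alongside the HTML:</p>
+<div class="gatsby-highlight" data-language="text"><pre class="language-text"><code class="language-text">// The browser default (Accept: text/html) yields the HTML view of the scheme,
+// while explicitly asking for JSON returns a machine-readable variant.
+fetch('https://w3id.org/class/esc/scheme', {
+  headers: { accept: 'application/json' }
+}).then(response => response.json())
+  .then(scheme => console.log(scheme))</code></pre></div>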
+<p>In order to trigger the static site generator from GitHub, a small webhook server based on <a href="https://koajs.com/">Koa</a> was implemented. (Why not <a href="https://expressjs.com/">Express</a>? – It wouldn’t have made a difference.) The <a href="https://developer.github.com/webhooks/">webhook</a> server listens for and validates POST requests coming from GitHub, retrieves the data from the corresponding repository and then spins up Gatsby to create the static content.</p>
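+<p>A very reduced sketch of such a webhook endpoint might look like this, assuming a shared secret in a <code class="language-text">WEBHOOK_SECRET</code> environment variable and following GitHub’s <code class="language-text">X-Hub-Signature</code> convention (the actual skohub-vocabs webhook server additionally fetches the repository content and triggers the Gatsby build):</p>
+<div class="gatsby-highlight" data-language="text"><pre class="language-text"><code class="language-text">const crypto = require('crypto')
+const Koa = require('koa')
+const bodyParser = require('koa-bodyparser')
+
+const app = new Koa()
+app.use(bodyParser())
+
+app.use(async ctx => {
+  // GitHub signs the payload with a shared secret; recompute and compare.
+  // (For brevity the body is re-serialized here; a production implementation
+  // should verify the signature against the raw request body.)
+  const expected = 'sha1=' + crypto
+    .createHmac('sha1', process.env.WEBHOOK_SECRET)
+    .update(JSON.stringify(ctx.request.body))
+    .digest('hex')
+  if (ctx.get('X-Hub-Signature') !== expected) {
+    ctx.status = 401
+    return
+  }
+  // Payload accepted: this is where the repository would be fetched
+  // and the static site build kicked off.
+  ctx.status = 202
+  ctx.body = 'Build triggered'
+})
+
+app.listen(3000)</code></pre></div>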
+<p>A final word on the FlexSearch index mentioned above. An important use case for vocabularies is to access them from external applications. Using the FlexSearch library and the index pre-built by SkoHub Vocabs, a lookup of vocabulary terms is easy to implement:</p>
+<div class="gatsby-highlight" data-language="text"><pre class="language-text"><code class="language-text"><script src="https://cdnjs.cloudflare.com/ajax/libs/FlexSearch/0.6.22/flexsearch.min.js"></script>
+
+<script>
+ fetch('https://w3id.org/class/esc/scheme', {
+ headers: { accept: 'text/index'}
+ }).then(response => response.json())
+ .then(serialized => {
+ const index = FlexSearch.create()
+ index.import(serialized)
+ console.log(index.search("philosophy"))
+ })
+</script></code></pre></div>
+<p>Note that currently the index will only return URIs associated with the search term, not the corresponding labels. This will change in a future update.</p>https://blog.skohub.io/2019-05-17-skohub/https://blog.skohub.io/2019-05-17-skohub/Fri, 17 May 2019 00:00:00 GMT<p>For a long time, openness movements and initiatives with labels like “Open Access”, “Open Educational Resources” (OER) or “Linked Science” have been working on establishing a culture where scientific or educational resources are by default published with an <a href="http://opendefinition.org/">open</a> license on the web to be read, used, remixed and shared by anybody. With a growing supply of resources on the web, the challenge grows to learn about or find resources relevant for your teaching, studies, or research.</p>
+<p>In this post, we describe the SkoHub project being carried out in 2019 by the hbz in cooperation with graphthinking GmbH. The project seeks to implement a prototype for a novel approach in syndicating content on the web by combining current web standards for sending notifications and subscribing to feeds with knowledge organization systems (KOS, sometimes also called “controlled vocabularies”).*</p>
+<h2>Current practices and problems</h2>
+<p>What are the present approaches to the problem of finding open content on the web, and what are their limitations?</p>
+<h3>Searching metadata harvested from silos</h3>
+<p>Current approaches for publishing and finding open content on the web are often focused on repositories as the place to publish content. Those repositories then provide (ideally standardized) interfaces for crawlers to collect and index the metadata in order to offer search solutions on top. An established approach for Open Access (OA) articles goes like this:</p>
+<ul>
+<li>Repositories with interfaces for metadata harvesting (<a href="http://www.openarchives.org/OAI/openarchivesprotocol.html">OAI-PMH</a>) are set up for scholars to upload their OA publications</li>
+<li>Metadata is crawled from those repositories, normalized and loaded into search indexes</li>
+<li>Search interfaces are offered to end users</li>
+</ul>
+<p><figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 421px; "
+ >
+ <a
+ class="gatsby-resp-image-link"
+ href="/static/09b20404ba79837614549b8eb52ee6d1/092ed/repo-approach.png"
+ style="display: block"
+ target="_blank"
+ rel="noopener"
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 89.87341772151898%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAASCAYAAABb0P4QAAAACXBIWXMAAAsTAAALEwEAmpwYAAABqElEQVQ4y5WUiYrDMAxE/f8fWChtAr3S9G6T9D60PMF4vSHLbgSuXVsajTWKQ1mWdj6f7Xg82ul08vXhcLDb7Wabzcbquo5nTdP42fV6td1uZ1VVxTP8OA+Xy8WwyWTiDhhg7BOIzedzB8eez6cnVdx6vbaiKEwW9vu9PR4Pz0DW+/3umZjJzkwiWLDGj4TMYqo9/gd+GGySDSYAvd9vdwKEpLBgzR5n+OC7Wq0iOKyDdZgY40Qwxvrz+fjMDfDpsoBTe8i4JleC+WKxiEIoSVdsZNgGJEj1BQh1EQR2iKKYdHaGbcopIMEYrSUlpXIb6E9AZgpPsOpG0akt4vQClKNU3m633ousJU5vhlwZIFghDmJorabuDagaIg7C/KuGKf10AIiyqEzb0MCsSaJPst0ZsW26MmEwgR09qD4U699uFtIr6kEAiPoBhjAyfYaciSkxr9frm2GqqJ4tWgMRAEAEfWZ63jhnAMZ/vVI/AAlmjdNyufQe1L76jmQAUE+uryctFSkCcoXBYGB5nttoNLLxeGxZltl0Oo1Xms1mNhwOfV/nzJRAgF88jHsXHV2+bAAAAABJRU5ErkJggg=='); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+ alt="Diagram of the current approach with metadata being crawled from repos, indexed and search offered on top."
+ title="Diagram of the current approach with metadata being crawled from repos, indexed and search offered on top."
+ src="/static/09b20404ba79837614549b8eb52ee6d1/092ed/repo-approach.png"
+ srcset="/static/09b20404ba79837614549b8eb52ee6d1/c26ae/repo-approach.png 158w,
+/static/09b20404ba79837614549b8eb52ee6d1/6bdcf/repo-approach.png 315w,
+/static/09b20404ba79837614549b8eb52ee6d1/092ed/repo-approach.png 421w"
+ sizes="(max-width: 421px) 100vw, 421px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </a>
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">Diagram of the current approach with metadata being crawled from repos, indexed and search offered on top.</figcaption>
+ </figure></p>
+<p>With this approach, subject-specific filtering is either already done when crawling the data to create a subject-specific index, or when searching the index.</p>
+<h3>Maintenance burden</h3>
+<p>When offering a search interface with this approach, you have to create and maintain a list of sources to harvest:</p>
+<ol>
+<li>watch out for new relevant sources to be added to your list,</li>
+<li>adjust your crawler to changes regarding the services’ harvesting interface,</li>
+<li>homogenize data from different sources to get a consistent search index.</li>
+</ol>
+<p>Furthermore, end users have to know where to find your service to search for relevant content.</p>
+<h3>Off the web</h3>
+<p>Besides being error-prone and requiring resources for keeping up with changes in the repositories, this approach also does not take into account how web standards work. As <a href="https://doi.org/10.1045/november2015-vandesompel">Van de Sompel and Nelson 2015</a> (both co-editors of the OAI-PMH specification) phrase it:</p>
+<blockquote>
+<p><small>“Conceptually, we have come to see [OAI-PMH] as repository-centric instead of resource-centric or web-centric. It has its starting point in the repository, which is considered to be the center of the universe. Interoperability is framed in terms of the repository, rather than in terms of the web and its primitives. This kind of repository, although it resides on the web, hinders seamless access to its content because it does not fully embrace the ways of the web.”</small></p>
+</blockquote>
+<p>In short, the repository metaphor guiding this practice obscures what constitutes the web: <strong>resources</strong> that are identified by <strong>HTTP URIs</strong> (<a href="https://en.wikipedia.org/wiki/Uniform_Resource_Identifier">Uniform Resource Identifier</a>).</p>
+<h2>Subject-specific subscription to web resources</h2>
+<p>So how could a web- or resource-centric approach to resource discovery by subject look like?</p>
+<h3>Of the web</h3>
+<p>To truly be part of the web, URIs are the most important part: Every resource (e.g. an OER) needs a URL that locates and identifies it. In order to make use of knowledge organization systems on the web, representing a controlled vocabulary using <a href="https://en.wikipedia.org/wiki/Simple_Knowledge_Organization_System">SKOS</a> vocabulary is the best way to go forward: each subject in the vocabulary is identified by a URI. With these prerequisites, anybody can link their resources to subjects from a controlled vocabulary. This can be done e.g. by embedding <a href="http://www.dublincore.org/specifications/lrmi/lrmi_1/">LRMI</a>, “Learning Resource Metadata Initiative” metadata as JSON-LD into the resource or its description page.</p>
+<p><figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 221px; "
+ >
+ <a
+ class="gatsby-resp-image-link"
+ href="/static/91c27021f2bed77637541330d58cb36b/cccdc/subject-indexing-with-uris.png"
+ style="display: block"
+ target="_blank"
+ rel="noopener"
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 127.21518987341773%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAAZCAYAAAAxFw7TAAAACXBIWXMAAAsTAAALEwEAmpwYAAAB50lEQVQ4y72V56ojMQyF5/0fKZDeCymE9D/pvfeu5dPFw93sTO7sEtZwGNuSFck6J7bq9brs93tZr9cfgTUej+WTQwPO53MJhUISi8WkWCxKPB6XVCqle3zNXjQaVVQqFcnn85JMJiWTyUg2m5Vut/sVcDgc6uR4PGrKp9NJr+BwOMhyubTns9lM7ZvNRn3YwwY4e71evwL2er2Plft8PsXqdDpC2aPRyBHYKKfRaMhkMhEqcvMFlons9osMylksFvLO127K4/FQJycw2u22FAoFyeVyAsVMUDdYXu4FFnA1P2WnGeJM55xAV8/ns9IEalwuF1mtVq7+wKrVatr6746v7N9ut7Lb7dTn/yul3+/bC6jh8/m0Ael0WhKJhCrE7/dLqVRSVRhFBQIBtTWbTT1rmmsrhcX9freVwn0xB1wDa9RB2cwpD4Vw5rcMjVIweOmiEwvIzsBCBW5KIXtYQIcjkYiS25NSfhqU51Xzrkphn9FqtewmlcvltyrxrBSaMZ1OvSkFR0qie06A1EZ6htzvYCF42m/UwNcJrzY3X1spf0sbc8d/lAwNMKKKYDCobwXviHlD2KtWqxIOh3X9+sWOihCABhwMBnaGbPLvcrvdbKUYNVCOeTsAPt/npjo74L+oxOlN+QU8Bnvq+QJwZQAAAABJRU5ErkJggg=='); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+ alt="A diagram with three resources (an OER, a subject and a classification), each identified by a URI and linked together"
+ title="A diagram with three resources (an OER, a subject and a classification), each identified by a URI and linked together"
+ src="/static/91c27021f2bed77637541330d58cb36b/cccdc/subject-indexing-with-uris.png"
+ srcset="/static/91c27021f2bed77637541330d58cb36b/c26ae/subject-indexing-with-uris.png 158w,
+/static/91c27021f2bed77637541330d58cb36b/cccdc/subject-indexing-with-uris.png 221w"
+ sizes="(max-width: 221px) 100vw, 221px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </a>
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">A diagram with three resources (an OER, a subject and a classification), each identified by a URI and linked together</figcaption>
+ </figure></p>
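+<p>As a minimal sketch, such a machine-readable description embedded in a resource’s page could look like the following snippet (all identifiers and values are purely illustrative):</p>
+<div class="gatsby-highlight" data-language="text"><pre class="language-text"><code class="language-text"><script type="application/ld+json">
+{
+  "@context": "http://schema.org/",
+  "@type": "CreativeWork",
+  "@id": "https://example.org/oer/recycling-basics",
+  "name": "Recycling basics",
+  "about": {
+    "@id": "https://example.org/vocab/environment"
+  }
+}
+</script></code></pre></div>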
+<h3>Web-based subscriptions and notifications</h3>
+<p>So, HTTP URIs for resources and subjects are important to transparently publish and thereafter identify and link educational resources, controlled vocabularies and subjects on the web. But with URIs as the basic requirement in place, we also get the possibility to utilize further web standards for the discovery of OER. For SkoHub, we make use of <a href="https://www.w3.org/TR/social-web-protocols/">Social Web Protocols</a> to build an infrastructure where services can send and subscribe to notifications for subjects. The general setup looks as follows (a small code sketch of sending such a notification follows the figures below):</p>
+<ol>
+<li>Every element of a controlled vocabulary gets an inbox, identified by a URL.</li>
+</ol>
+<p><figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 479px; "
+ >
+ <a
+ class="gatsby-resp-image-link"
+ href="/static/7348d68aaa1145cecf691f6ae17749ca/a9b70/subject-indexing-with-uris-and-inbox.png"
+ style="display: block"
+ target="_blank"
+ rel="noopener"
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 58.86075949367089%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAAMCAYAAABiDJ37AAAACXBIWXMAAAsTAAALEwEAmpwYAAABCElEQVQoz5WSaaqEQAyEvf/JFG/gf/ddcV/y+AIZfMKg01B0JyRFqtJOWZZSFIXkeS72truua2maRvZ9F855nvJ0HIiyLJM4jiVNU70h5A3ZMAyybdt7wnEctYm773vpuk4sB8gZEfcTnCt7FEU62f1cGx4nnKZJlmWRdV3Vt6qqVOI8z5/Jr8TkqKUHUEdMnlglMxneJUmifoZhqDHLYDH43Lat5sxvak0R9dRA6thkNDIdoIkbQvOVYpYErM5+BCDPcDqhjW0TXCVjyV2ySTUQwwPHv6UY4belvDkOkvAHWeD+RspxHO+/DU3mh/nDG7ORDpDy+mP7vi+e54nruh8QG4Ig+En2H385ocIb7HZiAAAAAElFTkSuQmCC'); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+ alt="A diagram with four resources (an OER, a subject, an inbox, a classification), each identified by a URI and linked together"
+ title="A diagram with four resources (an OER, a subject, an inbox, a classification), each identified by a URI and linked together"
+ src="/static/7348d68aaa1145cecf691f6ae17749ca/a9b70/subject-indexing-with-uris-and-inbox.png"
+ srcset="/static/7348d68aaa1145cecf691f6ae17749ca/c26ae/subject-indexing-with-uris-and-inbox.png 158w,
+/static/7348d68aaa1145cecf691f6ae17749ca/6bdcf/subject-indexing-with-uris-and-inbox.png 315w,
+/static/7348d68aaa1145cecf691f6ae17749ca/a9b70/subject-indexing-with-uris-and-inbox.png 479w"
+ sizes="(max-width: 479px) 100vw, 479px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </a>
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">A diagram with four resources (an OER, a subject, an inbox, a classification), each identified by a URI and linked together</figcaption>
+ </figure>
+2. Systems can send notifications to the inbox, for example “This is a new resource about this subject”.
+<figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 479px; "
+ >
+ <a
+ class="gatsby-resp-image-link"
+ href="/static/bab82f631204dc5c42b2e30c8705c217/a9b70/sending-notification.png"
+ style="display: block"
+ target="_blank"
+ rel="noopener"
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 60.75949367088608%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAAMCAYAAABiDJ37AAAACXBIWXMAAAsTAAALEwEAmpwYAAABO0lEQVQoz4WSyQrDMAxE/f9fmENOIfu+74nLU1EwpaUCIXssy6OxTNu2dlkWO02Tnef5cd0Pw2DXdbWu3fctdzSXNRhm6rq2WZaJJ0nyrIuikEjhqqok4mma2q7rxMMwtHEc23EcZX8cx7sgG5i6se97iSTjMLiuSy7BGOY4eZyDkWO2bZMkHOowgMl5noLt+y6FtFXXKEKua4YiHHCRl3zfl1eViasPRh66EekGachVzECbS0TVgsieJIqqPqybpnlkUHk0HyLSMpsoiuRT8jyXGASB4BgylGX5nMOKNR/Cx3EXjHzDizoCRA7QRTGSeFQNNp8jpmMmBdGHdhAe0PM8OXBx/RB1sG8uv0zfMEAf1YgXFYORMvz85W9mEJU2cfRhLonoBuaOhsvylxsY6BC7w42DMwpuwX8MXzdZojR43FRZAAAAAElFTkSuQmCC'); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+        alt="A diagram with four resources (an OER, a subject, an inbox, a classification), each identified by a URI and linked together plus a notification being sent from the OER to the subject's inbox"
+        title="A diagram with four resources (an OER, a subject, an inbox, a classification), each identified by a URI and linked together plus a notification being sent from the OER to the subject's inbox"
+ src="/static/bab82f631204dc5c42b2e30c8705c217/a9b70/sending-notification.png"
+ srcset="/static/bab82f631204dc5c42b2e30c8705c217/c26ae/sending-notification.png 158w,
+/static/bab82f631204dc5c42b2e30c8705c217/6bdcf/sending-notification.png 315w,
+/static/bab82f631204dc5c42b2e30c8705c217/a9b70/sending-notification.png 479w"
+ sizes="(max-width: 479px) 100vw, 479px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </a>
+ </span>
+  <figcaption class="gatsby-resp-image-figcaption">A diagram with four resources (an OER, a subject, an inbox, a classification), each identified by a URI and linked together plus a notification being sent from the OER to the subject's inbox</figcaption>
+ </figure>
+3. Systems can subscribe to a subject’s inbox and will immediately be forwarded each notification as soon as it arrives (push approach).
+<figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 479px; "
+ >
+ <a
+ class="gatsby-resp-image-link"
+ href="/static/1f4f72b3e6e880fe043b2587a7b99a98/a9b70/pushing-notification.png"
+ style="display: block"
+ target="_blank"
+ rel="noopener"
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 62.0253164556962%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAAMCAYAAABiDJ37AAAACXBIWXMAAAsTAAALEwEAmpwYAAABUklEQVQoz22TjY6CQAyEef8XNCSaAIoIKAgqCIi9fE2G7F2uSdMy25/Zbom6rrP3+23jOP6rwzC4Iuu6ui7LYs/nczt7vV6OIVHTNHY+n60oCsvzfPOrqnJL4vV6dYteLhe73+8GkePx6DkUBKNo1LatAwRjH4/H5qsITEMhkXPd4BdDXQHLYZZlXkT45/Ox7/e7aSgUhEAoEQAH0zR5wTRNPWieZ8dVXMK8iSOe29V17bFiHIkys2QO0tvt5jgM5TM35omlmHLwsRDxgho23Ugoy9IfhwDYgasJ51geDeWbXDAv2Pe9X0OrQyHYcCVdWQPX3DQizmCr8ZAXhcMHiOPYaBLiSPgo4PgwgwA+Rf1RwsXGarjSv2ujlYIhlubynSEAndRNC306nRzTC0ogoHNWbLfbWZIknuczJAAHpXioYNoCif4Sih4OB9vv996YePAfLLqchdSPdjkAAAAASUVORK5CYII='); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+        alt="A diagram with four resources (an OER, a subject, an inbox, a classification), each identified by a URI and linked together plus a notification being sent from the OER to the subject's inbox"
+        title="A diagram with four resources (an OER, a subject, an inbox, a classification), each identified by a URI and linked together plus a notification being sent from the OER to the subject's inbox"
+ src="/static/1f4f72b3e6e880fe043b2587a7b99a98/a9b70/pushing-notification.png"
+ srcset="/static/1f4f72b3e6e880fe043b2587a7b99a98/c26ae/pushing-notification.png 158w,
+/static/1f4f72b3e6e880fe043b2587a7b99a98/6bdcf/pushing-notification.png 315w,
+/static/1f4f72b3e6e880fe043b2587a7b99a98/a9b70/pushing-notification.png 479w"
+ sizes="(max-width: 479px) 100vw, 479px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </a>
+ </span>
+  <figcaption class="gatsby-resp-image-figcaption">A diagram with four resources (an OER, a subject, an inbox, a classification), each identified by a URI and linked together plus a notification being sent from the OER to the subject's inbox</figcaption>
+ </figure></p>
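+<p>Here is the small sketch announced above: sending a notification about a new resource to a subject’s inbox (step 2). The inbox URL and the payload vocabulary are purely illustrative; the exact format expected by skohub-pubsub may differ:</p>
+<div class="gatsby-highlight" data-language="text"><pre class="language-text"><code class="language-text">// A publishing platform announces a new resource to a subject's inbox.
+const notification = {
+  '@context': 'https://www.w3.org/ns/activitystreams',
+  type: 'Announce',
+  actor: 'https://example.org/teacher-b',             // who publishes
+  object: 'https://example.org/oer/recycling-basics', // the new resource
+  target: 'https://example.org/vocab/environment'     // the subject it is about
+}
+
+fetch('https://example.org/inbox/environment', {      // hypothetical inbox URL
+  method: 'POST',
+  headers: { 'Content-Type': 'application/ld+json' },
+  body: JSON.stringify(notification)
+}).then(response => console.log(response.status))</code></pre></div>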
+<p>This infrastructure allows applications</p>
+<ol>
+<li>to send a notification to a subject’s inbox containing information about and a link to new content about this subject</li>
+<li>to subscribe to the inbox of a subject from a knowledge organization system in order to receive push updates about new content in real time.</li>
+</ol>
+<p>Here is an example: a teacher is interested in new resources about environmental subjects. She subscribes to the subject via a controlled vocabulary like <a href="https://unesdoc.unesco.org/ark:/48223/pf0000235049">ISCED-2013 Fields of Education and Training</a>. She then receives updates whenever a colleague publishes a resource that is linked to the subject.</p>
+<p><figure class="gatsby-resp-image-figure" style="">
+ <span
+ class="gatsby-resp-image-wrapper"
+ style="position: relative; display: block; margin-left: auto; margin-right: auto; max-width: 321px; "
+ >
+ <a
+ class="gatsby-resp-image-link"
+ href="/static/09f488bdb6d0d546d1e0c74279c53e8d/30592/pubsub.png"
+ style="display: block"
+ target="_blank"
+ rel="noopener"
+ >
+ <span
+ class="gatsby-resp-image-background-image"
+ style="padding-bottom: 93.67088607594937%; position: relative; bottom: 0; left: 0; background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAATCAYAAACQjC21AAAACXBIWXMAAAsTAAALEwEAmpwYAAACYElEQVQ4y5VUS0tyURQ9v9aZJOjYJoJjKQMlZw5sEiaiFYRaIaWGpkg+KHtoia9INJ+5Ym04fVdJ4ztwuN59t+usvdY+Wy0WC3BzxWIxHB4eIhQK4eDgAM/PzxL/+vqCXjp3Mpng4eEBT09P8my1WhJXOuH4+BgXFxfy2+12Y3d3F+l0Wt7n8zn0wRq8Xq8jHo/j9fVVtt1uRzgchuLHaDT6A/b4+IizszPkcjnc3NwssTKut7c33N/f/7yfnp4KqGKZyWRSgkzQv/k8Pz/H+/s7Pj4+BIC72+3i8/MT5XIZpVIJzWYT+Xwe/X5fmCqfz4fZbIaXlxdcX18L2OXlJU5OTqTUwWAgz16vJzqNRiPJob4EzGazODo6QiaTQaVSgep0Otjf318CowR/LbKhGSSSSqXkABqktB7BYFBcphZat9+2NoWlUutqtSosKRdNUixHJ2hmxlZaNUS/E5gy0O2rqyuwUuqrjEkMagZG0NW1GqcHegkgheZJ1KPRaGA4HG5kaGzuWq0m5pAM40qXS1PYe4lEQsTexFLHptOp6E9QttYPQ1Lmx3a7Lady/0/Zxnxl/ED7yXbd7fit7PF4jEKhAG2u0iZQu+3tbTgcDtFjHcPV9uH93drakuZeAmRT2mw2WCwW6S3dGuvaRjMKBAIwmUwyTJamDQEjkYg0tr5ef+nHPF45jrxisfjPlNvbW5jNZuzs7MDj8cDr9a7tOaMMzLVarfD7/XA6nTKdlB6se3t7kkBQDgz24qabwunCGcDN/7pcLplOSmvFcXR3dyfzcB3YKijNY6m8x+xd4nwDrPiBhn1BUk4AAAAASUVORK5CYII='); background-size: cover; display: block;"
+ ></span>
+ <img
+ class="gatsby-resp-image-image"
+ alt="A diagram with Teacher A subscribing to a subject tag that a document published by Teacher B is about, thus receiving notifications about the new resource."
+ title="A diagram with Teacher A subscribing to a subject tag that a document published by Teacher B is about, thus receiving notifications about the new resource."
+ src="/static/09f488bdb6d0d546d1e0c74279c53e8d/30592/pubsub.png"
+ srcset="/static/09f488bdb6d0d546d1e0c74279c53e8d/c26ae/pubsub.png 158w,
+/static/09f488bdb6d0d546d1e0c74279c53e8d/6bdcf/pubsub.png 315w,
+/static/09f488bdb6d0d546d1e0c74279c53e8d/30592/pubsub.png 321w"
+ sizes="(max-width: 321px) 100vw, 321px"
+ style="width:100%;height:100%;margin:0;vertical-align:middle;position:absolute;top:0;left:0;"
+ loading="lazy"
+ decoding="async"
+ />
+ </a>
+ </span>
+ <figcaption class="gatsby-resp-image-figcaption">A diagram with Teacher A subscribing to a subject tag that a document published by Teacher B is about, thus receiving notifications about the new resource.</figcaption>
+ </figure></p>
+<p>To be really useful, applications for subscribing to content should enable additional filters: subscribing to combinations of subjects (e.g. “Environment” & “Building and civil engineering”) or adding further filters on educational level, license type etc.</p>
+<h2>Advantages</h2>
+<p>This subject-oriented notification/subscription approach to content syndication on the web has many advantages.</p>
+<p><strong>Push instead of pull</strong>
+<br />
+With the push approach, you subscribe once and content comes in from different and new sources without the subscriber having to maintain a list of sources. Of course, quality control might become an issue. Thus, instead of whitelisting sources by administering a subscription list, one would practice blacklisting by filtering out sources that distribute spam or provide low-quality content.</p>
+<p><strong>Supporting web-wide publications</strong></p>
+<p>Being of the web, SkoHub supports publications residing anywhere on the web. While the repository-centric approach favours content in a repository that provides interfaces for harvesting, with SkoHub any web resource can make use of the notification mechanism. Thus, content producers can choose whichever tool or platform best fits their publishing needs, be it YouTube, a repository, hackmd.io or anything else. The only requirement is for publications to have a stable URL and, voilà, they can syndicate their content via KOS.</p>
+<p><strong>Knowledge organization systems are used to their full potential</strong>
+<br />
+This additional layer to the use of Knowledge Organization Systems makes them much more powerful (“KOS on steroids”) and attractive for potential users.</p>
+<p><strong>Encouraging creation and use of shared Knowledge Organization Systems across applications</strong>
+<br />
+In the German OER context it is a recurring theme that people wish everybody would use the same controlled vocabularies so that data exchange and aggregation would require less mapping. With a SkoHub infrastructure in place, there are strong additional incentives to move in this direction.</p>
+<p><strong>Incentive for content producers to add machine-readable descriptions</strong>
+<br />
+When subject indexing becomes tantamount to notifying interested parties about one’s resources, this creates a huge incentive for content producers to describe their resources with structured data that includes subject indexing.</p>
+<h2>SkoHub project scope</h2>
+<p>The SkoHub project has four deliverables. While working on the backend infrastructure for receiving and pushing notifications (skohub-pubsub), we also want to provide people with means to publish a controlled vocabulary along with inboxes (skohub-ssg), to link to subjects and send notifications (skohub-editor) and to subscribe to notifications in the browser (skohub-deck).</p>
+<p><strong>skohub-pubsub: Inboxes and subscriptions</strong>
+<br />
+Code: <a href="https://github.com/hbz/skohub-pubsub">https://github.com/hbz/skohub-pubsub</a>
+<br />
+This part provides the SkoHub core infrastructure, setting up basic inboxes for subjects plus the ability to subscribe to a subject and receive a push for each new notification.</p>
+<p><strong>skohub-ssg: Static site generator for Simple Knowledge Organization Systems</strong>
+<br />
+Code: <a href="https://github.com/hbz/skohub-ssg">https://github.com/hbz/skohub-ssg</a>
+<br />
+This part of the project covers the need to easily publish a controlled vocabulary as a SKOS file, with a basic lookup API and a nice HTML view including links to an inbox for each subject.</p>
+<p><strong>skohub-editor: Describing & linking learning resources, sending notifications</strong>
+<br />
+Code: <a href="https://github.com/hbz/skohub-editor">https://github.com/hbz/skohub-editor</a>
+<br />
+The editor will run in the browser and enable structured description of educational resources published anywhere on the web. It includes validation of the entered content for each field and lookup of controlled values via the API provided by skohub-ssg.</p>
+<p><strong>skohub-deck: Browser-based subscription to subjects</strong>
+<br />
+Code: <a href="https://github.com/hbz/skohub-deck">https://github.com/hbz/skohub-deck</a>
+<br />
+The SkoHub deck is a proof of concept to show that the technologies developed actually work. It enables people to subscribe to notifications for specific subjects in the browser. The incoming notifications will be shown in a Tweetdeck-like interface.</p>
+<h2>Outlook</h2>
+<p>The project will be completed by the end of 2019. We intend to provide updates about our progress along the way. Next up, we will explain the technical architecture in more detail, expanding on our use of social web protocols. Furthermore, we will provide updates on the development status of the project.</p>
+<hr>
+<p>*<small> Note that while SkoHub has clear similarities with the “Information-Sharing Pipeline” envisioned in Ilik and Koster 2019 regarding the use of social web protocols on authority data, there is also a fundamental difference: While Ilik and Koster are talking about sharing <em>updates</em> of authority entries themselves (e.g. receiving updates for a person profile to be considered for inclusion in one’s own authority file), SkoHub is about sharing new <em>links</em> to an entry in an authority file or other controlled vocabulary.</small></p>
+<h2>References</h2>
+<p>Van de Sompel, Herbert / Nelson, Michael L. (2015): Reminiscing About 15 Years of Interoperability Efforts. D-Lib Magazine 21, no. 11/12. DOI: <a href="https://doi.org/10.1045/november2015-vandesompel">10.1045/november2015-vandesompel</a></p>
+<p>Ilik, Violeta / Koster, Lukas (2019): Information-Sharing Pipeline, The Serials Librarian, DOI: <a href="https://doi.org/10.1080/0361526X.2019.1583045">10.1080/0361526X.2019.1583045</a>. Preprint: <a href="https://doi.org/10.31219/osf.io/hbwf8">https://doi.org/10.31219/osf.io/hbwf8</a></p>
\ No newline at end of file
diff --git a/static/09b20404ba79837614549b8eb52ee6d1/092ed/repo-approach.png b/static/09b20404ba79837614549b8eb52ee6d1/092ed/repo-approach.png
new file mode 100644
index 0000000..2e1c64b
Binary files /dev/null and b/static/09b20404ba79837614549b8eb52ee6d1/092ed/repo-approach.png differ
diff --git a/static/09b20404ba79837614549b8eb52ee6d1/6bdcf/repo-approach.png b/static/09b20404ba79837614549b8eb52ee6d1/6bdcf/repo-approach.png
new file mode 100644
index 0000000..c45db98
Binary files /dev/null and b/static/09b20404ba79837614549b8eb52ee6d1/6bdcf/repo-approach.png differ
diff --git a/static/09b20404ba79837614549b8eb52ee6d1/c26ae/repo-approach.png b/static/09b20404ba79837614549b8eb52ee6d1/c26ae/repo-approach.png
new file mode 100644
index 0000000..90d3c8f
Binary files /dev/null and b/static/09b20404ba79837614549b8eb52ee6d1/c26ae/repo-approach.png differ
diff --git a/static/09f488bdb6d0d546d1e0c74279c53e8d/30592/pubsub.png b/static/09f488bdb6d0d546d1e0c74279c53e8d/30592/pubsub.png
new file mode 100644
index 0000000..831a3c1
Binary files /dev/null and b/static/09f488bdb6d0d546d1e0c74279c53e8d/30592/pubsub.png differ
diff --git a/static/09f488bdb6d0d546d1e0c74279c53e8d/6bdcf/pubsub.png b/static/09f488bdb6d0d546d1e0c74279c53e8d/6bdcf/pubsub.png
new file mode 100644
index 0000000..00e4dd7
Binary files /dev/null and b/static/09f488bdb6d0d546d1e0c74279c53e8d/6bdcf/pubsub.png differ
diff --git a/static/09f488bdb6d0d546d1e0c74279c53e8d/c26ae/pubsub.png b/static/09f488bdb6d0d546d1e0c74279c53e8d/c26ae/pubsub.png
new file mode 100644
index 0000000..57969bd
Binary files /dev/null and b/static/09f488bdb6d0d546d1e0c74279c53e8d/c26ae/pubsub.png differ
diff --git a/static/0c9af9bcdba57992a010aab21454ead2/6bdcf/configure-extension.png b/static/0c9af9bcdba57992a010aab21454ead2/6bdcf/configure-extension.png
new file mode 100644
index 0000000..965027a
Binary files /dev/null and b/static/0c9af9bcdba57992a010aab21454ead2/6bdcf/configure-extension.png differ
diff --git a/static/0c9af9bcdba57992a010aab21454ead2/c26ae/configure-extension.png b/static/0c9af9bcdba57992a010aab21454ead2/c26ae/configure-extension.png
new file mode 100644
index 0000000..923eb9c
Binary files /dev/null and b/static/0c9af9bcdba57992a010aab21454ead2/c26ae/configure-extension.png differ
diff --git a/static/0c9af9bcdba57992a010aab21454ead2/c391c/configure-extension.png b/static/0c9af9bcdba57992a010aab21454ead2/c391c/configure-extension.png
new file mode 100644
index 0000000..2f132fc
Binary files /dev/null and b/static/0c9af9bcdba57992a010aab21454ead2/c391c/configure-extension.png differ
diff --git a/static/0c9af9bcdba57992a010aab21454ead2/f058b/configure-extension.png b/static/0c9af9bcdba57992a010aab21454ead2/f058b/configure-extension.png
new file mode 100644
index 0000000..62b7ff0
Binary files /dev/null and b/static/0c9af9bcdba57992a010aab21454ead2/f058b/configure-extension.png differ
diff --git a/static/158048ebfeeb80ac9314a6beda41e5d6/3c5de/toot.png b/static/158048ebfeeb80ac9314a6beda41e5d6/3c5de/toot.png
new file mode 100644
index 0000000..fb297cd
Binary files /dev/null and b/static/158048ebfeeb80ac9314a6beda41e5d6/3c5de/toot.png differ
diff --git a/static/158048ebfeeb80ac9314a6beda41e5d6/6bdcf/toot.png b/static/158048ebfeeb80ac9314a6beda41e5d6/6bdcf/toot.png
new file mode 100644
index 0000000..9069c2a
Binary files /dev/null and b/static/158048ebfeeb80ac9314a6beda41e5d6/6bdcf/toot.png differ
diff --git a/static/158048ebfeeb80ac9314a6beda41e5d6/c26ae/toot.png b/static/158048ebfeeb80ac9314a6beda41e5d6/c26ae/toot.png
new file mode 100644
index 0000000..65fd5b4
Binary files /dev/null and b/static/158048ebfeeb80ac9314a6beda41e5d6/c26ae/toot.png differ
diff --git a/static/15d14a94c2569955b4f41228f46ab1ec/0b533/skohub-ssg-filter.png b/static/15d14a94c2569955b4f41228f46ab1ec/0b533/skohub-ssg-filter.png
new file mode 100644
index 0000000..e1f4ffe
Binary files /dev/null and b/static/15d14a94c2569955b4f41228f46ab1ec/0b533/skohub-ssg-filter.png differ
diff --git a/static/15d14a94c2569955b4f41228f46ab1ec/6bdcf/skohub-ssg-filter.png b/static/15d14a94c2569955b4f41228f46ab1ec/6bdcf/skohub-ssg-filter.png
new file mode 100644
index 0000000..da6b133
Binary files /dev/null and b/static/15d14a94c2569955b4f41228f46ab1ec/6bdcf/skohub-ssg-filter.png differ
diff --git a/static/15d14a94c2569955b4f41228f46ab1ec/c26ae/skohub-ssg-filter.png b/static/15d14a94c2569955b4f41228f46ab1ec/c26ae/skohub-ssg-filter.png
new file mode 100644
index 0000000..cd73520
Binary files /dev/null and b/static/15d14a94c2569955b4f41228f46ab1ec/c26ae/skohub-ssg-filter.png differ
diff --git a/static/1ba989cdd764d0e20d0886cbee808596/47ff6/or2.png b/static/1ba989cdd764d0e20d0886cbee808596/47ff6/or2.png
new file mode 100644
index 0000000..ffacca6
Binary files /dev/null and b/static/1ba989cdd764d0e20d0886cbee808596/47ff6/or2.png differ
diff --git a/static/1ba989cdd764d0e20d0886cbee808596/6bdcf/or2.png b/static/1ba989cdd764d0e20d0886cbee808596/6bdcf/or2.png
new file mode 100644
index 0000000..2598089
Binary files /dev/null and b/static/1ba989cdd764d0e20d0886cbee808596/6bdcf/or2.png differ
diff --git a/static/1ba989cdd764d0e20d0886cbee808596/c26ae/or2.png b/static/1ba989cdd764d0e20d0886cbee808596/c26ae/or2.png
new file mode 100644
index 0000000..3da027f
Binary files /dev/null and b/static/1ba989cdd764d0e20d0886cbee808596/c26ae/or2.png differ
diff --git a/static/1ba989cdd764d0e20d0886cbee808596/f058b/or2.png b/static/1ba989cdd764d0e20d0886cbee808596/f058b/or2.png
new file mode 100644
index 0000000..54a4041
Binary files /dev/null and b/static/1ba989cdd764d0e20d0886cbee808596/f058b/or2.png differ
diff --git a/static/1f4f72b3e6e880fe043b2587a7b99a98/6bdcf/pushing-notification.png b/static/1f4f72b3e6e880fe043b2587a7b99a98/6bdcf/pushing-notification.png
new file mode 100644
index 0000000..7911fba
Binary files /dev/null and b/static/1f4f72b3e6e880fe043b2587a7b99a98/6bdcf/pushing-notification.png differ
diff --git a/static/1f4f72b3e6e880fe043b2587a7b99a98/a9b70/pushing-notification.png b/static/1f4f72b3e6e880fe043b2587a7b99a98/a9b70/pushing-notification.png
new file mode 100644
index 0000000..e42af9b
Binary files /dev/null and b/static/1f4f72b3e6e880fe043b2587a7b99a98/a9b70/pushing-notification.png differ
diff --git a/static/1f4f72b3e6e880fe043b2587a7b99a98/c26ae/pushing-notification.png b/static/1f4f72b3e6e880fe043b2587a7b99a98/c26ae/pushing-notification.png
new file mode 100644
index 0000000..dc4a738
Binary files /dev/null and b/static/1f4f72b3e6e880fe043b2587a7b99a98/c26ae/pushing-notification.png differ
diff --git a/static/23e9cd5c100418dc7d1c27eed6901f62/29114/slides.png b/static/23e9cd5c100418dc7d1c27eed6901f62/29114/slides.png
new file mode 100644
index 0000000..1465778
Binary files /dev/null and b/static/23e9cd5c100418dc7d1c27eed6901f62/29114/slides.png differ
diff --git a/static/23e9cd5c100418dc7d1c27eed6901f62/40601/slides.png b/static/23e9cd5c100418dc7d1c27eed6901f62/40601/slides.png
new file mode 100644
index 0000000..c4e0e91
Binary files /dev/null and b/static/23e9cd5c100418dc7d1c27eed6901f62/40601/slides.png differ
diff --git a/static/23e9cd5c100418dc7d1c27eed6901f62/6bdcf/slides.png b/static/23e9cd5c100418dc7d1c27eed6901f62/6bdcf/slides.png
new file mode 100644
index 0000000..fb1aa29
Binary files /dev/null and b/static/23e9cd5c100418dc7d1c27eed6901f62/6bdcf/slides.png differ
diff --git a/static/23e9cd5c100418dc7d1c27eed6901f62/78612/slides.png b/static/23e9cd5c100418dc7d1c27eed6901f62/78612/slides.png
new file mode 100644
index 0000000..d375b5b
Binary files /dev/null and b/static/23e9cd5c100418dc7d1c27eed6901f62/78612/slides.png differ
diff --git a/static/23e9cd5c100418dc7d1c27eed6901f62/c26ae/slides.png b/static/23e9cd5c100418dc7d1c27eed6901f62/c26ae/slides.png
new file mode 100644
index 0000000..6b3f3e3
Binary files /dev/null and b/static/23e9cd5c100418dc7d1c27eed6901f62/c26ae/slides.png differ
diff --git a/static/23e9cd5c100418dc7d1c27eed6901f62/f058b/slides.png b/static/23e9cd5c100418dc7d1c27eed6901f62/f058b/slides.png
new file mode 100644
index 0000000..3871cd3
Binary files /dev/null and b/static/23e9cd5c100418dc7d1c27eed6901f62/f058b/slides.png differ
diff --git a/static/2b8d11bca715d365580506508d352f7a/63ec5/extension-icon.png b/static/2b8d11bca715d365580506508d352f7a/63ec5/extension-icon.png
new file mode 100644
index 0000000..9f595a1
Binary files /dev/null and b/static/2b8d11bca715d365580506508d352f7a/63ec5/extension-icon.png differ
diff --git a/static/2b8d11bca715d365580506508d352f7a/6bdcf/extension-icon.png b/static/2b8d11bca715d365580506508d352f7a/6bdcf/extension-icon.png
new file mode 100644
index 0000000..be54bb3
Binary files /dev/null and b/static/2b8d11bca715d365580506508d352f7a/6bdcf/extension-icon.png differ
diff --git a/static/2b8d11bca715d365580506508d352f7a/c26ae/extension-icon.png b/static/2b8d11bca715d365580506508d352f7a/c26ae/extension-icon.png
new file mode 100644
index 0000000..474c7b0
Binary files /dev/null and b/static/2b8d11bca715d365580506508d352f7a/c26ae/extension-icon.png differ
diff --git a/static/2b8d11bca715d365580506508d352f7a/f058b/extension-icon.png b/static/2b8d11bca715d365580506508d352f7a/f058b/extension-icon.png
new file mode 100644
index 0000000..e3651ac
Binary files /dev/null and b/static/2b8d11bca715d365580506508d352f7a/f058b/extension-icon.png differ
diff --git a/static/3180b8faaa30763f57cf0aeda0f8a2e3/6f3f2/shacl-logo.png b/static/3180b8faaa30763f57cf0aeda0f8a2e3/6f3f2/shacl-logo.png
new file mode 100644
index 0000000..5bfaeda
Binary files /dev/null and b/static/3180b8faaa30763f57cf0aeda0f8a2e3/6f3f2/shacl-logo.png differ
diff --git a/static/3180b8faaa30763f57cf0aeda0f8a2e3/c26ae/shacl-logo.png b/static/3180b8faaa30763f57cf0aeda0f8a2e3/c26ae/shacl-logo.png
new file mode 100644
index 0000000..293e44e
Binary files /dev/null and b/static/3180b8faaa30763f57cf0aeda0f8a2e3/c26ae/shacl-logo.png differ
diff --git a/static/3400dd15c8f9dd3db28fa4ee867fedff/42d54/or5.png b/static/3400dd15c8f9dd3db28fa4ee867fedff/42d54/or5.png
new file mode 100644
index 0000000..b6ee4db
Binary files /dev/null and b/static/3400dd15c8f9dd3db28fa4ee867fedff/42d54/or5.png differ
diff --git a/static/3400dd15c8f9dd3db28fa4ee867fedff/6bdcf/or5.png b/static/3400dd15c8f9dd3db28fa4ee867fedff/6bdcf/or5.png
new file mode 100644
index 0000000..c17fdcb
Binary files /dev/null and b/static/3400dd15c8f9dd3db28fa4ee867fedff/6bdcf/or5.png differ
diff --git a/static/3400dd15c8f9dd3db28fa4ee867fedff/c26ae/or5.png b/static/3400dd15c8f9dd3db28fa4ee867fedff/c26ae/or5.png
new file mode 100644
index 0000000..bbab16a
Binary files /dev/null and b/static/3400dd15c8f9dd3db28fa4ee867fedff/c26ae/or5.png differ
diff --git a/static/3400dd15c8f9dd3db28fa4ee867fedff/f058b/or5.png b/static/3400dd15c8f9dd3db28fa4ee867fedff/f058b/or5.png
new file mode 100644
index 0000000..f7269d8
Binary files /dev/null and b/static/3400dd15c8f9dd3db28fa4ee867fedff/f058b/or5.png differ
diff --git a/static/34c8419d626f86c2e8a6f15ce75998a4/6bdcf/use_gh_pages_website.png b/static/34c8419d626f86c2e8a6f15ce75998a4/6bdcf/use_gh_pages_website.png
new file mode 100644
index 0000000..ecb7fc1
Binary files /dev/null and b/static/34c8419d626f86c2e8a6f15ce75998a4/6bdcf/use_gh_pages_website.png differ
diff --git a/static/34c8419d626f86c2e8a6f15ce75998a4/c26ae/use_gh_pages_website.png b/static/34c8419d626f86c2e8a6f15ce75998a4/c26ae/use_gh_pages_website.png
new file mode 100644
index 0000000..f20d463
Binary files /dev/null and b/static/34c8419d626f86c2e8a6f15ce75998a4/c26ae/use_gh_pages_website.png differ
diff --git a/static/34c8419d626f86c2e8a6f15ce75998a4/f058b/use_gh_pages_website.png b/static/34c8419d626f86c2e8a6f15ce75998a4/f058b/use_gh_pages_website.png
new file mode 100644
index 0000000..9ff3472
Binary files /dev/null and b/static/34c8419d626f86c2e8a6f15ce75998a4/f058b/use_gh_pages_website.png differ
diff --git a/static/34c8419d626f86c2e8a6f15ce75998a4/f0685/use_gh_pages_website.png b/static/34c8419d626f86c2e8a6f15ce75998a4/f0685/use_gh_pages_website.png
new file mode 100644
index 0000000..0187427
Binary files /dev/null and b/static/34c8419d626f86c2e8a6f15ce75998a4/f0685/use_gh_pages_website.png differ
diff --git a/static/384b141891ff0a44d576e0792c5562e3/40601/skohub-vocabs-screenshot.png b/static/384b141891ff0a44d576e0792c5562e3/40601/skohub-vocabs-screenshot.png
new file mode 100644
index 0000000..bd951fd
Binary files /dev/null and b/static/384b141891ff0a44d576e0792c5562e3/40601/skohub-vocabs-screenshot.png differ
diff --git a/static/384b141891ff0a44d576e0792c5562e3/6bdcf/skohub-vocabs-screenshot.png b/static/384b141891ff0a44d576e0792c5562e3/6bdcf/skohub-vocabs-screenshot.png
new file mode 100644
index 0000000..42d6528
Binary files /dev/null and b/static/384b141891ff0a44d576e0792c5562e3/6bdcf/skohub-vocabs-screenshot.png differ
diff --git a/static/384b141891ff0a44d576e0792c5562e3/78612/skohub-vocabs-screenshot.png b/static/384b141891ff0a44d576e0792c5562e3/78612/skohub-vocabs-screenshot.png
new file mode 100644
index 0000000..54598d8
Binary files /dev/null and b/static/384b141891ff0a44d576e0792c5562e3/78612/skohub-vocabs-screenshot.png differ
diff --git a/static/384b141891ff0a44d576e0792c5562e3/b880f/skohub-vocabs-screenshot.png b/static/384b141891ff0a44d576e0792c5562e3/b880f/skohub-vocabs-screenshot.png
new file mode 100644
index 0000000..43b288b
Binary files /dev/null and b/static/384b141891ff0a44d576e0792c5562e3/b880f/skohub-vocabs-screenshot.png differ
diff --git a/static/384b141891ff0a44d576e0792c5562e3/c26ae/skohub-vocabs-screenshot.png b/static/384b141891ff0a44d576e0792c5562e3/c26ae/skohub-vocabs-screenshot.png
new file mode 100644
index 0000000..fedee64
Binary files /dev/null and b/static/384b141891ff0a44d576e0792c5562e3/c26ae/skohub-vocabs-screenshot.png differ
diff --git a/static/384b141891ff0a44d576e0792c5562e3/f058b/skohub-vocabs-screenshot.png b/static/384b141891ff0a44d576e0792c5562e3/f058b/skohub-vocabs-screenshot.png
new file mode 100644
index 0000000..a6654f5
Binary files /dev/null and b/static/384b141891ff0a44d576e0792c5562e3/f058b/skohub-vocabs-screenshot.png differ
diff --git a/static/492b5e6e95c98c3719cf62258f6ff200/29114/describing.png b/static/492b5e6e95c98c3719cf62258f6ff200/29114/describing.png
new file mode 100644
index 0000000..974a4ed
Binary files /dev/null and b/static/492b5e6e95c98c3719cf62258f6ff200/29114/describing.png differ
diff --git a/static/492b5e6e95c98c3719cf62258f6ff200/40601/describing.png b/static/492b5e6e95c98c3719cf62258f6ff200/40601/describing.png
new file mode 100644
index 0000000..6e7f381
Binary files /dev/null and b/static/492b5e6e95c98c3719cf62258f6ff200/40601/describing.png differ
diff --git a/static/492b5e6e95c98c3719cf62258f6ff200/6bdcf/describing.png b/static/492b5e6e95c98c3719cf62258f6ff200/6bdcf/describing.png
new file mode 100644
index 0000000..df4bfd8
Binary files /dev/null and b/static/492b5e6e95c98c3719cf62258f6ff200/6bdcf/describing.png differ
diff --git a/static/492b5e6e95c98c3719cf62258f6ff200/78612/describing.png b/static/492b5e6e95c98c3719cf62258f6ff200/78612/describing.png
new file mode 100644
index 0000000..3a7f650
Binary files /dev/null and b/static/492b5e6e95c98c3719cf62258f6ff200/78612/describing.png differ
diff --git a/static/492b5e6e95c98c3719cf62258f6ff200/c26ae/describing.png b/static/492b5e6e95c98c3719cf62258f6ff200/c26ae/describing.png
new file mode 100644
index 0000000..730a47d
Binary files /dev/null and b/static/492b5e6e95c98c3719cf62258f6ff200/c26ae/describing.png differ
diff --git a/static/492b5e6e95c98c3719cf62258f6ff200/f058b/describing.png b/static/492b5e6e95c98c3719cf62258f6ff200/f058b/describing.png
new file mode 100644
index 0000000..3bbf1d5
Binary files /dev/null and b/static/492b5e6e95c98c3719cf62258f6ff200/f058b/describing.png differ
diff --git a/static/4972457dc61a693775b415a5d0d30d8b/3c492/upload-success.png b/static/4972457dc61a693775b415a5d0d30d8b/3c492/upload-success.png
new file mode 100644
index 0000000..239c788
Binary files /dev/null and b/static/4972457dc61a693775b415a5d0d30d8b/3c492/upload-success.png differ
diff --git a/static/4972457dc61a693775b415a5d0d30d8b/40601/upload-success.png b/static/4972457dc61a693775b415a5d0d30d8b/40601/upload-success.png
new file mode 100644
index 0000000..fa02189
Binary files /dev/null and b/static/4972457dc61a693775b415a5d0d30d8b/40601/upload-success.png differ
diff --git a/static/4972457dc61a693775b415a5d0d30d8b/6bdcf/upload-success.png b/static/4972457dc61a693775b415a5d0d30d8b/6bdcf/upload-success.png
new file mode 100644
index 0000000..85ee36a
Binary files /dev/null and b/static/4972457dc61a693775b415a5d0d30d8b/6bdcf/upload-success.png differ
diff --git a/static/4972457dc61a693775b415a5d0d30d8b/78612/upload-success.png b/static/4972457dc61a693775b415a5d0d30d8b/78612/upload-success.png
new file mode 100644
index 0000000..8083746
Binary files /dev/null and b/static/4972457dc61a693775b415a5d0d30d8b/78612/upload-success.png differ
diff --git a/static/4972457dc61a693775b415a5d0d30d8b/c26ae/upload-success.png b/static/4972457dc61a693775b415a5d0d30d8b/c26ae/upload-success.png
new file mode 100644
index 0000000..382adc9
Binary files /dev/null and b/static/4972457dc61a693775b415a5d0d30d8b/c26ae/upload-success.png differ
diff --git a/static/4972457dc61a693775b415a5d0d30d8b/f058b/upload-success.png b/static/4972457dc61a693775b415a5d0d30d8b/f058b/upload-success.png
new file mode 100644
index 0000000..98e5823
Binary files /dev/null and b/static/4972457dc61a693775b415a5d0d30d8b/f058b/upload-success.png differ
diff --git a/static/4bae802167f80dcb59a36f4d3ec553ed/40601/activate_action.png b/static/4bae802167f80dcb59a36f4d3ec553ed/40601/activate_action.png
new file mode 100644
index 0000000..5dfd30c
Binary files /dev/null and b/static/4bae802167f80dcb59a36f4d3ec553ed/40601/activate_action.png differ
diff --git a/static/4bae802167f80dcb59a36f4d3ec553ed/6bdcf/activate_action.png b/static/4bae802167f80dcb59a36f4d3ec553ed/6bdcf/activate_action.png
new file mode 100644
index 0000000..1361668
Binary files /dev/null and b/static/4bae802167f80dcb59a36f4d3ec553ed/6bdcf/activate_action.png differ
diff --git a/static/4bae802167f80dcb59a36f4d3ec553ed/c26ae/activate_action.png b/static/4bae802167f80dcb59a36f4d3ec553ed/c26ae/activate_action.png
new file mode 100644
index 0000000..fc38aba
Binary files /dev/null and b/static/4bae802167f80dcb59a36f4d3ec553ed/c26ae/activate_action.png differ
diff --git a/static/4bae802167f80dcb59a36f4d3ec553ed/ecf19/activate_action.png b/static/4bae802167f80dcb59a36f4d3ec553ed/ecf19/activate_action.png
new file mode 100644
index 0000000..6a422aa
Binary files /dev/null and b/static/4bae802167f80dcb59a36f4d3ec553ed/ecf19/activate_action.png differ
diff --git a/static/4bae802167f80dcb59a36f4d3ec553ed/f058b/activate_action.png b/static/4bae802167f80dcb59a36f4d3ec553ed/f058b/activate_action.png
new file mode 100644
index 0000000..26e14b9
Binary files /dev/null and b/static/4bae802167f80dcb59a36f4d3ec553ed/f058b/activate_action.png differ
diff --git a/static/5617f981b723e476e73f28ec316dd720/29114/slides.png b/static/5617f981b723e476e73f28ec316dd720/29114/slides.png
new file mode 100644
index 0000000..61ea9ab
Binary files /dev/null and b/static/5617f981b723e476e73f28ec316dd720/29114/slides.png differ
diff --git a/static/5617f981b723e476e73f28ec316dd720/40601/slides.png b/static/5617f981b723e476e73f28ec316dd720/40601/slides.png
new file mode 100644
index 0000000..a55a849
Binary files /dev/null and b/static/5617f981b723e476e73f28ec316dd720/40601/slides.png differ
diff --git a/static/5617f981b723e476e73f28ec316dd720/6bdcf/slides.png b/static/5617f981b723e476e73f28ec316dd720/6bdcf/slides.png
new file mode 100644
index 0000000..3596648
Binary files /dev/null and b/static/5617f981b723e476e73f28ec316dd720/6bdcf/slides.png differ
diff --git a/static/5617f981b723e476e73f28ec316dd720/78612/slides.png b/static/5617f981b723e476e73f28ec316dd720/78612/slides.png
new file mode 100644
index 0000000..caf9e52
Binary files /dev/null and b/static/5617f981b723e476e73f28ec316dd720/78612/slides.png differ
diff --git a/static/5617f981b723e476e73f28ec316dd720/c26ae/slides.png b/static/5617f981b723e476e73f28ec316dd720/c26ae/slides.png
new file mode 100644
index 0000000..be603ac
Binary files /dev/null and b/static/5617f981b723e476e73f28ec316dd720/c26ae/slides.png differ
diff --git a/static/5617f981b723e476e73f28ec316dd720/f058b/slides.png b/static/5617f981b723e476e73f28ec316dd720/f058b/slides.png
new file mode 100644
index 0000000..7d85a8e
Binary files /dev/null and b/static/5617f981b723e476e73f28ec316dd720/f058b/slides.png differ
diff --git a/static/5d638e5a214e95beb550c5ddf028b8fa/01dae/check-webhook-response.png b/static/5d638e5a214e95beb550c5ddf028b8fa/01dae/check-webhook-response.png
new file mode 100644
index 0000000..70592e1
Binary files /dev/null and b/static/5d638e5a214e95beb550c5ddf028b8fa/01dae/check-webhook-response.png differ
diff --git a/static/5d638e5a214e95beb550c5ddf028b8fa/6bdcf/check-webhook-response.png b/static/5d638e5a214e95beb550c5ddf028b8fa/6bdcf/check-webhook-response.png
new file mode 100644
index 0000000..e35b5fe
Binary files /dev/null and b/static/5d638e5a214e95beb550c5ddf028b8fa/6bdcf/check-webhook-response.png differ
diff --git a/static/5d638e5a214e95beb550c5ddf028b8fa/c26ae/check-webhook-response.png b/static/5d638e5a214e95beb550c5ddf028b8fa/c26ae/check-webhook-response.png
new file mode 100644
index 0000000..7ac8039
Binary files /dev/null and b/static/5d638e5a214e95beb550c5ddf028b8fa/c26ae/check-webhook-response.png differ
diff --git a/static/5d638e5a214e95beb550c5ddf028b8fa/f058b/check-webhook-response.png b/static/5d638e5a214e95beb550c5ddf028b8fa/f058b/check-webhook-response.png
new file mode 100644
index 0000000..5186108
Binary files /dev/null and b/static/5d638e5a214e95beb550c5ddf028b8fa/f058b/check-webhook-response.png differ
diff --git a/static/6417112bfbd5cb40999e9ce1d97b7b82/6bdcf/auto-suggestion-from-skos-vocab.png b/static/6417112bfbd5cb40999e9ce1d97b7b82/6bdcf/auto-suggestion-from-skos-vocab.png
new file mode 100644
index 0000000..5a4b78a
Binary files /dev/null and b/static/6417112bfbd5cb40999e9ce1d97b7b82/6bdcf/auto-suggestion-from-skos-vocab.png differ
diff --git a/static/6417112bfbd5cb40999e9ce1d97b7b82/c26ae/auto-suggestion-from-skos-vocab.png b/static/6417112bfbd5cb40999e9ce1d97b7b82/c26ae/auto-suggestion-from-skos-vocab.png
new file mode 100644
index 0000000..093a4ea
Binary files /dev/null and b/static/6417112bfbd5cb40999e9ce1d97b7b82/c26ae/auto-suggestion-from-skos-vocab.png differ
diff --git a/static/6417112bfbd5cb40999e9ce1d97b7b82/f7a31/auto-suggestion-from-skos-vocab.png b/static/6417112bfbd5cb40999e9ce1d97b7b82/f7a31/auto-suggestion-from-skos-vocab.png
new file mode 100644
index 0000000..c1a2f2c
Binary files /dev/null and b/static/6417112bfbd5cb40999e9ce1d97b7b82/f7a31/auto-suggestion-from-skos-vocab.png differ
diff --git a/static/70707507ae9efdd0f829cba5b111f2fd/4ee7f/json-preview.png b/static/70707507ae9efdd0f829cba5b111f2fd/4ee7f/json-preview.png
new file mode 100644
index 0000000..9e8e657
Binary files /dev/null and b/static/70707507ae9efdd0f829cba5b111f2fd/4ee7f/json-preview.png differ
diff --git a/static/70707507ae9efdd0f829cba5b111f2fd/6bdcf/json-preview.png b/static/70707507ae9efdd0f829cba5b111f2fd/6bdcf/json-preview.png
new file mode 100644
index 0000000..e7ad270
Binary files /dev/null and b/static/70707507ae9efdd0f829cba5b111f2fd/6bdcf/json-preview.png differ
diff --git a/static/70707507ae9efdd0f829cba5b111f2fd/c26ae/json-preview.png b/static/70707507ae9efdd0f829cba5b111f2fd/c26ae/json-preview.png
new file mode 100644
index 0000000..566656d
Binary files /dev/null and b/static/70707507ae9efdd0f829cba5b111f2fd/c26ae/json-preview.png differ
diff --git a/static/7348d68aaa1145cecf691f6ae17749ca/6bdcf/subject-indexing-with-uris-and-inbox.png b/static/7348d68aaa1145cecf691f6ae17749ca/6bdcf/subject-indexing-with-uris-and-inbox.png
new file mode 100644
index 0000000..b9000a0
Binary files /dev/null and b/static/7348d68aaa1145cecf691f6ae17749ca/6bdcf/subject-indexing-with-uris-and-inbox.png differ
diff --git a/static/7348d68aaa1145cecf691f6ae17749ca/a9b70/subject-indexing-with-uris-and-inbox.png b/static/7348d68aaa1145cecf691f6ae17749ca/a9b70/subject-indexing-with-uris-and-inbox.png
new file mode 100644
index 0000000..ef3b750
Binary files /dev/null and b/static/7348d68aaa1145cecf691f6ae17749ca/a9b70/subject-indexing-with-uris-and-inbox.png differ
diff --git a/static/7348d68aaa1145cecf691f6ae17749ca/c26ae/subject-indexing-with-uris-and-inbox.png b/static/7348d68aaa1145cecf691f6ae17749ca/c26ae/subject-indexing-with-uris-and-inbox.png
new file mode 100644
index 0000000..a784630
Binary files /dev/null and b/static/7348d68aaa1145cecf691f6ae17749ca/c26ae/subject-indexing-with-uris-and-inbox.png differ
diff --git a/static/7ac0ab406412da9d354db3ca10b81ac9/b1a44/subscribe.png b/static/7ac0ab406412da9d354db3ca10b81ac9/b1a44/subscribe.png
var(--spacing-5)}.global-wrapper[data-is-root-path=true] .bio{margin-bottom:var(--spacing-12)}.global-header{margin-bottom:var(--spacing-3)}.main-heading{font-size:var(--fontSize-7);margin:0}.blog-posts{list-style:none;margin:var(--spacing-0) var(--spacing-0) var(--spacing-32) var(--spacing-0)}.is-blog-post{background:linear-gradient(180deg,#ebebeb,#fff 70%);border-radius:30px;padding:var(--spacing-8)}.post-list-item{margin-bottom:var(--spacing-8)}.post-list-item p{margin-bottom:var(--spacing-0)}.post-list-item h2{color:var(--color-primary);font-size:var(--fontSize-4);margin-bottom:var(--spacing-2);margin-top:var(--spacing-0)}.post-list-item header{margin-bottom:var(--spacing-4)}.header-link{font-family:var(--font-heading);font-size:var(--fontSize-2);font-weight:var(--fontWeight-bold);text-decoration:none}.bio{display:flex;margin-bottom:var(--spacing-8)}.bio p,.bio-avatar{margin-bottom:var(--spacing-0)}.bio-avatar{border-radius:100%;margin-right:var(--spacing-4);min-width:50px}.blog-post{margin:var(--spacing-12) var(--spacing-0) var(--spacing-0) var(--spacing-0)}.blog-post,.main-content{background:linear-gradient(180deg,#ebebeb,#fff 70%);border-radius:30px;padding:var(--spacing-8)}.main-content{margin:var(--spacing-12) var(--spacing-0) var(--spacing-32) var(--spacing-0)}.blog-post header h1,.main-content h1{margin:var(--spacing-0) var(--spacing-0) var(--spacing-4) var(--spacing-0)}.blog-post header p{font-family:var(--font-heading);font-size:var(--fontSize-2)}.blog-post-nav ul{margin:var(--spacing-0)}.gatsby-highlight{margin-bottom:var(--spacing-8)}.blog-post-nav{margin:var(--spacing-0) var(--spacing-0) var(--spacing-32) var(--spacing-0)}.blog-post-nav a{border:1px solid var(--color-primary);border-radius:30px;color:var(--color-primary);display:block;font-weight:var(--fontWeight-bold);padding:var(--spacing-2) var(--spacing-4);text-decoration:none}.blog-post-nav a:hover{border:1px solid var(--color-text-light);color:var(--color-text-light)}.wrapper-footer{bottom:0;left:50%;margin:var(--spacing-0) auto;max-width:var(--maxWidth-wrapper);padding:var(--spacing-10) var(--spacing-5) var(--spacing-0) var(--spacing-5);position:fixed;transform:translate(-50%);width:100%}.footer-navigation{background:var(--color-text-light);border-top-left-radius:30px;border-top-right-radius:30px}.footer-navigation ul{list-style:none;margin:var(--spacing-0);padding:var(--spacing-8) var(--spacing-0);text-align:center}.footer-navigation li{display:inline-block;margin:var(--spacing-2)}.footer-navigation li a{border-bottom:1px solid transparent;color:var(--color-white);text-decoration:none}.footer-navigation li a:hover{border-bottom:1px solid var(--color-white)}.svg-inline--fa{display:var(--fa-display,inline-block);height:1em;overflow:visible;vertical-align:-.125em}@media (max-width:42rem){blockquote{margin-left:var(--spacing-0);padding:var(--spacing-0) var(--spacing-0) var(--spacing-0) var(--spacing-4)}.skohub-title{font-size:var(--fontSize-5)}.footer-navigation ul{padding:var(--spacing-2) var(--spacing-0)}ol,ul{list-style-position:inside}}code[class*=language-],pre[class*=language-]{word-wrap:normal;background:none;color:#000;font-family:Consolas,Monaco,Andale Mono,Ubuntu Mono,monospace;font-size:1em;-webkit-hyphens:none;hyphens:none;line-height:1.5;tab-size:4;text-align:left;text-shadow:0 1px #fff;white-space:pre;word-break:normal;word-spacing:normal}code[class*=language-] ::selection,code[class*=language-]::selection,pre[class*=language-] 
::selection,pre[class*=language-]::selection{background:#b3d4fc;text-shadow:none}@media print{code[class*=language-],pre[class*=language-]{text-shadow:none}}pre[class*=language-]{margin:.5em 0;overflow:auto;padding:1em}:not(pre)>code[class*=language-],pre[class*=language-]{background:#f5f2f0}:not(pre)>code[class*=language-]{border-radius:.3em;padding:.1em;white-space:normal}.token.cdata,.token.comment,.token.doctype,.token.prolog{color:#708090}.token.punctuation{color:#999}.token.namespace{opacity:.7}.token.boolean,.token.constant,.token.deleted,.token.number,.token.property,.token.symbol,.token.tag{color:#905}.token.attr-name,.token.builtin,.token.char,.token.inserted,.token.selector,.token.string{color:#690}.language-css .token.string,.style .token.string,.token.entity,.token.operator,.token.url{background:hsla(0,0%,100%,.5);color:#9a6e3a}.token.atrule,.token.attr-value,.token.keyword{color:#07a}.token.class-name,.token.function{color:#dd4a68}.token.important,.token.regex,.token.variable{color:#e90}.token.bold,.token.important{font-weight:700}.token.italic{font-style:italic}.token.entity{cursor:help}
\ No newline at end of file