- isValid: For a graph G, a schema Sch and a fixed ShapeMap ism, isValid(G, Sch, ism) indicates that for every RDFnode/shapeLabel pair (n, sl) in ism, the node n satisfies the shape expression identified by sl.
- The latter is captured by the expression satisfies(n, s, G, Sch, completeTyping(G, Sch)).
+ isValid: For a graph G, a schema Sch and a fixed ShapeMap ism, isValid(G, Sch, ism) indicates that for every RDF node/shapeLabel pair (n, sl) in ism, the node n satisfies the shape expression identified by sl.
+ This is captured by the expression satisfies(n, s, G, Sch, completeTyping(G, Sch)), where s is the shape expression identified by sl.
The function satisfies is defined for every kind of shape expression.
The validation of an RDF graph G against a ShEx schema Sch is based on the existence of completeTyping(G, Sch).
- For an RDF graph G and a shapes schema Sch, a typing is a set of pairs of the form (n, s) where n is a node in G and s is a Shape that appears in some shape expression in the shapes mapping of Sch.
- A correct typing is a typing such that for every RDFnode/shape pair (n,s) in typing, matchesShape(n, s, G, Sch, typing) holds.
+ For an RDF graph G and a shapes schema Sch, a typing is a set of pairs of the form (n, s) where n is a node in G and s is a Shape that appears in some shape expression in Sch.shapes.
+ A correct typing is a typing such that for every RDF node/shape pair (n,s) in typing, matchesShape(n, s, G, Sch, typing) holds.
completeTyping(G, Sch) is a unique correct typing that exists for every graph and every ShEx schema that satisfies the schema requirements.
@@ -506,14 +517,14 @@
Validation Definition
- completeTypingOn(1, G, Sch) is the union of all correct typings that contain only RDFnode/shape pairs (n,s) with stratum(s) = 1;
+ completeTypingOn(1, G, Sch) is the union of all correct typings that contain only RDF node/shape pairs (n,s) with stratum(s) = 1;
for every i between 2 and k, completeTypingOn(i, G, Sch) is the union of all correct typings that:
- contain only RDFnode/shape pairs (n,s) with stratum(s) ≤ i
+ contain only RDF node/shape pairs (n,s) with stratum(s) ≤ i
- are equal to completeTypingOn(i-1, G, Sch) when restricted to their RDFnode/shape pairs (n1,s1) for which stratum(s1) < i.
+ are equal to completeTypingOn(i-1, G, Sch) when restricted to their RDF node/shape pairs (n1,s1) for which stratum(s1) < i.
This property is reminiscent of the use of stratified negation in Datalog.
- In order to decide isValid(Sch, G, m), it is sufficient to compute only a portion of completeTyping using an appropriate algorithm.
+ In order to decide isValid(Sch, G, m), it is sufficient to compute only a portion of the complete typing using an appropriate algorithm.
- A shape expression is composed of four kinds of objects combined with the algebraic operators And, Or and Not:
+ A shape expression is composed of four kinds of objects combined with the algebraic operators `And`, `Or` and `Not`:
- a node constraint (NodeConstraint) that defines the set of allowed values of a node.
+ A node constraint (NodeConstraint) defines the set of allowed values of a node.
These include specification of RDF node kind, literal datatype, XML String and numeric facets and enumeration of value sets.
- a shape constraint (Shape) that defines a constraint on the allowed neighbourhood of a node, that is, the allowed triples that contain this node as subject or object.
+ A shape constraint (Shape) defines a constraint on the allowed neighbourhood of a node, that is, the allowed triples that contain this node as subject or object.
- an external shape (ShapeExternal) which is an extension mechanism to externally define e.g. functional shapes or prohibitively large value sets.
+ An external shape (ShapeExternal) is an extension mechanism to externally define e.g. functional shapes or prohibitively large value sets.
In this ShapeOr's shapeExprs, "http://schema.example/#IssueShape" is a reference to the shape expression with the id "http://schema.example/#IssueShape".
-
+
+
+
Preliminary definitions
+
+
+ Consider a fixed schema `Sch`.
+
+
+
+ A schema MUST NOT contain any shapeExprLabel that has a negated reference to itself, either directly or transitively.
+ This is formalized by the requirement that the dependency graph of a schema MUST NOT have a cycle that traverses some negated reference.
+ *Is this (partially) redundant with the stratification?*
+
+
+
+ We construct a graph whose nodes are the shapeExprLabels that appear in `Sch`.
+ We say that a label `L` has a reference to a label `L'` if there exists a TripleConstraint `tc` in `tcs(def(L))` such that a shapeExprRef `L'` appears in tc.valueExpr, either directly or by traversing any ShapeAnd, ShapeOr, ShapeNot or tripleExprRef in the TripleConstraint's valueExpr, but without traversing shapeExprRefs.
+ A reference is a negated reference if:
+
+
+
+
an odd number of ShapeNot is traversed before reaching `tc`, or
+
an odd number of ShapeNot is traversed after tc.valueExpr, or
+
[TODO : something about extra]
+
+
+
+ The dependency graph ... as before, using negated reference.
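
The negation requirement can be checked directly on this graph. Below is a minimal sketch in Python, assuming the references defined above have been collected as `(source, target, is_negated)` triples (a hypothetical encoding; nothing in the text fixes one). A negated reference `L -> L'` lies on a cycle exactly when `L` is reachable from `L'`.

```python
from collections import defaultdict

def violates_negation_requirement(references):
    """True iff some cycle in the dependency graph traverses a negated
    reference.  `references` is a list of (source, target, is_negated)."""
    successors = defaultdict(set)
    for src, dst, _ in references:
        successors[src].add(dst)

    def reaches(start, goal):
        # Iterative depth-first search over the dependency graph.
        seen, stack = set(), [start]
        while stack:
            label = stack.pop()
            if label == goal:
                return True
            if label not in seen:
                seen.add(label)
                stack.extend(successors[label])
        return False

    # A negated reference L -> L' lies on a cycle iff L' reaches back to L.
    return any(neg and reaches(dst, src) for src, dst, neg in references)
```

For instance, `violates_negation_requirement([("A", "B", False), ("B", "A", True)])` is `True`: the cycle A → B → A traverses the negated reference B → A.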
+
+
- Semantics
+ Semantics
+
+
+ For a shape expression `se` we define its set of shapes `shapes(se)` recursively on the structure of `se`:
+
+
+
if `se` is a NodeConstraint, then `shapes(se) = emptyset`
+
if `se` is a Shape, then `shapes(se) = {se}`
+
if `se` is a ShapeNot, then `shapes(se) = shapes(se.shapeExpr)`
+
if `se` is ShapeAnd or ShapeOr, then `shapes(se)` is the union of the sets `shapes(se')` for all `se'` in `se.shapeExprs`
+
if `se` is a shapeExprRef with label `L`, then `shapes(se) = shapes(def(L))`
+
+
+
+
For a tripleExpr `te` we define its set of TripleConstraints `tcs(te)` recursively on the structure of `te`:
+
+
if `te` is a TripleConstraint, then `tcs(te)` is the singleton set `{te}`,
+
if `te` is OneOf or EachOf, then `tcs(te)` is the union of the sets `tcs(te')` for all the `te'` in `te.tripleExprs`
+
if `te` is a tripleExprRef with label `L`, then `tcs(te) = tcs(def(L))`
+
+
+ For a Shape `s`, we define `tcs(s) = tcs(s.expression)`.
+
+
+ For a shapeExpr `se`, we define `tcs(se)` as the union of the sets `tcs(s)` for all Shape `s` in `shapes(se)`.
+
+
+Finally, for a triple expression or a shape expression `e` we define `predicates(e)` as the set that contains exactly the `tc.predicate` for all the TripleConstraints `tc` in `tcs(e)`.
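
For illustration, these definitions translate directly into recursive functions. The sketch below assumes a hypothetical ShExJ-like dict encoding: labels are plain strings, `defs` maps a label `L` to `def(L)`, a Shape carries its triple expression under `"expression"`, and references are acyclic (otherwise the recursion would not terminate).

```python
def shapes(se, defs):
    """shapes(se) as defined above; `defs` maps labels to definitions."""
    if isinstance(se, str):                     # shapeExprRef: unfold def(L)
        return shapes(defs[se], defs)
    kind = se["type"]
    if kind == "NodeConstraint":
        return []
    if kind == "Shape":
        return [se]
    if kind == "ShapeNot":
        return shapes(se["shapeExpr"], defs)
    if kind in ("ShapeAnd", "ShapeOr"):
        return [s for sub in se["shapeExprs"] for s in shapes(sub, defs)]
    raise ValueError(f"unexpected shapeExpr type: {kind}")

def tcs(te, defs):
    """tcs(te) as defined above, for triple expressions."""
    if isinstance(te, str):                     # tripleExprRef: unfold def(L)
        return tcs(defs[te], defs)
    kind = te["type"]
    if kind == "TripleConstraint":
        return [te]
    if kind in ("OneOf", "EachOf"):
        return [tc for sub in te["tripleExprs"] for tc in tcs(sub, defs)]
    raise ValueError(f"unexpected tripleExpr type: {kind}")

def predicates(se, defs):
    """predicates(e), shown here for shape expressions; for a bare triple
    expression one takes {tc["predicate"] for tc in tcs(te, defs)} directly."""
    result = set()
    for shape in shapes(se, defs):
        if shape.get("expression") is not None:  # tcs(s) = tcs(s.expression)
            result |= {tc["predicate"] for tc in tcs(shape["expression"], defs)}
    return result
```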
+
+
+
Definition of extension hierarchy graph
+
+
+ A shape expression label is called abstract if its definition is marked with `ABSTRACT`.
+ For shape expression labels `L_1`, `L_2`, we say that `L_2` directly extends `L_1` if `shapes(def(L_2))` contains a Shape `s` s.t. `s.extends` contains `L_1`.
+The extension hierarchy graph of a shapes schema is a directed graph whose nodes are the shape expression labels of the schema and that has an edge from `L_2` to `L_1` whenever `L_2` directly extends `L_1`.
+
+
+
+
+ Schema requirement: the extension hierarchy graph must be acyclic.
+
+
+ For a shape label `L`, we define
+
+
+
the set `supertypes(L)` is the set of labels `L'` such that there is a possibly empty path from `L` to `L'` in the extension hierarchy graph,
+
the set `baseSubtypes(L)` is the set of labels `L'` whose definition is not abstract, and s.t. there is a possibly empty path from `L'` to `L` in the extension hierarchy graph.
+
+
+ Note that because an empty path is allowed in the above definitions, every label belongs to its set of supertypes and every non-abstract label belongs to its set of base subtypes.
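
Under an assumed encoding where `extends` maps each label to the labels it directly extends (i.e. the edges of the extension hierarchy graph), `supertypes` and `baseSubtypes` are plain reachability computations:

```python
def supertypes(label, extends):
    """Labels reachable from `label` in the extension hierarchy graph,
    including `label` itself (the empty path)."""
    result, stack = set(), [label]
    while stack:
        current = stack.pop()
        if current in result:
            continue
        result.add(current)
        stack.extend(extends.get(current, ()))
    return result

def base_subtypes(label, extends, abstract_labels, all_labels):
    """Non-abstract labels from which `label` is reachable, including
    `label` itself when it is not abstract."""
    return {candidate for candidate in all_labels
            if candidate not in abstract_labels
            and label in supertypes(candidate, extends)}
```

For example, with `extends = {"Employee": ["Person"]}` and `Person` abstract, `supertypes("Employee", extends)` is `{"Employee", "Person"}` and `base_subtypes("Person", extends, {"Person"}, {"Person", "Employee"})` is `{"Employee"}`.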
+
+
+
+
Definition of *extendable shape expression*
+
+A shapeExpr is called *extendable* if:
+
+
+
it is named, say with label `L`,
+
it is of the form either `s` or `s AND se`, where `s` is a Shape and `se` is a shapeExpr. In this case we denote `s` as `mainShape(L)` and `se` as `constraint(L)`:
+
`se` does not contain an EXTENDS, that is, `s'.extends` is empty for every `s'` in `shapes(se)`,
+
`def(L')` is an extendable shape expression for every `L'` in `s.extends` (note that this condition is trivially met when `s.extends` is empty),
+
the set `predicates(se)` is included in the union of the sets `predicates(mainShape(L'))` for all shape expression names `L'` that belong to `supertypes(L)`.
+
+
+Note that a named Shape is always an extendable shape expression.
+
+
+For an extendable shape expression with label `L` we define:
+
+
+
`superTcs(L)` as the union of the sets `tcs(mainShape(L'))` for all shape expression names `L'` that belong to `supertypes(L)`,
+
`superPredicates(L)` as the union of the sets `predicates(mainShape(L'))` for all shape expression names `L'` that belong to `supertypes(L)`.
+
+
+ Schema requirement: EXTENDS appears only in extendable shape expressions. That is, for every Shape `s` that appears in the schema such that `s.extends` is non-empty, and for every shapeExpr `se` in the schema such that `s` belongs to `shapes(se)`, `se` is an extendable shape expression.
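
Reusing `supertypes`, `tcs` and `predicates` from the sketches above, and assuming hypothetical maps `main_shape` and `constraint` from labels to `mainShape(L)` and `constraint(L)`, `superTcs`, `superPredicates` and the predicate-inclusion condition could read:

```python
def super_tcs(label, main_shape, extends, defs):
    """superTcs(L): TripleConstraints of mainShape(L') over all supertypes L'."""
    result = []
    for sup in supertypes(label, extends):
        expression = main_shape[sup].get("expression")
        if expression is not None:
            result.extend(tcs(expression, defs))
    return result

def super_predicates(label, main_shape, extends, defs):
    """superPredicates(L): the predicates of superTcs(L)."""
    return {tc["predicate"]
            for tc in super_tcs(label, main_shape, extends, defs)}

def constraint_predicates_ok(label, main_shape, constraint, extends, defs):
    """Last extendability condition: predicates(constraint(L)) must be
    included in superPredicates(L)."""
    se = constraint.get(label)
    if se is None:                  # def(L) is a bare Shape: nothing to check
        return True
    return predicates(se, defs) <= super_predicates(label, main_shape,
                                                    extends, defs)
```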
+
+
+
+
Stratified schema
+
+ [TODO precise definition + how negation and stratification are to be adapted]
+
+
+Briefly: stratification is defined on shape expression labels (and not on Shapes as it used to be).
+
+
+# Semantics
+@@Indicates this should be moved?
+
+
+
+
Definitions
+
+Consider a ShEx schema `Sch` and a graph `G`, and let `U` be the set of shape expression labels of `Sch` and `N` be the set of nodes of `G`. (TODO: which labels to consider? Only the top-level ones, or any label?)
+Then a *typing* over `G` and `Sch` is a subset of `N x U`.
+
+
+The satisfaction of the schema is defined w.r.t. a so-called maximal typing, which is guaranteed to exist.
+The maximal typing is such that whenever `(n,L)` belongs to it, the neighborhood of the node `n` *satisfies* `def(L)` w.r.t. the maximal typing. Formally, `satisfies(n, def(L), maximalTyping, _, G, Sch)` holds, where `satisfies` is the function defined below and the underscore indicates that no value is given for the corresponding optional parameter.
+
+
+
+
The satisfies function
+
+It takes six parameters: the graph `G`, the schema `Sch`, a typing `typing` over `G` and `Sch`, a node `n` in `G`, a shapeExpr `se` from `Sch`, and an optional neighborhood set of triples `npart`: `satisfies(n, se, typing, npart, G, Sch)`. The value of the function is a Boolean. The function is defined recursively on the structure of the shape expression:
+
+
+
+
+
`se` is a node constraint and `satisfies2(n, se)` ... as before
+
`se` is ShapeOr and `satisfies(n, se', typing, npart, G, Sch)` holds for some `se'` in `se.shapeExprs`
+
`se` is a ShapeNot and `satisfies(n, se.shapeExpr, typing, npart, G, Sch)` does not hold
+
`se` is a ShapeExternal and implementation-specific mechanisms not defined in this specification indicate success
+
`se` is a shapeExprRef with label `L` and `satisfies(n, def(L), typing, npart, G, Sch)` holds **[Note : here we unfold the definition of L and do not look for base subtypes. Base subtypes are considered only for the whole neighborhood, that is in TripleConstraint]**
+
`se` is an extendable shape expression, let `L` be the label of `se`. Let `neigh = npart` if `npart` is given, or `neigh` be the neighbourhood of the node `n` if `npart = _`. Then
+
there exist sets of triples `matchables`, `unmatchables` and `remainder` s.t.
+
`matchables`, `unmatchables` and `remainder` are pairwise disjoint and their union is equal to `neigh`,
+
`remainder` is the set of triples from the neighborhood of `n` whose predicates appear neither in `superPredicates(L)` nor in `mainShape(L).extra`
+
if `mainShape(L).closed`, then `remainder` is empty
+
`matchables` contains all the triples `t` s.t. `matches(n, tc, typing, {t}, G, Sch)` holds for some `tc` in `superTcs(L)`,
+
(therefore, `unmatchables` contains all the triples `t` whose predicate is in `superPredicates(L)` or in `mainShape(L).extra`, and such that `matches(n, tc, typing, {t}, G, Sch)` does not hold for any `tc` in `superTcs(L)`),
+
all the predicates of the triples in `unmatchables` appear in `mainShape(L).extra`,
+
there exists a partition of `matchables` that associates with every `L'` in `supertypes(L)` a set of triples denoted `npart(L')` s.t.
+
+
`matchables` is the union of the sets `npart(L')` for all the `L'`s in `supertypes(L)` and these sets are mutually disjoint,
+
`matches(n, mainShape(L').expression, typing, npart(L'), G, Sch)` holds for every `L'` in `supertypes(L)`,
+
`satisfies(n, constraint(L'), typing, supernpart(L'), G, Sch)` holds for every `L'` in `supertypes(L)` whenever `constraint(L')` exists, where `supernpart(L')` is the union of the `npart(L'')` for all the shape expression labels `L''` in `supertypes(L')`
+
`se` is a ShapeAnd without EXTENDS and `satisfies(n, se', typing, npart, G, Sch)` holds for every `se'` in `se.shapeExprs`
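
To fix intuitions, here is a skeleton of the recursion in Python, using the parameter order and the dict encoding assumed in the earlier sketches (`Sch["defs"]` standing for the label-to-definition map is also an assumption); the node-constraint test and the extendable/external cases, which carry the real work, are deliberately stubbed:

```python
def satisfies(n, se, typing, npart, G, Sch):
    """Skeleton of the satisfies recursion; NodeConstraint, ShapeExternal
    and extendable shape expressions are stubbed out."""
    if isinstance(se, str):                      # shapeExprRef: unfold def(L)
        return satisfies(n, Sch["defs"][se], typing, npart, G, Sch)
    kind = se["type"]
    if kind == "NodeConstraint":
        return satisfies2(n, se)                 # value test, as in the base spec
    if kind == "ShapeOr":
        return any(satisfies(n, sub, typing, npart, G, Sch)
                   for sub in se["shapeExprs"])
    if kind == "ShapeAnd":                       # the case without EXTENDS
        return all(satisfies(n, sub, typing, npart, G, Sch)
                   for sub in se["shapeExprs"])
    if kind == "ShapeNot":
        return not satisfies(n, se["shapeExpr"], typing, npart, G, Sch)
    raise NotImplementedError(kind)              # extendable / external cases

def satisfies2(n, node_constraint):
    raise NotImplementedError("node-constraint tests are unchanged "
                              "from the base spec")
```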
+
satisfies: The expression satisfies(n, se, G, Sch, t) indicates that a node n and a graph G satisfy a shape expression se with typing t for schema Sch.
notSatisfies: Conversely, notSatisfies(n, se, G, Sch, t) indicates that n and G do not satisfy se with the given typing t.
@@ -683,7 +854,7 @@
+ It takes six parameters: the graph `G`, the schema `Sch`, a typing `typing` over `G` and `Sch`, a node `n` in `G`, a tripleExpr `te` from `Sch`, and a (non-optional) neighborhood set of triples `npart`: `matches(n, te, typing, npart, G, Sch)`.
+ The value of the function is a Boolean. The function is defined recursively on the structure of the triple expression:
+
+
+
`te` is a TripleConstraint and `npart` is a singleton set with unique triple `t` and the predicate of `t` is the same as the predicate of `te` and ... TODO define inverse or not as before ... and:
+
+
`te` does not have a `valueExpr`, or
+
`te` has a valueExpr that is a shapeExprRef with label `L` and **there exists a label `L'` in `baseSubtypes(L)` such that `(n', L')` belongs to `typing`, where `n'` is the object of `t` (note: this is the unique place in the recursive definition of the semantics where substitutability is considered.
+ Substitution is also considered when validating a shape map)**, or
+
`te` has a valueExpr that is not a shapeExprRef, and `satisfies(n', te.valueExpr, typing, _, G, Sch)` holds, where `n'` is the object ( ... TODO ... or the subject if inverse ) of `t`
+
+
+
+
`te` is tripleExprRef with label `L`, and `matches(n, def(L), typing, npart, G, Sch)` holds
+
`te` has a cardinality `[min, max]` and `npart` can be partitioned into `k` pairwise disjoint sets `npart_1` ... `npart_k` for some `min <= k <= max` such that `matches(n, te', typing, npart_i, G, Sch)` holds for every `i` in `1..k`, where `te'` is `te` without its cardinality
+
`te` is a OneOf and `matches(n, te', typing, npart, G, Sch)` for some `te'` in `te.tripleExprs`
+
`te` is EachOf and `matches(n, te', typing, npart, G, Sch)` for every `te'` in `te.tripleExprs`
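
Since the substitutability branch is the one novelty, here is a minimal sketch of the TripleConstraint base case under the earlier assumptions (triples as `(subject, predicate, object)` tuples, the `_` placeholder encoded as `None`, inverse properties and cardinalities left out; `base_subtypes_of(L)` computes `baseSubtypes(L)` and `satisfies` is the skeleton above):

```python
def matches_triple_constraint(n, tc, typing, npart, G, Sch, base_subtypes_of):
    """TripleConstraint base case of `matches` for a singleton npart."""
    if len(npart) != 1:
        return False
    (subject, predicate, obj), = npart
    if subject != n or predicate != tc["predicate"]:
        return False
    value_expr = tc.get("valueExpr")
    if value_expr is None:                 # no valueExpr: any object is fine
        return True
    if isinstance(value_expr, str):        # shapeExprRef with label L:
        # substitutability: some base subtype of L must type the object
        return any((obj, sub) in typing
                   for sub in base_subtypes_of(value_expr))
    # any other valueExpr recurses into satisfies on the object
    return satisfies(obj, value_expr, typing, None, G, Sch)
```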
+
+
+
+
Validating a shape map
+
+ A shape map is ... TODO precise definition ... a kind of a typing, i.e. a set of pairs `(n, L)` where `n` is a node and `L` is a shape expression label.
+ A shape map is satisfied if for every `(n, L)` in the map, there exists a label `L'` in `baseSubtypes(L)` s.t. `(n, L')` belongs to the maximal typing.
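
Under the same assumptions, the shape-map condition is a short check:

```python
def shape_map_satisfied(shape_map, maximal_typing, base_subtypes_of):
    """Every (n, L) in the map must have some base subtype L' of L with
    (n, L') in the maximal typing."""
    return all(
        any((n, sub) in maximal_typing for sub in base_subtypes_of(label))
        for n, label in shape_map
    )
```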
+
+
+
+
Definition of the maximal typing
+
+ *Similar to before, through stratification, but now the elements of the typing are pairs (node, label) where label is a **shapeExprLabel**. Before we had pairs (node, Shape) in the typing. With this new definition of typing the stratification is easier to define. What we lose is that some schemas that would have been considered stratified with the old definition won't be considered stratified with the new one. This breaks backwards compatibility, provided that there exists somewhere a schema with negations that has non trivial stratification structure, ... which would surprise me a lot*.
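
As a rough, non-normative illustration of the stratified construction, assume `strata` maps each shape expression label to its stratum and `locally_correct(n, L, typing)` stands for the satisfies test of the pair `(n, L)` against `typing` (both are hypothetical names). The maximal typing can then be computed stratum by stratum as a greatest fixed point:

```python
def maximal_typing(nodes, strata, locally_correct):
    """Naive stratum-by-stratum greatest-fixpoint sketch: at each stratum,
    start from all candidate pairs and drop pairs until every remaining
    pair of the current stratum is locally correct w.r.t. the typing."""
    typing = set()
    for stratum in sorted(set(strata.values())):
        # Candidates: the typing fixed so far, plus every pair at this stratum.
        candidates = typing | {(n, label) for n in nodes
                               for label, s in strata.items() if s == stratum}
        changed = True
        while changed:
            changed = False
            for pair in list(candidates):
                n, label = pair
                if (strata[label] == stratum
                        and not locally_correct(n, label, candidates)):
                    candidates.discard(pair)
                    changed = True
        typing = candidates
    return typing
```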
+
matches: asserts that a triple expression is matched by a set of triples that come from the neighbourhood of a node in an RDF graph.
The expression matches(T, expr, m) indicates that a set of triples T can satisfy these rules:
@@ -2258,7 +2463,7 @@
Negation Requirement
- The dependency graph of the schema Sch is the graph which vertices are all the Shapes that appear in some shape expression in the shapes of Sch, and that has two kinds of edges: negative and positive.
+ The dependency graph of the schema Sch is the graph whose vertices are all the Shapes that appear in some shape expression in the shapes of Sch, and that has two kinds of edges: negative and positive.
There is a negative edge from s1 to s2 if s1 has a negated reference to s2.
There is a positive edge from s1 to s2 if s1 has a reference but not a negated reference to s2.
+ As with Turtle and SPARQL, ShExC offers URL resolution relative to a base per [[RFC3986]] and a prefixes map to provide shorthand ways to write IRI identifiers.
+
When used in a prefixDecl production, the prefix is a potentially empty unicode string matching the first argument of the rule and serves as a key into the prefixes map.